
Are You Leaving Your Machines Naked and Afraid?

kdobieski
Tue, 01/22/2019 – 12:44

Many times, when I ask about DevOps, the operational or security teams (even at the C level) are not actually aware of what is happening in the business units. DevOps teams are sort of like the secret men and women of the organization: nobody knows quite what those folks do, except that they produce great content for the business.

To help them understand the full scope of their machine identities, I like to share a real-life example that makes it more relatable to what they are experiencing. My illustration may go something like this:

Imagine you’re anticipating big results from an application that you just launched. Let’s say it’s a new insurance app that tracks people’s health data, and the business is excited about it as a piece of thought leadership. On the surface, this new app is really beautiful and the execs love it. So, you take it up to the CEO, and he takes it to the board, and then everybody’s excited about this great program that’s going to provide a competitive edge. But no one is really looking under the covers. Inside the app you’ve got all these identities you may have inadvertently left naked and afraid.

At this point, almost everyone says, “What do you mean by naked and afraid?” Here’s how I respond:

Well, as you continue to develop and improve your new app, you’re dealing with multiple layers of microservices. Every couple of days you are spinning up new containers as you evolve functionality, tweak usability, fix bugs and so on. And every time you do that, you’re issuing and signing new certificates, right? Do you know who’s signing those certificates? Is it your certificate authority of record? Are they self-signed? Who or what are you trusting?

Often, they respond, “Oh, I don’t think we signed it.” And I smile and say, “Hmmm, so you’re telling me that you’re running an app that collects users’ personal health details and you haven’t signed it?”

Quickly followed by, “Because you are a leading company in your market, of course you have signed it!”

“Did you?”

It’s just not tracked. No one can be sure. It can be stolen, it can expire, it could be noncompliant. It could even be malicious and deliberately placed.

How would anyone know?
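To make that question concrete, here is a minimal sketch of how you might peek at one service’s certificate and answer the basics: who signed it, is it self-signed, and when does it expire. It uses Python with the third-party `cryptography` package, and the hostname is a hypothetical placeholder rather than a real endpoint.

```python
# A minimal inspection sketch (not production validation): who signed this service's
# certificate, is it self-signed, and when does it expire?
# The hostname below is a hypothetical placeholder for one of your internal services.
import socket
import ssl

from cryptography import x509

HOST, PORT = "claims-api.internal.example.com", 443  # hypothetical internal endpoint

# Grab the peer certificate without making a trust decision -- inspection only.
ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        der_cert = tls.getpeercert(binary_form=True)

cert = x509.load_der_x509_certificate(der_cert)

print("Subject:     ", cert.subject.rfc4514_string())
print("Issuer:      ", cert.issuer.rfc4514_string())
print("Expires:     ", cert.not_valid_after)
print("Self-signed? ", cert.issuer == cert.subject)
```

If the issuer turns out to be the subject itself, or a CA nobody in security has ever heard of, that is exactly the “naked” state described above.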

This inevitably leads to questions about why they need to sign these apps. To have a little fun, I will create a scenario inside the app where containers are like characters that need to talk to each other. It plays out like this:

  • Bouncer: Where do you think you’re going?
  • Teenager: Uh, into that club.
  • Bouncer: Not until I know you are legal. Where’s your ID?
  • Teenager: Look, I’m old enough, okay?
  • Bouncer: Yeah, but how do I know that?
  • Teenager: Because I told you so, and I was here every night this week and you let me in.
  • Bouncer: But you don’t have anything from an authority I trust to identify yourself. Where’s your ID?
  • Teenager: I left it in the car. Come on man, you know me, I was here same time last night.
  • Bouncer: Oh, right so you were. Yeah, I remember you now. In that case, come on in.

At this point, most executives start laughing. But really, it’s a serious issue. These organizations have their machines out in the wild without trusted identities to protect themselves. Hence the phrase “naked and afraid.” These machines are naked because they are not protected, and they are afraid because they have no way of identifying other machines or users and that leaves them vulnerable.

How would you feel if you were those machines? Imagine you are walking through a dark park at night with minimal lighting. You see strangers coming towards you, but you don’t know who they are. They may be innocent passersby, or they may be criminals. They are wearing masks (after all, it’s Halloween), but they are smiling and seem friendly. Do you trust them? Probably not! Will you stop them and demand ID? Probably not! Chances are, that’s what is happening in your network with machine identities today.

Many organizations just keep creating environments where all the devices are allowed to talk to each other, and many don’t know whether those devices even have an identity. They often have no way of checking that identity either. We don’t do that with humans. If a human gets an ID card, it’s validated by somebody in physical security who gives them either a digital or physical card that allows them access. But a lot of established process goes on before that happens. There is an identity management system in place that controls access to gates, doors and secure areas and tracks where people go.

On the other hand, we leave these poor machines shivering in their little shoes because they are being asked to do stuff that they’re not qualified to do. There’s no occupational health and safety for these machines. They’re just sitting in there looking at stuff coming at them going, “Gosh, I don’t know whether this is right or not, because I’ve got no way of checking. The guy who wants to get in has a certificate, but I don’t know who’s supposed to sign it. It’s a valid certificate so without policy loaded into my brain, who am I to argue? I don’t have the right level of information or authority – someone left me naked again. Oh well, better let them through I guess, or the app will go down.”
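The fix for that monologue is to actually load some policy into the machine’s brain. Here is a hypothetical, deliberately simplified sketch of such a check in Python (again using the `cryptography` package). The allowed issuer names are invented examples, and a real implementation would also verify the signature chain and revocation status rather than just comparing names.

```python
# A hypothetical, simplified "policy in the machine's brain": don't admit a peer just
# because its certificate parses -- check who issued it and whether it is still valid.
# The issuer names are invented examples; real validation must also verify the
# signature chain and revocation status.
from datetime import datetime

from cryptography import x509

ALLOWED_ISSUERS = {
    "CN=Example Corp Internal Issuing CA 01,O=Example Corp",
    "CN=Example Corp Internal Issuing CA 02,O=Example Corp",
}

def admit(peer_cert_pem: bytes) -> bool:
    """Return True only if the peer's certificate satisfies local policy."""
    cert = x509.load_pem_x509_certificate(peer_cert_pem)
    now = datetime.utcnow()  # validity fields below are naive UTC datetimes

    issued_by_trusted_ca = cert.issuer.rfc4514_string() in ALLOWED_ISSUERS
    not_self_signed = cert.issuer != cert.subject
    within_validity = cert.not_valid_before <= now <= cert.not_valid_after

    return issued_by_trusted_ca and not_self_signed and within_validity
```

With even that small amount of policy, the bouncer in the earlier scene stops waving people through just because he vaguely remembers their face.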

Time and again, I see that organizations do not understand how exposed they are when they lack centralized management of their machine identities. That goes right down to the microservices and containers, but it also reaches back up into the code and the algorithms, up into the physical devices and out into the cloud. Some of them have a running start at addressing this problem with some sort of management of their external SSL/TLS certificates, but when you dive deeper into the organization, that strategy begins to fall apart.
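One quick way to gauge how far that external-only strategy actually reaches is to look at who is signing the certificates already sitting inside the estate. The sketch below is a hypothetical illustration in Python (using the `cryptography` package) that summarizes a folder of certificates exported from servers, containers and load balancers by issuer. The directory name is a made-up example, and a full inventory would also need to cover SSH keys, code-signing certificates and the rest of the machine identity spectrum.

```python
# A hypothetical first pass at visibility: given a folder of PEM certificates exported
# from servers, containers and load balancers, summarise who actually signed them.
# The directory name is a made-up example.
from collections import Counter
from pathlib import Path

from cryptography import x509

EXPORT_DIR = Path("exported_certs")  # hypothetical dump gathered from your estate

issuers = Counter()
self_signed = 0

for pem_file in EXPORT_DIR.glob("*.pem"):
    cert = x509.load_pem_x509_certificate(pem_file.read_bytes())
    issuers[cert.issuer.rfc4514_string()] += 1
    if cert.issuer == cert.subject:
        self_signed += 1

print(f"{sum(issuers.values())} certificates found, {self_signed} self-signed")
for issuer, count in issuers.most_common():
    print(f"{count:5d}  {issuer}")
```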

An internal certificate in the wrong hands is every bit as dangerous as an exposed and unmanaged external certificate, but many operations teams do not understand this. They don’t really understand the role certificates play in controlling access to data for everything non-human on the network. Nor should they have to, because in a world-class security strategy, these machine identities would be automated and managed centrally, significantly reducing risk to the business.

Are you leaving your organization’s machines naked and afraid?


When I visit a large organization to chat about machine identity protection, they usually have a specific goal in mind. Often, they are interested in managing their public TLS, and maybe their Microsoft Certificate Authority (CA). The teams focus on what they know, and often admit they don’t even really want to know what they don’t know: the problem of protecting machine identities is too big and too scary to address. I find this “Ostrich mentality” a strange approach to risk mitigation, but it comes from teams that are heavily laden with the certificate burden already.

The idea of moving to an automated state to protect machine identities with minimal (even zero) touch is very attractive, but it’s the concept of the one-time cleanup of the mess that’s daunting. More often than not, they are not even thinking about the full spectrum of machine identities that they need to protect. So, when I ask them what else they need to manage, I’m often met with a sense of hopelessness. The problem is getting worse every day, and the risk to the business is increasing every day as well.


*** This is a Security Bloggers Network syndicated blog from Rss blog authored by kdobieski. Read the original post at: https://www.venafi.com/blog/are-you-leaving-your-machines-naked-and-afraid