Why the airplane analogy doesn’t fly.

Don’t get me wrong — I love Trey Ford. He is one of the most inspiring infosec pros I know. He’s smart and creative, full of mind-blowing ideas, with energy to spare. And I loved his talk at SecTor about what we can learn about information sharing from the aviation industry.

There’s just one problem: aviation isn’t all that comparable to cybersecurity.

Imagine that instead of flying the plane herself, the pilot had to convince all the passengers — on every flight — to do the flying together. Many of them aren’t good at it and don’t care; they just want to sleep or watch videos or whatever.

The passengers change all the time, so you can’t keep them educated on what to do. Depending on the size of the plane, there may be tens or hundreds of thousands of passengers helping with the flying. Instead of a finite maintenance crew that’s under the direct control of the airline, there are dozens or thousands of different crews from third-party companies, all doing their bits (or not).

The aircraft types range into the thousands, dating back to Kitty Hawk and up to the newest models, and most of them have at least some custom alterations that can be changed between flights, so the various manufacturers won’t take responsibility for anything they didn’t add. Remember, too, that each of those alterations was probably made for a good reason — or at least, a reason that was good at the time. (That’s a huge part of what we don’t know about breaches today: we sometimes know the chain of events and contributing vulnerabilities, but we make rash judgments about why they happened without knowing the full story.)

The airlines all have different ideas on how they should equip their planes, so some pilots have one of everything new and shiny, and others have to make do with duct tape and bags of pretzels. (And some airlines are just now thinking that maybe having a dedicated pilot is a good idea.)

Oh, and did I mention? The weather is actively trying to disrupt your flight, usually in a way that you won’t notice until it’s too late. (Although you still have to worry about hacktivist storm cells that want you to look bad.)

All of these differences highlight our challenge in security: because everything is so complicated, so flexible, and so NOT under our individual control, we can easily blame someone else for their breaches because they did things so differently. And the pilot is nominally in charge, so that’s where we concentrate the attention, but even the pilot can’t get the toddler in 32B to stop screaming and fly straight. I’m not even going to mention the armchair aviation enthusiasts who sit near the runway with binoculars and lasers and provide “helpful” critique. (Oops, I guess that slipped out.)

So how can we still make use of what we’ve learned from information sharing in aviation? As Trey says, we can at least collect data now in a way that we may be able and willing to share later. If only we had a black box that collected vital information about a breach in a way that didn’t expose the inner workings of the business, or those custom-built additions. If only we could sanitize the data in a way that communicated the important lessons (“don’t combine these tray tables with that boarding process, and especially don’t add a pilot over 6 feet tall without upgrading the landing gear”) but defanged our industry’s reflexive attempts at a certain kind of blame (“how stupid was that? We’d never do that!”).

When I consider all this, sometimes I despair that we’ll ever figure it out. But with positive thinkers like Trey, we may just have a chance.

*** This is a Security Bloggers Network syndicated blog from Idoneous Security authored by Wendy Nather. Read the original post at:
