
Security Without Tears or Apology

In plugging this blog, for which I’m grateful, Avedon Carol mentioned that my subtitle “Security without apology or tears” doesn’t necessarily make immediate sense. I thought I’d spend some time talking about that.

Every time I tell someone I do information security for a living, I get an awestruck, impressed kind of look as if I had told them that Gandalf passed me his staff and asked me to keep an eye on things while he went West. Often, I’m reassured of my intelligence or employability. Frequently, I’m regaled with tales of some security incident that happened at someone’s workplace – almost always because someone made an easily avoided mistake. Commonly, I’m asked in hushed tones about the latest virus or vulnerability, and if they’re safe from it – or what to do about the virus that just infected their computer.

Professionally, very big companies have me inspect millions of dollars worth of security protecting billions of dollars worth of assets. Often they want to know if I can break in. Often the answer is yes, and rarely because I’m such a smart guy. Almost always, it’s because someone didn’t do something – one of those easily avoided mistakes I mentioned. Often, those mistakes happen because the person whose job it was to avoid them didn’t know what to avoid. Or was too busy to avoid it. Or didn’t think it was really that big a deal.

That’s the tears. It doesn’t have to be like this.

I’ve also been paid to write memos begging CEOs not to insist on passwords so short that they’re the equivalent of 15th-century locks, because they didn’t want to have to remember anything longer, and the CISO needed an expert to back him up in saying there was a risk to that. I’ve had a proposal to spend fifteen thousand dollars on an anti-virus system turned down in April only to have it approved in May, after a virus caused a hundred thousand dollars in lost productivity because fifty computers had to be cleaned and rebuilt while fifty users sat around getting paid to do nothing. I’ve worked out how to soft-pedal security measures to someone so out of touch with modern technology that he still thought being the IT director meant he “ran the computer,” and couldn’t believe there was any real danger from hackers, viruses, or internal users bent on fraud.
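Why are short passwords the equivalent of old locks? Because a password’s strength is bounded by simple arithmetic: length times the bits of unpredictability per character. A minimal sketch (the specific lengths and character sets here are illustrative assumptions, not figures from any actual memo):

```python
import math

def password_entropy_bits(length, charset_size):
    """Upper-bound entropy, in bits, of a uniformly random password:
    each character contributes log2(charset_size) bits."""
    return length * math.log2(charset_size)

# A 6-character, lowercase-only password (the kind an executive
# might insist on so there's less to remember):
short = password_entropy_bits(6, 26)    # ~28 bits

# A 12-character password drawn from upper, lower, and digits:
longer = password_entropy_bits(12, 62)  # ~71 bits

# Every additional ~1 bit doubles the attacker's guessing work,
# so the gap between these two is a factor of trillions.
print(round(short, 1), round(longer, 1))
```

The point isn’t the exact numbers; it’s that the cost of remembering a few more characters buys an exponential increase in the work an attacker has to do.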

That’s the apology. It doesn’t have to be like this, either.

Being secure isn’t any different on a computer or with information than it is in any other part of your life. If you lock your door, you do security. If you don’t lock your door because your neighbor might need to get in, you still do security – you’ve assessed your risks and decided which to accept and which to mitigate. But no one decides they need to lock their door and then tapes the key to the knocker. People don’t spend much energy on deciding not to do that, either. On that level, doing security is reflexive, even unconscious.
There’s got to be a way to bridge that gap.

Often, people who sell security use FUD – Fear, Uncertainty, and Doubt – to do it. They do so because in the short term it works, but in the long term it has left people with a wildly distorted notion of what they have to protect and from whom. Worse, when you peddle fear, you get frightened people. Frightened people rarely make good decisions. Time and again, FUD comes back to haunt us.

Fear is no way to live one’s life, with computers or anything else, nor is ignorance. There are risks to using computers just as there are with anything else. Knowing how to identify those risks, deciding how to handle them, and doing all that before trouble arrives leads to a well-founded confidence – just like in any other area of life. This blog is about taking the mystique out of what I do, and putting it within everyone’s reach as simply part of how we live our lives day to day.

*** This is a Security Bloggers Network syndicated blog from Defense Rests authored by Dan Holzman-Tweed. Read the original post at: http://defense-rests.blogspot.com/2010/02/security-without-tears-or-apology.html