Despite ongoing warnings, U.S. critical infrastructure remains vulnerable

The state of ICS security (which protects the industrial control systems that power our critical infrastructure) is worrying. How can we start to improve it?

The original version of this post was published in Forbes.

Critical infrastructure operates so far in the back of our minds that it often doesn’t seem all that critical.

We don’t hold our breath when we flip the light switch, hoping it will work—we don’t even think about it. Of course the light goes on. Just like the heat goes up when we tweak the thermostat, the food is frozen when we take it out of the freezer and the water is pure when it comes out of the faucet.

Yes, we know subconsciously that those things aren’t guaranteed—it’s just that disruptions are so rare that the thought stays there, in the backs of our minds.

But, yet another report on the cyber vulnerabilities of critical infrastructure suggests it ought to be a bit more up front.

This one, from the Russia-based cybersecurity firm Kaspersky, found that 42.7% of the industrial control system (ICS) computers it protected last year were attacked by malware, email phishing or other threats. The U.S. and the rest of North America were among nations ranked in the middle for the number of malicious objects detected.

The report didn’t say how many U.S. firms are Kaspersky clients. But Kirill Kruglov, senior research developer at Kaspersky Lab ICS CERT (Cyber Emergency Response Team), said the U.S. was “the 47th most attacked country observed by Kaspersky Lab in 2018, with malicious objects detected on 20.9% of industrial automation systems.”

The company reported that in 2018 it detected close to 40,000 pieces of malware belonging to 2,700 families, and said that “in each month the proportion of ICS computers on which malicious activity was prevented was higher than that in the same month of 2017.”

Good news, bad news for ICS security

None of that should be a surprise—in fact it could be seen as encouraging that more than half of the ICS computers covered in the report weren’t attacked.

But more ominous is that in 2018, Kaspersky “identified 61 vulnerabilities in industrial and IIoT/IoT systems,” but the owners of those systems fixed only 29 of them during the year. Also, it found that 20% of vulnerable ICS devices had vulnerabilities ranked as “critical.”

The cliché in cybersecurity is that defenders have to be right 100% of the time, while an attacker only needs to be right once. Those numbers, from just one company’s sampling, mean attackers have dozens of chances.

Not that this should cause national panic. Most cybersecurity experts say U.S. critical infrastructure—energy, transportation, water, sewer, food and agriculture, health care, communications—16 sectors in all, according to the Department of Homeland Security (DHS)—is diverse and robust enough to withstand even sophisticated nation-state attacks.

A lengthy national or even regional collapse of critical infrastructure—the kind envisioned in multiple past warnings of a “cyber Pearl Harbor” or in former Nightline anchor Ted Koppel’s 2015 book Lights Out—remains hyperbole, they say.

But it also doesn’t mean significant disruption couldn’t happen on a smaller scale.

“Accidents” highlight the potential scope of incidents

Last September a series of seemingly random natural gas fires and explosions upended the lives of thousands of residents in three communities in the Merrimack Valley of Massachusetts. Dozens lost their homes entirely. Thousands more, without gas or heat for months, had to move to motels and trailers.

Now, more than six months later, supplier Columbia Gas reports that the cost of the disaster has topped $1 billion and is still rising. Keep in mind, this was in just three communities in a small corner of a small state.

An investigation concluded the cause was human error—a failure to transfer a regulator pressure sensor during a pipe replacement project, which led to catastrophic excess pressure in gas lines.

But, as numerous experts said at the time, a cyberattacker who was able to take control of the system and “fool” pressure sensors would achieve similar results.

“SCADA [supervisory control and data acquisition] systems, water, the grid, natural gas pipelines—they all use sensors,” said Bob Noel, vice president of strategic partnerships at Plixer.

So while the Merrimack Valley incident was an accident, it illustrated what malicious control of sensors could do. “Imagine if you had access to the New York City water system,” he said. “You could blow every toilet off the floor and make every sink explode.”

The possibility of political motivation

Most attackers don’t have that kind of “act of war” motive. Kaspersky’s Kruglov described most attacks as “generic—crypto miners, password stealers, rogue software and ransomware or botnet agents—and not related to espionage, blackmail, political pressure or physical destruction/acts of war.”

But he added that there is “a small percentage of threats related to espionage.”

Michael Fabian, principal consultant at Synopsys, agrees that the motives are mixed. “More than likely it’s either completely random—getting a run-of-the-mill virus on an ICS box—or somewhat targeted to a specific company for espionage. Many of the things we hear now are ransomware,” he said.

But he said it is likely that hostile nation-states are involved for political reasons, even if those incidents go unreported.

“I can say with a level of confidence that it is happening between many different parties for a myriad of reasons,” he said.

Noel said that while the risk of an attack that cripples a major portion of the country is remote, he believes the threats posed by nation-states that could cause physical damage “are very real and very high.”

The problem with patching industrial control systems

So, one obvious question is why ICS owners aren’t being much more aggressive about patching vulnerabilities.

The answer? It’s much more complicated and expensive than downloading a free patch for an app on your phone.

“Getting updates to ICS systems means getting the vendor to install and retest the system to make sure it works OK,” Fabian said. “Some vendors are better at this than others and will typically charge for it.”

Perhaps more significant, Kruglov said, “ICS software of a specific version is guaranteed to work in a specific environment, including the exact OS version and list of patches it’s compatible with, etc., meaning that any change in the predefined configuration could automatically result in the cancellation of guaranteed support for the ICS owner.”

Noel notes other possible complications. “Often patches come out but there are very narrow times available to take the system down. It has to be preplanned—it might be six months from now,” he said. “And in many cases, before you upgrade you have to go through quality assurance. Sometimes a new patch can be incompatible with an older system—it may require a back revision of the OS. Or there may come a time where they’re not backward compatible.”

Improve ICS security with better training

Still, there are ways to improve ICS security without the complications of updates. Joe Weiss, managing partner at Applied Control Solutions, has been saying for years that there is a gap in training for those in both engineering and IT security.

In a recent podcast with Momenta Partners, he said, “Cybersecurity is normally taught in computer science, and they don’t require you in most cases to take any engineering classes. Meanwhile, the engineering domains—electrical, mechanical, chemical, systems, nuclear, industrial—don’t really require you to take any cybersecurity training.”

And it is not enough to protect the network in an ICS system, he said. “You can keep lights on even if the network is down, but if the lights are off, the network isn’t going to be on anyway. We’re looking at the wrong things,” he said. “IT can’t kill anybody, but engineering can and has.”

In a blog post last month, Weiss wrote that the 2017 Trisis attack on a Saudi petrochemical plant illustrated that training gap.

“The plant engineers had no training to identify unexpected events as possibly being cyber-related,” he wrote. “Consequently, the first time the plant tripped, it was simply considered a malfunction and the plant restarted without cyber security considerations.”

It wasn’t until the plant shut down again, two months later, that the malware was discovered.

And don’t ignore the small-scale concerns

Fabian has said previously that he doesn’t foresee an effort to take down a major part of U.S. infrastructure because even hostile nation-states are wary of U.S. power. “We’re not going to just sit there and take it,” he said, “and I think our [cyber] capabilities are probably more significant than those of others.”

Still, even smaller-scale attacks, if there are enough of them, could start to feel like death by a thousand cuts.

All you have to do is talk to families in the Merrimack Valley.

*** This is a Security Bloggers Network syndicated blog from Software Integrity Blog authored by Taylor Armerding. Read the original post at: