The critical infrastructure we all rely on is still vulnerable to cyberattacks
This isn’t one of those good-news, bad-news stories about cyber vulnerabilities. It’s more like bad news and a little less-bad news.
Indeed, there isn’t much encouraging in a recent report from network monitoring firm SynSaber on vulnerabilities in industrial control systems (ICSs) other than that things could always be worse. And that it’s promising to see both the federal government and the private sector paying more attention to ICSs, which run just about all of the nation’s critical infrastructure — energy, water, transportation, food, healthcare, financial services, and more.
The company’s report analyzed ICS vulnerabilities that the federal Cybersecurity and Infrastructure Security Agency (CISA) flagged during the first half of 2022. There were 681. And of that total, CISA ranked nearly two-thirds of them (65%) either “critical” or “high-severity.”
The less-bad news (and some context) is that exploiting 46 of those vulnerabilities would require both local/physical access and user interaction, while another 198 would require user interaction alone. In other words, the risk those pose is relatively low.
But that still leaves 437 that could be exploited remotely. That can’t be good.
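For a sense of how a breakdown like that is derived, here is a minimal sketch in Python that buckets advisories by the attack vector (AV) and user interaction (UI) metrics in their CVSS v3.1 vector strings. The advisory IDs and vectors below are hypothetical placeholders, and the classification logic is an illustration of the general approach, not SynSaber’s actual methodology.

```python
# Minimal sketch: bucket ICS advisories by exploitability prerequisites
# using the AV (attack vector) and UI (user interaction) metrics from
# CVSS v3.1 vector strings. Sample data is hypothetical.
from collections import Counter

advisories = [
    ("ICSA-22-0001", "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H"),  # remote, no interaction
    ("ICSA-22-0002", "CVSS:3.1/AV:N/AC:L/PR:N/UI:R/S:U/C:H/I:N/A:N"),  # needs user interaction
    ("ICSA-22-0003", "CVSS:3.1/AV:P/AC:L/PR:N/UI:R/S:U/C:N/I:H/A:H"),  # physical access + interaction
]

def bucket(vector: str) -> str:
    """Classify one advisory by its CVSS attack vector and user interaction metrics."""
    metrics = dict(part.split(":") for part in vector.split("/")[1:])
    needs_access = metrics.get("AV") in ("L", "P")   # local or physical access required
    needs_user = metrics.get("UI") == "R"            # user interaction required
    if needs_access and needs_user:
        return "local/physical access + user interaction"
    if needs_user:
        return "user interaction required"
    return "remotely exploitable, no user interaction"

counts = Counter(bucket(vector) for _, vector in advisories)
for label, n in counts.items():
    print(f"{label}: {n}")
```

Applied to real advisory data, a tally along these lines is what produces the kind of 46/198/437 split described above.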
Because as an increasing number of experts are warning, the damage from cyberattacks on ICSs can go beyond data theft or ransomware demands and into the physical world. An ICS is running physical operations, after all — generators, water or sewer systems, traffic lights, factory machinery, etc.
When those operations fail, malfunction, or are taken over by malicious attackers, the damage can be physical. As in, things can get damaged or destroyed, and people can get hurt or killed.
Software risks are business, and safety, risks
It has long been established that software risks (and firmware and hardware risks) are business risks. They’re also health and safety risks.
And more of the same is being predicted. ZDNet reported recently that one of the forecasts from analyst firm Gartner for the rest of 2022 is that hackers will “weaponize” operational technology environments to cause human casualties. Gartner noted that attacks on what is “often the brains behind industrial systems in factories or power grids” have already become more common and more disruptive.
And a report earlier this year from the Norwegian firm DNV, titled “The Cyber Priority,” found that 85% of 940 professionals working in the power, renewables, and oil and gas sectors believe cyberattacks in those industries are likely to cause operational shutdowns and damage to energy assets and critical infrastructure in the next two years. More than half (57%) expect them to cause loss of life.
So far, instances of that kind of damage from ICS attacks are rare — at least officially. The BBC reported earlier this month about multiple cyberattacks on June 27 against a steelmaker in Iran that caused a major fire. But according to CNN, those attacks were outliers — among very few to cause physical damage in more than a decade, since the infamous Stuxnet attack on the Iranian nuclear program in 2010.
That attack, attributed to the U.S. and Israel, destroyed nearly 1,000 uranium enrichment centrifuges by tricking the system monitoring them into reporting that they were operating normally when in fact they were spinning out of control.
The hacking group Predatory Sparrow claimed responsibility for the recent attack, which it said in a post on the messaging app Telegram was in retaliation for “aggression” by the Islamic Republic. But the group also said it had carried out the attacks “carefully to protect innocent individuals.”
Whatever measures were allegedly taken to avoid violations of international law, Cyber Policy Journal Editor Emily Taylor told the BBC, “If this does turn out to be a state-sponsored cyberattack causing physical — or in the war studies jargon ‘kinetic’ — damage, this could be hugely significant.”
The danger isn’t theoretical
Perhaps significant, and definitely ominous, but not a major game changer, according to Joe Weiss, managing partner at Applied Control Solutions and an ICS expert. Weiss wrote in a blog post responding to the BBC story that he thinks it highly likely that kinetic attacks on ICSs are not rare at all. It’s just that there is no good way to tell whether an incident is accidental or malicious unless an attacker claims credit, as Predatory Sparrow did.
He suspects there are many more malicious attacks on ICSs that are not officially classified as such because control systems don’t have the capability built into them for forensic analysis of the cause of an incident.
Indeed, Weiss has compiled a nonpublic database that he says includes a long list of kinetic cyber incidents on ICSs. They include “massive environmental spills, pipeline ruptures, forced electric outages, fires in datacenters, ships being sent off-course, chiller motors being damaged, power plant turbines destroyed, tilting offshore oil platforms, steel mill furnace and oxygen systems damaged, and other(s).”
He contends that much of the problem with ICS security is that there is too much focus on network security and not nearly enough on the control systems themselves.
“IT network cyberattacks do not damage equipment such as turbines, transformers, pumps, motors, valves, etc.,” he wrote in another blog post a year ago.
“For control systems, it is the opposite. Usually when a control system is impacted, the effects can’t be hidden — a pipe breaks, a train crashes, the lights go out, sewage is discharged, etc. Instead, the challenges are identifying if cyber electronic communications played a role [which would] distinguish an attack from an accident.”
“Yet networks are the government’s focus,” he said.
Michael Fabian, principal security consultant with the Synopsys Software Integrity Group, said it’s not that simple. He agrees that attention should be focused on both the cybersecurity of operational technology (OT) and information technology (IT).
But he said it’s not as though there is one kind of attack against IT or networks and another against OT. “While attacking every system requires a degree of specificity, a lot of these attacks can be similar outside of niche cases,” he said. “Often you don’t need to do any esoteric attacks to achieve some sort of disruption. A look at a lot of the released vulnerabilities reveal common implementation and design errors we’ve been combatting since the Rainbow Series.”
“Cyber is complex, difficult, and can be expensive,” he added. “In some cases the business systems, which have higher interaction with enterprise/business users, suffer an attack, and that cascading system-level effect leads to disruption of production.”
Neither CISA nor SynSaber responded to a request for comment about physical damage from cyberattacks that have exploited ICS vulnerabilities.
Patching is complicated
But, as numerous ICS experts have said in the past, this is a problem that doesn’t have an easy fix. One major reason is that applying software or firmware updates to critical infrastructure is nothing like updating the apps on your smartphone, where new versions show up overnight and take just a tap of a finger to install.
As Jonathan Knudsen, head of global research within the Synopsys Cybersecurity Research Center, put it, “Patching software is fundamentally at odds with the always-on, don’t-touch-it-when-it’s-working nature of critical infrastructure. Critical infrastructure prioritizes availability, and introducing software updates poses a threat to that availability.”
Indeed, while one of the major mantras in software security is to keep software up to date, doing that with complex control systems carries its own risks. “Every time you apply a patch, there is a risk that things will stop working,” Knudsen said. “Patches should be rigorously evaluated in a test environment before being applied to live systems. Even so, no test environment will completely simulate reality, so patching is always risky.”
That is one of the points SynSaber makes as well. “Organizations must determine the operational risk and follow internal configuration management policies and procedures” before applying an update, the report said.
Fabian agrees. “Safety and reliability testing are critical to maintaining operation of those systems and any software changes need to be validated in that environment.
“For example, most operators cannot just freely patch a turbine control system on their own and still guarantee its safe operation unless those remediations have been validated through testing.”
“With every system potentially being slightly different, it’s a massive engineering operation to ensure those fixes are appropriate for each system.”
Beyond that, even if there is a software or firmware patch available and it won’t gum up the operation of a system, that doesn’t mean an organization can immediately apply it. “Asset owners are still required to work with the affected original equipment manufacturer (OEM) or vendor and wait for official approval to patch (…) due to complicated interoperability and warranty constraints that apply to industrial control systems,” according to the SynSaber report.
So yes, it’s complicated and often requires more than applying an update, even if a vendor validates it. “Managing risk in critical infrastructure is a balancing act between the risk of attack and the risk of breakage from patching,” Knudsen said.
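To make that balancing act slightly more concrete, here is a small, hypothetical sketch of a patch-decision gate reflecting the constraints discussed above: OEM/vendor approval, validation in a test environment, and an available maintenance window. The fields, thresholds, and recommended actions are illustrative assumptions, not a real ICS patch-management workflow or vendor API.

```python
# Hypothetical sketch of a patch-decision gate for an ICS asset owner.
# The checks mirror the constraints described above; none of this is a
# real product or standard workflow.
from dataclasses import dataclass

@dataclass
class PatchCandidate:
    advisory_id: str
    cvss_score: float            # severity of the vulnerability the patch addresses
    validated_in_test_env: bool  # passed safety/reliability testing on a replica system
    oem_approved: bool           # vendor sign-off, per interoperability/warranty constraints
    maintenance_window: bool     # a planned outage window is available

def decide(patch: PatchCandidate) -> str:
    """Recommend an action for one patch candidate; order reflects the constraints above."""
    if not patch.oem_approved:
        return "defer: await OEM/vendor approval and rely on compensating controls"
    if not patch.validated_in_test_env:
        return "defer: validate in a test environment first"
    if patch.cvss_score >= 9.0 or patch.maintenance_window:
        return "schedule: apply at the next maintenance window"
    return "monitor: track the advisory and revisit at the next planned outage"

# Example: a critical, vendor-approved, test-validated fix with no window yet scheduled.
print(decide(PatchCandidate("ICSA-22-0001", 9.8, True, True, False)))
```

In practice, each of those booleans hides a great deal of engineering work, which is Fabian’s point about validating fixes for every slightly different system.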
Promising trends
But amid the bad news, there are some promising trends. The SynSaber report noted that OEMs themselves reported a majority (56%) of the vulnerabilities flagged by CISA for the first half of this year.
“That’s a good thing,” Fabian said. “It means OEMs are going back through their technical backlog, identifying issues, and pushing remediation.”
And the better news for users is that even if they can’t or shouldn’t patch, they aren’t helpless. “A guiding principle is defense-in-depth,” Knudsen said. “Suppose you have a piece of software with a vulnerability, but exploiting it requires physical access. Put the computer in a locked room, in a building that requires keycard access, in a facility that has a fence around it, with video surveillance and guards. Make the attacker’s job harder.”