
Cybersecurity Lessons from the Pandemic: Perception of Risk

The more “mature” among us may recall when decision-making under uncertainty was based on the concept of “rational economic man.” We estimated or calculated the probability and amount of a loss (or gain) for various courses of action, multiplied the two together to arrive at a range of expected losses (and gains), and selected the supposedly most advantageous option. We also resolved differences between current and future gains and expenses using formal net present value (NPV) and internal rate of return (IRR) methods to bring future cash flows back to the present day, allowing for the ready comparison of various investments.
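As a concrete illustration of that arithmetic, here is a minimal Python sketch of the expected-value and NPV calculations. The probabilities, loss amounts, and discount rate below are hypothetical figures chosen purely for illustration; IRR, not computed here, would be the discount rate at which the NPV comes out to zero.

```python
# Illustrative sketch of "rational economic man" arithmetic: expected
# loss and net present value. All figures below are hypothetical.

def expected_value(outcomes):
    """Expected value of a course of action: sum of probability * amount."""
    return sum(p * amount for p, amount in outcomes)

def npv(rate, cash_flows):
    """Net present value: discount each future cash flow back to today.
    cash_flows[0] is the time-0 flow (e.g., the initial outlay, negative)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical comparison: a 5% chance of a $2M loss if we do nothing,
# versus a 0.5% residual chance after investing in controls.
do_nothing = expected_value([(0.05, -2_000_000)])      # -$100,000 expected
with_controls = expected_value([(0.005, -2_000_000)])  # -$10,000 expected

# Hypothetical investment: $150K outlay now, $60K of avoided losses per year.
investment = npv(0.08, [-150_000, 60_000, 60_000, 60_000])

print(do_nothing, with_controls, round(investment))  # positive NPV -> invest
```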

Back then, that was the thing to do, despite questions about the accuracy of the cash-flow estimates, the assignment of probabilities, and the appropriateness of the rates of return.

Then the field of behavioral economics burst onto the scene in the 1970s and 1980s, and prior methods were thrown into disarray. Rational economic man no longer existed as such. Real-world decisions, it turned out, were made by irrational decision-makers who relied on perceptions rather than algorithms in risky circumstances. Initial arguments between the two schools of thought were (and sometimes still are) somewhat contentious, but some of that hostility appears to have died down, especially as pioneers of behavioral economics, such as Daniel Kahneman and Richard Thaler, have been recognized with Nobel prizes.

When we look at responses to the COVID-19 pandemic, we see a whole range of divergent opinions leading to very different policies and decisions. Why is that? Paul Slovic has worked extensively on risk perception, as in his paper “Perception of Risk Posed by Extreme Events,” co-authored with Elke U. Weber and presented at an April 2002 conference, which may be downloaded from https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2293086. Slovic also edited an entire book devoted to the subject, “The Perception of Risk” (Earthscan, 2000).

More recently, Robert Meyer and Howard Kunreuther published “The Ostrich Paradox: Why We Underprepare for Disasters” (Wharton School Press, 2017), which is particularly applicable to today’s many catastrophic events, including the pandemic, climate change, wildfires, and the high winds, storm surges, and heavy rainfall of hurricanes. In their book, Meyer and Kunreuther claim that preparedness errors “…can be traced to the harmful effects of six systematic biases…,” namely:

Myopia bias—a tendency to focus on overly short future time horizons …

Amnesia bias—a tendency to forget too quickly the lessons of past disasters

Optimism bias—a tendency to underestimate the likelihood that losses will occur from future hazards

Inertia bias—a tendency to maintain the status quo or adopt a default option

Simplification bias—a tendency to selectively attend to only a subset of the relevant factors

Herding bias—a tendency to base choices on the observed actions of others

The authors also sprinkle a number of other biases throughout the book, for example:

Availability bias—the tendency to estimate the likelihood of a specific event occurring on the basis of one’s own experience

Compounding bias—the tendency to focus on the low probability of an adverse event in the immediate future rather than on the relatively higher probability over a longer time period

Anchoring bias—the tendency to be overly influenced by short-term considerations that come easily to mind

The main recommendation of the book is to improve preparedness by conducting a “behavioral risk audit,” which comprises four steps:

  1. List the six biases
  2. Describe the impact of each bias on beliefs leading to underestimation of risks
  3. Analyze how misbeliefs about risk can degrade preparation measures
  4. Design incentives and persuasive tactics to overcome preparedness errors

The authors then examine a series of cases and suggest remedies.
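Applied to cyber risk, those four steps might be organized as a simple table-driven checklist. The following Python sketch is purely illustrative: the bias names are the book’s, but the cyber misbeliefs and countermeasures are hypothetical examples of my own, not the authors’ cases.

```python
# A minimal, hypothetical sketch of a "behavioral risk audit" checklist
# applied to cyber risk. Bias names follow Meyer and Kunreuther; the
# misbelief/degradation/countermeasure entries are illustrative assumptions.

AUDIT = [
    # (bias, misbelief it produces, how it degrades preparation, countermeasure)
    ("Myopia", "breach costs feel distant", "security spending deferred",
     "show multi-year expected losses, not single-year budgets"),
    ("Amnesia", "last incident is forgotten", "patching discipline decays",
     "institutionalize post-incident reviews; replay past incidents"),
    ("Optimism", "'we won't be targeted'", "no incident response plan",
     "use base rates for comparable firms, not gut feel"),
    ("Inertia", "defaults are left in place", "weak configurations persist",
     "ship secure defaults so inertia works in your favor"),
    ("Simplification", "only headline threats considered", "coverage gaps",
     "structured threat modeling across the full attack surface"),
    ("Herding", "'we do what our peers do'", "shared blind spots",
     "benchmark against attacker behavior, not just industry norms"),
]

for bias, misbelief, degradation, fix in AUDIT:
    print(f"{bias}: {misbelief} -> {degradation}; counter: {fix}")
```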

If you go through the above biases and relate them to the opinions and decisions we are witnessing around the COVID-19 pandemic, the rationale behind those opinions and decisions aligns closely with the biases.

Unfortunately, we see the same biases and lack of preparedness when we look at cybersecurity risk. Were we unlucky enough to experience a major cyberattack or cyber-related failure, there is little to no evidence that effective plans are in place to deal with the situation. There is a strong belief among the general nontechnical populace that technology will save the day and that the geeks will be able to resolve any situation. We have experienced near misses in the past, and these have served to give us confidence that we’ll get through any future cyber situations. But this view is fraught with the biases that we always bring to the table. So, what are the unbiased scenarios, and what means are available to deal with them? I shudder to think what they might be—or not be—respectively. It is time to conduct a behavioral risk audit of cyberspace, come up with some real protection and mitigation plans, and then find the resolve to prepare for potential catastrophic cyber events.


*** This is a Security Bloggers Network syndicated blog from BlogInfoSec.com authored by C. Warren Axelrod. Read the original post at: https://www.bloginfosec.com/2020/09/28/cybersecurity-lessons-from-the-pandemic-perception-of-risk/