Cybersecurity Lessons from the Election: Human Behavior - Security Boulevard

There have been a number of recent articles in the popular press suggesting that behavioral science can help explain people’s responses to COVID-19 and indicate how individuals might be persuaded to act in line with the common good. This concept is examined in two recent articles: one scholarly, providing an extensive review of the field,[i] and the other, in the popular press, examining vaccine prioritization more specifically.[ii]

I was surprised to see that none other than the popular Dr. Anthony Fauci, who has served as the director of the National Institute of Allergy and Infectious Diseases since 1984, has been raising his estimate of the percentage of the population needed for herd immunity to the novel coronavirus in order to “nudge” individuals to agree to be vaccinated.[iii] This is certainly an example of behavioral science at work, although it is not clear how effective it has been or will be since the impact of the nudges has not been subjected to scientific experimentation or analysis. Still, it is interesting that the nudging process is being used this way.

In my Ph.D. dissertation and subsequent book, “Computer Effectiveness: Bridging the Management-Technology Gap” (Information Resources Press, 1979), I suggested that a “central control” provide users with costs and job priorities that would nudge them to alter their job processing requirements so as to optimize the total work done rather than optimizing locally, i.e., only for themselves. This presumes that individuals will typically operate in their own self-interest rather than for the benefit of all, which is often the case, and that they need to be encouraged to respond in such a way as to optimize globally.
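The “central control” idea can be illustrated with a toy sketch. The numbers and the Python below are mine, not from the book: two users each submit a job at “peak” or “off-peak” hours, peak jobs congest one another, and a central congestion charge nudges each user’s selfish choice toward the schedule that minimizes total cost.

```python
from itertools import product

OFF_COST = 1.9  # off-peak turnaround: slower, but uncongested

def private_cost(choice, profile, peak_price):
    """Cost as one user sees it, including any central congestion charge."""
    if choice == "off":
        return OFF_COST
    n_peak = sum(1 for c in profile if c == "peak")
    return 1.0 + 0.4 * n_peak + peak_price  # delay grows with peak load

def is_equilibrium(profile, peak_price):
    """No user can lower their own cost by unilaterally switching slots."""
    for i, choice in enumerate(profile):
        other = "off" if choice == "peak" else "peak"
        alt = profile[:i] + (other,) + profile[i + 1:]
        if private_cost(other, alt, peak_price) < private_cost(choice, profile, peak_price):
            return False
    return True

def social_cost(profile):
    """True total cost; the charge is a transfer, so it is excluded here."""
    return sum(private_cost(c, profile, 0.0) for c in profile)

profiles = list(product(["peak", "off"], repeat=2))
best = min(profiles, key=social_cost)                      # global optimum
selfish = [p for p in profiles if is_equilibrium(p, 0.0)]  # no nudge
nudged = [p for p in profiles if is_equilibrium(p, 0.2)]   # with charge
```

With no charge, “peak” is each user’s best move regardless of the other, so both pile into peak hours and total cost is worse than the optimum; with a 0.2 charge, the selfish equilibria coincide with the globally optimal schedule. That is the local-versus-global optimization point in miniature.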

This type of control, using incentives and feedback, would benefit the public response to officials’ requests to alter behavior, such as wearing masks and physical distancing. One can use a carrot-or-stick approach depending on the prevalent culture of the country; the idea is to encourage, or force, individuals to act in the public interest. From observation, the “stick” approach appears to be effective in authoritarian societies, while the “carrot” approach does not seem to have as much impact in more liberal countries. One might argue that, for the latter, there is no linkage between, say, distributing financial assistance and requiring conformance with public-health guidelines. Without getting into the politics, it would seem that providing specific incentives and disincentives would be more effective than current influencing approaches.

So, what does this mean for cybersecurity? The SolarWinds cyber catastrophe is a timely example of the failure to assign responsibility and to implement measures that encourage effective preventative behavior. Essentially, while the SolarWinds corporation’s shareholders, management, and employees are taking a beating, the victims of the attack—both government agencies and corporations—are likely to claim that they should not be blamed for applying a software update containing potentially very damaging malware, since at least 18,000 customers of SolarWinds did the same. This “tragedy of the commons” situation (which we have discussed a number of times in this column) is such that no one, in either the public or private sector, is stepping up to address what turns out to be a very common and dangerous issue—namely, how one ensures that software supply chains satisfy acceptable security-assurance standards. Of course, before we can do that, we need explicit standards (which we do not have) and the means to enforce them (which we don’t have either).

Behavioral economics describes how one might encourage, or “nudge,” individuals and organizations to act in ways that benefit individuals, organizations, and society overall. Achieving a global optimum cannot guarantee that each and every individual will be better off, but that is not necessarily a requirement. To arrive at a so-called Pareto optimum—a state in which no one can be made better off without making someone else worse off—we need the flexibility to use disincentives as well as incentives.
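The Pareto test is mechanical enough to sketch in a few lines. The utilities and option names below are invented for illustration: an option is Pareto-optimal if no alternative makes at least one party better off without making any party worse off.

```python
# Hypothetical utilities of three policy options for two parties.
options = {
    "A": (3, 1),
    "B": (2, 2),
    "C": (1, 1),  # dominated by B: both parties do at least as well under B
}

def dominates(u, v):
    """u Pareto-dominates v: nobody worse off, somebody strictly better."""
    return all(a >= b for a, b in zip(u, v)) and any(a > b for a, b in zip(u, v))

pareto = [name for name, u in options.items()
          if not any(dominates(v, u) for v in options.values())]
# pareto == ["A", "B"]
```

Note that both A and B survive even though they favor different parties: Pareto optimality rules out waste (option C), but it does not by itself pick a winner—which is why incentives and disincentives are still needed to steer among the remaining options.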

It’s about time that nations stepped up to their responsibility to protect the worldwide Internet from intentional and accidental attacks that would take down today’s most valuable resource and that they came up with actionable and effective means of enforcement. Until that happens, the situation remains very precarious for everyone.


[i] Jay J. Van Bavel et al., “Using Social and Behavioural Science to Support COVID-19 Pandemic Response,” Nature Human Behaviour, Vol. 4, May 2020. Available at https://doi.org/10.1038/s41562-020-0884-z

[ii] Siobhan Roberts, “The Pandemic Is a Prisoner’s Dilemma,” The New York Times, December 20, 2020.

[iii] Donald G. McNeil, Jr., “How Much Herd Immunity Is Enough?” The New York Times, December 24, 2020.

*** This is a Security Bloggers Network syndicated blog from BlogInfoSec.com authored by C. Warren Axelrod. Read the original post at: https://www.bloginfosec.com/2021/01/18/cybersecurity-lessons-from-the-election-human-behavior/?utm_source=rss&utm_medium=rss&utm_campaign=cybersecurity-lessons-from-the-election-human-behavior