The Security Profession Needs to Adopt Just Culture

Posted under: Research and Analysis

Yesterday Twitter revealed they had accidentally stored plain-text passwords in some log files. There was no indication the data was accessed and users were warned to update their passwords. There was no known breach, but Twitter went public anyway, and was excoriated in the press and… on Twitter.

This is a problem for our profession and industry. We get locked into a cycle where any public disclosure of a breach or security mistake results in:

  • People ripping the organization apart on social media without knowing the facts.
  • Vendors issuing press releases claiming their product would have prevented the issue, without knowing the facts.
  • Press articles focusing on the worst case scenario without any sort of risk analysis… or facts.
  • Plenty of voices saying how simple it is to prevent the problem, without any concept of the complexity and scale of even simple controls (remember kids, simple doesn’t scale).

To be clear, there are cases where organizations are negligent or try to cover up their errors. When a press release says things like “very sophisticated attack,” the infosec fairies deservedly lose their wings. But more often than not we focus on blame, not cause. This is true both in public and in internal investigations.

This is a problem many industries have faced, and two in particular have performed extensive research and adopted a concept called Just Culture. It’s time for security to formally adopt Just Culture, including adding it to certifications and training programs.

Aviation and healthcare are two professions/industries that use Just Culture, to different degrees. Since my background and introduction are on the healthcare side, that’s where I’ll draw from.

First, read this paper, available through the National Institutes of Health:

The focus in Just Culture is to identify and correct the systemic cause, not to blame the individual. Here are some choice quotes:

People make errors. Errors can cause accidents. In healthcare, errors and accidents result in morbidity and adverse outcomes and sometimes in mortality.

One organizational approach has been to seek out errors and identify the responsible individual. Individual punishment follows. This punitive approach does not solve the problem. People function within systems designed by an organization. An individual may be at fault, but frequently the system is also at fault. Punishing people without changing the system only perpetuates the problem rather than solving it.

A just culture balances the need for an open and honest reporting environment with the end of a quality learning environment and culture. While the organization has a duty and responsibility to employees (and ultimately to patients), all employees are held responsible for the quality of their choices. Just culture requires a change in focus from errors and outcomes to system design and management of the behavioral choices of all employees.

In a just culture, both the organization and its people are held accountable while focusing on risk, systems design, human behavior, and patient safety.

The focus is on the systemic risk first, and the individual… later. This is something we face in healthcare/rescue every day, and many errors are the result of the system more than of the person. For example, in some prehospital systems it isn’t uncommon to have two medications with vastly different effects packaged in very similar packaging, resulting in medication errors that can be fatal. The answer isn’t better training, but better packaging.

Fix the system, don’t expect perfect behavior.

Let’s apply this to Twitter. Plain text passwords were stored in logs. This is bad, but there are a lot of ways it could have happened. Think of all the levels of logging and software components they likely have and all the places the passwords might have fallen into those logs. With a Just Culture approach we should reward Twitter for their honesty and learn what techniques they used to detect the exposed data, and what allowed the data to be saved in those logs and undiscovered for so long.
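As a concrete illustration of the systemic-fix mindset (not Twitter’s actual code or pipeline, which we don’t know): a catch-all request logger is one common way credentials leak into logs, and a redaction filter attached to every handler is one common systemic control. This is a minimal sketch in Python; the field names and pattern are assumptions for illustration.

```python
import logging
import re

# Hypothetical pattern: match common credential-style key=value pairs.
SENSITIVE = re.compile(r"(password|passwd|token)=\S+", re.IGNORECASE)

class RedactingFilter(logging.Filter):
    """Scrub anything that looks like a credential before a handler writes it."""
    def filter(self, record: logging.LogRecord) -> bool:
        record.msg = SENSITIVE.sub(r"\1=[REDACTED]", str(record.msg))
        return True  # keep the record, just scrubbed

logger = logging.getLogger("app")
handler = logging.StreamHandler()
handler.addFilter(RedactingFilter())  # the fix lives in the system, not the caller
logger.addHandler(handler)
logger.setLevel(logging.INFO)

# Even if a developer logs carelessly, the secret never reaches the log file.
logger.info("login attempt user=alice password=hunter2")
```

The point is the placement: the control sits in the logging pipeline itself, so no individual engineer has to remember to behave perfectly on every log line.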

What system issues caused the problem, and how can we prevent them moving forward? Not “Twitter was stupid and got hacked” (because they didn’t).

Just Culture is about fostering an open culture of safety where mistakes, even individual mistakes, are used to improve the overall system resilience. It’s our time.

– Rich

*** This is a Security Bloggers Network syndicated blog from Securosis Blog authored by [email protected] (Securosis). Read the original post at:
