HIPAA Security Requirements: What They Really Mean

The University of Texas M.D. Anderson Cancer Center was having a hard time protecting patients’ electronic health information. In 2012, an employee’s laptop containing ePHI for about 30,000 patients was stolen. The same year, a trainee lost an unencrypted thumb drive with ePHI for about 2,000 people during her evening commute, and in 2013, a visiting researcher misplaced another unencrypted thumb drive containing ePHI for about 3,600 people. There was no evidence that any of the lost devices were used or that the ePHI was accessed by anyone, but the state-run cancer center had clearly failed to protect the data, most notably by failing to encrypt these records.

The Department of Health and Human Services (HHS) investigated M.D. Anderson for violations of the HIPAA and HITECH laws and regulations, arguing specifically that the cancer center failed, under HITECH, to “[i]mplement a mechanism to encrypt” ePHI or to adopt some other “reasonable and appropriate” method of limiting access to patient data, 45 C.F.R. §§ 164.312(a)(2)(iv), 164.306(d) (the “Encryption Rule”), and that it improperly “disclosed” ePHI under the HITECH “disclosure” rule, 45 C.F.R. § 164.502(a).

HHS imposed a fine of $4,348,000 against M.D. Anderson, and administrative and court appeals followed. On January 14, 2021, the United States Court of Appeals for the Fifth Circuit (which includes Texas) found that HHS’s findings, specifically that the hospital had no “mechanism to encrypt” health records and that it improperly “disclosed” those records, were arbitrary and capricious, and reversed the fines. The federal appeals court distinguished between a failure of encryption and a failure to have a mechanism to encrypt, noting that a company could have a bulletproof encryption procedure, encrypt thousands of computers and millions of thumb drives, and still inadvertently fail to encrypt a few drives, resulting in a security breach. In the Court’s view, this would not constitute a wholesale failure to have a mechanism for encryption; it would be a failure to have a “perfect” mechanism for encryption, something the law and regulation do not require. As the Court described it:

The dispute in this case is whether M.D. Anderson should’ve done more—either to implement a different mechanism or to better implement its chosen mechanism. The Government adamantly argues yes. … But it’s plainly irrational to say that M.D. Anderson’s desire to do more in the future means that in the past it “failed to encrypt patient data on portable media at all.” … Nothing in HHS’s regulation says that a covered entity’s failure to encrypt three devices means that it never implemented “a mechanism” to encrypt anything at all. … The regulation requires only “a mechanism” for encryption. It does not require a covered entity to warrant that its mechanism provides bulletproof protection of “all systems containing ePHI.” Nor does it require covered entities to warrant that all ePHI is always and everywhere “inaccessible to unauthorized users.”

On the issue of whether the cancer center improperly “disclosed” or “released” ePHI in violation of HIPAA and HITECH, the Court took issue with the administrative law judge’s (ALJ’s) conclusion that “any loss of ePHI is a ‘release,’” even if the covered entity did not act to set anything free. It defies reason to say an entity affirmatively acts to disclose information when someone steals it; that is not how HHS defined “disclosure” in the regulation. Moreover, without knowing the identity of the person who stole the data, the Court concluded, HHS could not prove that the thief was outside the organization and therefore not authorized to see the data.

Additionally, the Court found that the manner in which HHS applied the regulations was so inconsistent as to be arbitrary and capricious: in cases involving more egregious losses of data, HHS had imposed no fines or penalties whatsoever. HHS could offer no reasonable justification for the disparity beyond asserting that it considered each matter on a “case-by-case” basis. That wasn’t good enough for the Court.

Finally, the Court addressed the nature and scope of the fines. The Court noted that the cancer center’s loss of data was due to “reasonable cause” and not “willful neglect,” 42 U.S.C. § 1320d-5(a)(1)(B), and that, by statute, “the total amount imposed on the person for all such violations of an identical requirement or prohibition during a calendar year may not exceed $100,000,” 42 U.S.C. § 1320d-5(a)(3)(B). In assessing the fines, regulations under 45 C.F.R. § 160.408(b) required HHS to consider (1) whether the violation caused physical harm; (2) whether the violation resulted in financial harm; (3) whether the violation resulted in harm to an individual’s reputation; and (4) whether the violation hindered an individual’s ability to obtain health care. Not only did HHS fail to prove any of these factors, it failed even to consider them. The Court vacated the $4 million-plus fine and sent the case back to the ALJ.

The Takeaway

For privacy and security regulations to have ‘teeth,’ there must be the possibility of enforcement: fines and penalties. It’s not sufficient to say, “We usually encrypt,” or “We often encrypt,” or “We frequently protect data.” The problem with security has always been that the defender must defend against all possible attacks, while the attacker need succeed only once. To a patient whose records have been compromised, it makes no sense to say, “Sure, but we protected other data, so why complain?” The HIPAA/HITECH laws and regulations certainly don’t mandate perfect security, but losing three devices in the course of a few months – none of which was encrypted at all – certainly doesn’t look good. And if you have a “system of encryption” that doesn’t actually require that computers and thumb drives be encrypted, what you really have is a policy – not a system.
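The policy-versus-system distinction can be made concrete. A minimal sketch, purely illustrative (the device fields, names, and inventory shape below are assumptions, not any real compliance tool or anything from the opinion): a “mechanism” implies something that actually verifies device state and flags the gaps, rather than a document saying devices should be encrypted.

```python
# Hypothetical sketch: an encryption *mechanism* includes a check that
# enumerates devices and flags any holding ePHI without encryption.
# The Device fields and example fleet are illustrative assumptions.

from dataclasses import dataclass
from typing import List


@dataclass
class Device:
    device_id: str
    holds_ephi: bool   # device stores electronic protected health information
    encrypted: bool    # encryption verified as enabled on the device


def audit_encryption(inventory: List[Device]) -> List[Device]:
    """Return every device that holds ePHI but is not encrypted.

    An empty result means the mechanism is being enforced across the
    fleet; a non-empty result is a compliance gap to close before a
    device is lost or stolen, not after.
    """
    return [d for d in inventory if d.holds_ephi and not d.encrypted]


# Example fleet: one encrypted laptop, one thumb drive that slipped
# through unencrypted, and one drive that holds no ePHI at all.
fleet = [
    Device("laptop-01", holds_ephi=True, encrypted=True),
    Device("usb-17", holds_ephi=True, encrypted=False),
    Device("usb-18", holds_ephi=False, encrypted=False),
]

for gap in audit_encryption(fleet):
    print(f"NON-COMPLIANT: {gap.device_id} holds unencrypted ePHI")
```

The point of the sketch is the feedback loop: a policy states an expectation, while a mechanism measures compliance with it and surfaces the exceptions, which is exactly the gap the three lost devices exposed.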

On the other hand, and I can’t say this emphatically enough, there is an erroneous assumption that every data breach involving ePHI is a HIPAA violation, and that every “loss of control” of data is an improper disclosure of ePHI. That was never the intention of the law or of the regulation. The goal was to have healthcare entities understand the sensitive nature of medical records and take reasonable (not perfect) steps to protect them from improper disclosure and use. It was not a guarantee of privacy, or a guarantee of absolute security. The standard was, at least at first, to “do something” (since many providers were doing nothing at all), and then “do the reasonable thing,” and next, “do the right thing,” and finally, “do the best thing.” The Court’s decision reflects the position that the HHS encryption and disclosure rules require covered entities to “do the reasonable thing” but not to do “everything.” If you have a reasonable procedure for encryption that uses both reasonable technologies and reasonable enforcement, and people are reasonably trained in the technology and in how to deploy it, not every failure to follow it is a failure to implement encryption.

The difficulty for both regulators and regulated entities is in determining what is “good enough.” The problem right now is that we have “audit-based” or “breach-based” security for regulated entities. If there is a breach, then there is a “disclosure,” and, therefore, a violation of HIPAA/HITECH. It’s a res ipsa loquitur argument. This forces entities to focus on data breaches rather than holistic security, and often makes security the enemy of functionality. Similarly, we conduct constant HIPAA audits on policies, procedures, etc., looking for violations. So, which is worse – a major, unaddressed vulnerability that has not led to a data breach, or a minor policy violation that has? Which is a worse HIPAA violation? Which will lead to greater fines or penalties? And what about the entity that has a major vulnerability but doesn’t know it because it has never performed or been subject to an audit? Unless and until that entity discovers it has had a data breach, HHS may never learn of the vulnerability or the HIPAA violation.

HHS needs to have the power to impose fines for true violations. Sometimes, these fines need to be severe and consequential. Mere failures of security – even when they have bad results – should result in orders to compensate the privacy victims, not necessarily pay off HHS. But willful, deliberate and repeated failures to do the basic things – even when no breach occurs – should permit HHS to bring down the hammer. That, certainly, is consistent with what Congress intended when it passed HIPAA and HITECH. We will see how the Courts continue to interpret it.


Mark Rasch

Mark Rasch is a lawyer and computer security and privacy expert in Bethesda, Maryland, where he helps develop strategy and messaging for the Information Security team. Rasch’s career spans more than 35 years of corporate and government cybersecurity, computer privacy, regulatory compliance, computer forensics and incident response. He is trained as a lawyer and was the Chief Security Evangelist for Verizon Enterprise Solutions (VES). He is a recognized author of numerous security- and privacy-related articles. Prior to joining Verizon, he taught courses in cybersecurity, law, policy and technology at various colleges and universities, including the University of Maryland, George Mason University, Georgetown University, and the American University School of Law, and was active with the American Bar Association’s Privacy and Cybersecurity Committees and the Computers, Freedom and Privacy Conference. Rasch has worked as cyberlaw editor for SecurityCurrent.com, as Chief Privacy Officer for SAIC, and as Director or Managing Director at various information security consulting companies, including CSC, FTI Consulting, Solutionary, Predictive Systems, and Global Integrity Corp. Earlier in his career, Rasch was with the U.S. Department of Justice, where he led the department’s efforts to investigate and prosecute cyber and high-technology crime, starting the computer crime unit within the Criminal Division’s Fraud Section, efforts which eventually led to the creation of the Computer Crime and Intellectual Property Section of the Criminal Division. He was responsible for various high-profile computer crime prosecutions, including Kevin Mitnick, Kevin Poulsen and Robert Tappan Morris. He has been a frequent commentator in the media on issues related to information security, appearing on BBC, CBC, Fox News, CNN, NBC News, ABC News, the New York Times, the Wall Street Journal and many other outlets.
