Privacy in Public Places

Does license plate information-gathering constitute an invasion of privacy?

As you drive down Robert E. Lee Memorial Highway in the Virginia suburbs of Washington, D.C., a police camera captures and reads your license plate. After checking to make sure that the owner of the car is not a wanted fugitive and that the car has not been reported stolen, the Fairfax County, Virginia, Police Department keeps a copy of the photograph of the license plate, along with the GPS coordinates of where the car was spotted. Cross-referenced, the Automated License Plate Reader (ALPR) databases can tell where an individual (well, technically, a car) was, with whom they were meeting and how long they were present and, depending on where the cameras are placed, can be used to infer an individual’s political persuasion, medical diagnosis and treatment, employment status and a host of other personal information.
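To make the mechanics concrete, here is a minimal, hypothetical sketch of the workflow described above: a plate read is checked against "hotlists" of stolen or wanted plates, but the record is retained either way. All names, plates and coordinates here are invented for illustration; this is not the Fairfax County system.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class PlateRead:
    """One ALPR sighting: the plate plus the time and place it was seen."""
    plate: str
    timestamp: datetime
    lat: float
    lon: float

# Illustrative hotlists -- plates flagged as stolen or associated with a fugitive.
STOLEN = {"ABC1234"}
WANTED = {"XYZ9876"}

def process_read(read: PlateRead, database: list) -> str:
    """Check the plate against the hotlists, then retain the record regardless.
    The retention step, not the hotlist check, is what the Virginia case turned on."""
    hit = read.plate in STOLEN or read.plate in WANTED
    database.append(read)  # stored even when there is no hit
    return "alert" if hit else "no hit"

db: list = []
status = process_read(
    PlateRead("OUTATIME", datetime(2018, 5, 29, 8, 15), 38.84, -77.30), db
)
print(status, len(db))  # -> no hit 1
```

The point of the sketch is the last line of `process_read`: even an innocent plate ends up in the database, which is why the aggregate collection raises privacy questions that a one-time stolen-car check does not.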

But the Fairfax County Police Department claims the records are not “personal” and are not about any individual. The license plates are just a series of random letters and numbers (“OUTATIME”) that reveal nothing. They are issued by, and technically are the property of, the Commonwealth that issued them. They don’t show who was operating the car—at most, they show who the registered owner is. And besides, you voluntarily expose this information to the public when you drive around. How could anyone think that this information is private? How could anyone have a reasonable expectation of privacy in it?

In a recent case, the Virginia Supreme Court rejected a lower court’s finding that “a license plate number is not personal information” under the Virginia privacy law because it refers to a vehicle rather than a person. The statute in question defined “personal information” to include “agency-issued identification number[s]” that “affords a basis for inferring personal characteristics.” The court acknowledged that license plates are random numbers and letters that don’t themselves identify any personal characteristics, and that they are assigned to vehicles rather than to individuals. The court also noted that more than one person may own a vehicle, and that vehicles can be owned by corporations or other entities. Even so, it rejected the conclusion that a plate number therefore falls outside the statute.

In addition, the court noted that police were entitled to collect such data if it “specifically pertains to investigations and intelligence-gathering relating to criminal activity.” The police argued that they could collect and maintain the database of ALPR records for later investigative or intelligence-gathering purposes. The court accepted that the police could collect data for such future purposes—even for crimes that had not yet been committed—but found that “the Police Department collected and retained personal information without any suspicion of criminal activity at any level of abstraction” and therefore that “the Police Department’s sweeping randomized surveillance and collection of personal information does not ‘deal’ with investigations and intelligence gathering related to criminal activity.” The court remanded the case to the lower court to determine whether the police department’s collection and use of the data constituted what the law calls an “information system,” and therefore whether its actions violated the statute.

A few things to point out here. First, the court was not dealing with privacy generally, but with a specific Virginia statute governing how government agencies may collect, store and use certain types of personal information. In the absence of such a statute, the only limitations on the collection and use of the data are general principles of privacy and the Fourth Amendment’s right of persons to be secure in their “persons, houses, papers, and effects” against “unreasonable searches and seizures.” The Virginia Supreme Court did not address this issue. The court also did not address whether the police could capture the data in the first place, whether there was any expectation of privacy, or whether the capture, storage and use of the data constituted a “search” at all. In fact, under this case, there is nothing to prevent a private entity from setting up its own ALPR systems and selling the databases to credit reporting agencies, repossession companies, private investigators, divorce lawyers or others. Indeed, this is exactly what happens in the United States.

The case also did not discuss whether people have any expectation of privacy in public spaces. Traditionally, courts have held that when things happen in “open fields” or other public places, people have no expectation of privacy in what can be seen—even if you have to use a drone or surveillance plane to see it. That’s why the police need neither a warrant nor probable cause to track your car by means of a beeper placed in a container you voluntarily took with you, although they do need a warrant to affix a GPS tracker to your car. But the ALPR case is different. While the license plate is exposed to the public and reveals very little about the operator, the collection, storage and use of massive databases of such data reveals (or at least allows inference of) where everyone is at any given time. Tracking every vehicle all the time is very different from tracking a specific vehicle for a limited duration. And checking every vehicle to see if it is stolen is very different from keeping permanent records of where every vehicle has been.
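The difference between a transient check and mass retention can be shown in a few lines: once individual sightings are retained, they can be aggregated into a time-ordered location trail for any plate. The plates, times and location labels below are hypothetical, chosen only to illustrate the kind of inference the article describes.

```python
# Hypothetical retained ALPR reads: (plate, timestamp, location label).
reads = [
    ("OUTATIME", "2018-05-01 08:00", "clinic"),
    ("JJZ109",   "2018-05-01 09:00", "highway"),
    ("OUTATIME", "2018-05-01 18:30", "union hall"),
    ("OUTATIME", "2018-05-02 08:05", "clinic"),
]

def location_history(plate, records):
    """Aggregate every sighting of one plate into a time-ordered trail --
    the kind of picture a one-time stolen-car check never produces."""
    return sorted((t, loc) for p, t, loc in records if p == plate)

trail = location_history("OUTATIME", reads)
print(trail)
```

Repeated sightings at the same place (the clinic each morning, the union hall in the evening) are exactly what supports inferences about medical treatment, employment or political affiliation, even though no single read reveals any of that.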

What This Means for You

With the EU’s General Data Protection Regulation (GDPR) right around the corner, and Congress considering more comprehensive privacy regulation in the United States, the case is significant in that it expresses a somewhat more expansive view of what constitutes “personal” information requiring some degree of protection. For commercial entities, it means that identifying information which can reveal information about a specific individual—such as cookie information, an IP address or other identifiers (the ALPR records, for example)—can, in context, be considered personal information that requires legal, privacy and security protection. The case means that companies can’t assume that data is not personal or private, or that they can collect or keep it just because they created it, or just because they felt they had a good reason to keep it.

The other thing the case holds is that personal data—even if collected in public places—can have privacy implications. Don’t assume that, just because you “found” that data in public (e.g., internet traffic analysis, log monitoring, Google searches, etc.), the data has no privacy implications. If you are attempting to “profile” a specific person (or a narrow category of individuals) or to target individuals, you are at risk of running afoul of both U.S. and international privacy laws (to the extent that there are U.S. privacy laws). The problem in the United States is that, for example, a medical diagnosis is likely covered by HIPAA privacy rules, but video surveillance outside an emergency room might not be—especially surveillance run by a company across the street from the hospital that is not a “covered entity.” Library records of book loans may be protected, but the cameras in the library may not be. In the United States, it’s not always about what data is captured, but how it is captured.

The Virginia case is a tiny baby step for privacy. It recognizes that license plate numbers such as “JJZ 109” (California), “BDR 529” (Illinois), “FAB-1” (UK), “ECTO-1” (New York), “MYPRSHE” (California) or “NRVOUS” (Illinois) reveal things about the owners or operators of the vehicles and their activities. In fact, that’s why the data is being collected in the first place. The case doesn’t say whether or how the data can be collected and used—that is context-specific. It doesn’t say whether the data collection and use is appropriate. At least not yet. But it does reject the notion that a series of numbers and letters is not meaningful from a privacy perspective.

From a GDPR perspective, entities must first determine what personal data they have collected or have access to. Only then can they determine the legal basis for the collection, its scope and extent, and the degree of protection afforded to that data. Just remember that individual countries are likely to adopt a more expansive definition of personal data than you might. And don’t assume you are protected just because the data was collected in public.

May 29, 2018
Mark Rasch

Mark Rasch is a lawyer and computer security and privacy expert in Bethesda, Maryland, where he helps develop strategy and messaging for the Information Security team. Rasch’s career spans more than 25 years of corporate and government cybersecurity, computer privacy, regulatory compliance, computer forensics and incident response. He is trained as a lawyer and was the Chief Security Evangelist for Verizon Enterprise Solutions (VES). He is a recognized author of numerous security- and privacy-related articles. Prior to joining Verizon, he taught courses in cybersecurity, law, policy and technology at various colleges and universities, including the University of Maryland, George Mason University, Georgetown University and the American University School of Law, and was active with the American Bar Association’s Privacy and Cybersecurity Committees and the Computers, Freedom and Privacy Conference. Rasch has worked as cyberlaw editor for SecurityCurrent.com, as Chief Privacy Officer for SAIC, and as Director or Managing Director at various information security consulting companies, including CSC, FTI Consulting, Solutionary, Predictive Systems and Global Integrity Corp. Earlier in his career, Rasch was with the U.S. Department of Justice, where he led the department’s efforts to investigate and prosecute cyber and high-technology crime, starting the computer crime unit within the Criminal Division’s Fraud Section, efforts that eventually led to the creation of the Computer Crime and Intellectual Property Section of the Criminal Division. He was responsible for various high-profile computer crime prosecutions, including those of Kevin Mitnick, Kevin Poulsen and Robert Tappan Morris. He has also been a frequent commentator in the media on issues related to information security, appearing on BBC, CBC, Fox News, CNN, NBC News and ABC News, and in the New York Times, the Wall Street Journal and many other outlets.
