VA High Court: License Plate Database Not Personal Data

Regulations related to the collection, storage and use of personal data don’t apply to the collection of license plate readings, a court has found, calling privacy regs into question

As you drive to George Mason University in Fairfax, Virginia, you may very well pass a blue and grey Fairfax County police car with its shiny lights and trunk-mounted Automated License Plate Reader (ALPR). The camera will take a picture of your license plate; scan it; analyze it; “read” the letters, numbers and state of issue; and compare it against a “hot list” of wanted or stolen cars or determine whether the owner of the vehicle is “wanted” and therefore stop the driver under the assumption that they might be the wanted owner.

But even if you are not driving, the Fairfax police can simply passively scan license plates of moving and parked cars, together with pictures of the cars themselves, as well as the precise GPS location of where that car (well, license plate) was at the time the image was captured. While the license plate itself tells the police nothing about the driver, owner or operator (it’s just a number), when combined with other databases such as NCIC/VIN or DMV records, the police can track where any person whose license plate was captured was on any particular day. They can do this in either direction: enter the license plate number into the ALPR database, see every time it appears, then run the plate through DMV and find out who the owner is; or, starting from a known owner (suspect), find through DMV records the license plate numbers of any vehicles they own and then ping the ALPR database. The more license plates captured and the longer the database is maintained, the more “useful” it is to police—and the more invasive of privacy. Currently, Fairfax County, Virginia, keeps its ALPR records for a year (minus a day).
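The two lookup directions described above amount to a simple join between two databases. Here is a minimal sketch in Python; all plates, names, timestamps and coordinates are invented for illustration:

```python
# Hypothetical illustration of the two ALPR lookup directions.
# All plates, names, and sightings below are invented.

# ALPR database: plate -> list of (timestamp, GPS location) sightings.
alpr_db = {
    "ABC-1234": [("2020-06-01 08:15", (38.8462, -77.3064)),
                 ("2020-06-02 17:40", (38.8304, -77.3078))],
    "XYZ-9876": [("2020-06-01 09:00", (38.8512, -77.2710))],
}

# DMV database: plate -> registered owner (the "other database").
dmv_db = {"ABC-1234": "Jane Doe", "XYZ-9876": "John Roe"}

def track_by_plate(plate):
    """Direction 1: start from a plate, pull its sightings, then look
    up the owner in DMV records."""
    return dmv_db.get(plate), alpr_db.get(plate, [])

def track_by_owner(owner):
    """Direction 2: start from a suspect, find their plates via DMV,
    then ping the ALPR database for each plate."""
    plates = [p for p, o in dmv_db.items() if o == owner]
    return {p: alpr_db.get(p, []) for p in plates}

owner, sightings = track_by_plate("ABC-1234")
print(owner, len(sightings))  # prints "Jane Doe 2"
```

The point of the sketch: neither table alone names a person and a location history together, but one dictionary lookup joins them.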

On Oct. 22, the Virginia Supreme Court ruled that the regulations governing government agencies’ collection, storage and use of “personal information” don’t apply to the collection of ALPR data, because that specific database does not contain “the name, personal number, or other identifying particulars of a data subject” and is therefore not an “information system” as defined in Virginia law. The court relied on the lower court’s finding that “the ALPR record-keeping process does not itself gather or directly connect to ‘identifying particulars’ of a vehicle owner. …” So, even though the police collect the license plate number, the state of issue, the make and model of the car and the physical location of the car (say, in your driveway), this data set is not “personally identifying.” The same analysis might apply to things such as a MAC address, an IP address, a phone number, a physical address or even a Social Security number. None of these specific data points “identify” a specific individual without reference to some other database. Facial recognition software that merely captures, digitizes and analyzes faces does not “identify” the person until it runs against a database with the person’s name, right? Fingerprint cards alone are not “personally identifiable” until they are run through a database to match them to a specific person.

The Virginia Supreme Court goes further. Even when the identifying database is pinged—the DMV or the NCIC database—to identify the specific individual, this does not render the ALPR database a system of record-keeping because—and follow me here—the “record” (or the “hit”) is just seen by the police. It’s not “kept” (e.g., not preserved, stored, maintained). So even once the record has become a personal record (we know what you did last summer), it’s not “kept,” so no problemo! While the DMV or NCIC databases may be record-keeping systems subject to the privacy and data purging requirements, the ALPR database is not. Oh, and that “hot list” of wanted cars? It’s also not a record of personal information, since it contains only the license plate numbers of cars, not the names of the owners.

The case raises a host of privacy-related questions for both government entities and private businesses concerning what exactly is “personally identifiable information.” While the statute at issue before the Virginia court was not a “privacy” statute per se, its application turned on whether the data at issue was personally identifiable. Other laws—including data breach statutes, GDPR and CCPA—likewise turn on whether data identifies a specific individual.

One analogy would be a tracking cookie: If you go to a website that puts a tracking cookie on your browser, the entity that sent the cookie (let’s call them the cookie monster) can know your IP address, when you visited their site, your approximate location, browser configs and other data when you visited. They can then track your activities through other websites, where you visit and what you do. But—and applying the rationale used by the Virginia Supreme Court here—they don’t know who you are (at least in this hypothetical). They may know what you like to buy, what you like to eat, what you shop for, what you read, where you are, where you live, your address, your telephone number and your credit card number, but they don’t know (without reference to some other database) who you are. So, applying the Virginia rationale, no harm, no foul. It’s not “personal” information. Right?
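In code, the cookie monster’s position looks something like this—a toy sketch in which the cookie ID plays the role of the license plate and the “other database” is any table that maps the ID to a name (all IDs, sites and names here are invented):

```python
# Toy sketch of pseudonymous tracking: the tracker stores no name,
# only a cookie ID, yet a single join re-identifies the whole history.
visits = [
    {"cookie_id": "c-41f7", "site": "shoes.example", "ip": "203.0.113.5"},
    {"cookie_id": "c-41f7", "site": "news.example",  "ip": "203.0.113.5"},
    {"cookie_id": "c-41f7", "site": "maps.example",  "ip": "203.0.113.5"},
]

# The "other database": anywhere the same cookie ID was seen alongside
# a login or purchase (invented mapping, for illustration only).
identity_db = {"c-41f7": "Jane Doe"}

def profile(cookie_id):
    """Everything known without a name -- still a rich behavioral profile."""
    return [v["site"] for v in visits if v["cookie_id"] == cookie_id]

def reidentify(cookie_id):
    """One lookup turns the 'non-personal' profile into a personal record."""
    return identity_db.get(cookie_id), profile(cookie_id)
```

Under the Virginia court’s rationale, everything above the `identity_db` line would be “not personal”; everything changes with one dictionary lookup.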

Not so fast, kemosabe.

It’s highly unlikely that a court applying privacy laws such as GDPR or CCPA would take such a narrow interpretation of “personal information.” For example, under the CCPA, “personal information” means information that identifies, relates to, describes, is reasonably capable of being associated with or could reasonably be linked directly or indirectly with a particular consumer or household.

By way of example, the CCPA notes that personal information includes, but is not limited to, the following if it identifies, relates to, describes, is reasonably capable of being associated with or could be reasonably linked, directly or indirectly, with a particular consumer or household:

  • Identifiers such as a real name, alias, postal address, unique personal identifier, online identifier, internet protocol address, email address, account name, Social Security number, driver’s license number, passport number or other similar identifiers.
  • Any categories of personal information described in subdivision (e) of Section 1798.80.
  • Characteristics of protected classifications under California or federal law.
  • Commercial information, including records of personal property, products or services purchased, obtained or considered, or other purchasing or consuming histories or tendencies.
  • Biometric information.
  • Internet or other electronic network activity information, including, but not limited to, browsing history, search history and information regarding a consumer’s interaction with an internet website, application or advertisement.
  • Geolocation data.
  • Audio, electronic, visual, thermal, olfactory or similar information.
  • Professional or employment-related information.
  • Education information, defined as information that is not publicly available personally identifiable information as defined in the Family Educational Rights and Privacy Act (20 U.S.C. Sec. 1232g; 34 C.F.R. Part 99).
  • Inferences drawn from any of the information identified in this subdivision to create a profile about a consumer reflecting the consumer’s preferences, characteristics, psychological trends, predispositions, behavior, attitudes, intelligence, abilities and aptitudes.

While “personal information” does not include consumer information that is de-identified or aggregate consumer information, that information must be effectively de-identified—that is, de-identified in a way that cannot readily be linked back to a specific individual. Keeping the de-identified information next to a separate but easily accessible database that will turn the de-identified data back into identifiable information won’t cut it.
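Why the separate-but-accessible lookup table defeats de-identification can be shown in a few lines (the records and mapping are invented for illustration):

```python
# "De-identified" records: direct identifiers replaced with a pseudonym.
deidentified = [
    {"pid": "p-001", "diagnosis": "flu",      "zip": "22030"},
    {"pid": "p-002", "diagnosis": "migraine", "zip": "22031"},
]

# Kept "separately" but readily accessible -- which defeats the purpose.
linkage_table = {"p-001": "Jane Doe", "p-002": "John Roe"}

# One dictionary lookup per record restores fully identified data,
# so the dataset was never effectively de-identified.
relinked = [{**r, "name": linkage_table[r["pid"]]} for r in deidentified]
```

Effective de-identification means the linkage table is destroyed or genuinely inaccessible, not merely stored in a different file.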

In fact, if the ALPR data is not linked to personally identifiable information, it is of very limited use. Sure, you can tell if a license plate matches one reported stolen, but that’s hardly why Fairfax County keeps the ALPR records for a year. They use it as an investigative tool. They want to know where a human being is or was, not just a hunk of metal. They don’t want to just recover the vehicle; they want to find the person—and to do that, they have to convert the numerical code that is the license plate number into a name, address and photograph. That’s kind of the point. But, according to the Virginia Supreme Court, as long as the cop has to enter a few extra keystrokes to do that, everything is hunky-dory.

My advice? If you’re ever in Northern Virginia and want to protect your privacy, wear a mask (and a hat and glasses) and take a bus. Or go for a stroll. Or maybe just stay inside with a cover over your parked car.

Mark Rasch

Mark Rasch is a lawyer and computer security and privacy expert in Bethesda, Maryland, where he helps develop strategy and messaging for the Information Security team. Rasch’s career spans more than 35 years of corporate and government cybersecurity, computer privacy, regulatory compliance, computer forensics and incident response. He is trained as a lawyer and was the Chief Security Evangelist for Verizon Enterprise Solutions (VES). He is a recognized author of numerous security- and privacy-related articles. Prior to joining Verizon, he taught courses in cybersecurity, law, policy and technology at various colleges and universities, including the University of Maryland, George Mason University, Georgetown University and the American University School of Law, and was active with the American Bar Association’s Privacy and Cybersecurity Committees and the Computers, Freedom and Privacy Conference. Rasch has worked as cyberlaw editor for SecurityCurrent.com, as Chief Privacy Officer for SAIC, and as Director or Managing Director at various information security consulting companies, including CSC, FTI Consulting, Solutionary, Predictive Systems and Global Integrity Corp. Earlier in his career, Rasch was with the U.S. Department of Justice, where he led the department’s efforts to investigate and prosecute cyber and high-technology crime, starting the computer crime unit within the Criminal Division’s Fraud Section—efforts which eventually led to the creation of the Computer Crime and Intellectual Property Section of the Criminal Division. He was responsible for various high-profile computer crime prosecutions, including Kevin Mitnick, Kevin Poulsen and Robert Tappan Morris. He has been a frequent commentator in the media on issues related to information security, appearing on BBC, CBC, Fox News, CNN, NBC News, ABC News, the New York Times, the Wall Street Journal and many other outlets.