Waymo Sues to Keep Auto Algorithm Trade Secret

Cyberdyne Systems has sued to keep its Skynet algorithm from being publicly disclosed, claiming it is a trade secret. Wait, what?

No. Cyberdyne Systems did not sue to keep the algorithm for its Skynet system from public disclosure. (And for those of you who don’t know, this is a reference to the Terminator movie franchise.) It would be too late for that, anyway. But Waymo, Alphabet’s self-driving car subsidiary, last week filed a lawsuit in state court to keep the algorithm for its self-driving cars secret and to prevent the DMV from releasing the data.

Increasingly, our lives are governed by algorithms. Where police are stationed, how they interact with civilians, how they are armed and what they do are dictated by algorithms. Algorithms are used to determine whether you get hired, whether you qualify for a loan to buy a house, your mortgage interest rate, whether you qualify for a car loan and how much you pay for car insurance. Algorithms dictate what ads you see, what prices you pay and what your entire online experience looks like. Algorithms determine whether you will be targeted by white nationalists, QAnon or Black Lives Matter. Algorithms also determine your propensity for violence, whether you’ll get bail if you are arrested, and what sentence will be imposed if you are convicted. To a great extent (and that extent keeps getting greater) our lives are determined by algorithms.

If you are discriminated against because of the way an algorithm treats you, you might be inclined to seek legal redress. Good luck with that. A few years ago, a Wisconsin criminal defendant named Eric Loomis was sentenced based in part on his score on an algorithm-driven tool called COMPAS that purported to measure his likelihood of reoffending. Loomis challenged the sentence because neither he nor the sentencing judge had any idea how the COMPAS program operated. Sure, they knew what kinds of factors the algorithm considered: age, education, criminal history, nature of the offense and so on. But how the program combined those factors, and how it weighted them, was a black box. Intentionally. Because the COMPAS algorithm was a proprietary trade secret of the company that developed and licensed it.
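To see why knowing the factors is not the same as knowing the algorithm, consider a deliberately simplified sketch. This is hypothetical code, not COMPAS (whose real factors, formula and weights are secret); it just shows the general shape of a weighted risk score, where the weights, not the inputs, are the black box.

```python
# Hypothetical illustration only -- not COMPAS. The point: a defendant can
# know every input factor and still have no idea how the score is computed,
# because the weights are proprietary.

def risk_score(factors: dict[str, float], weights: dict[str, float]) -> float:
    """Combine known input factors using weights hidden from the defendant."""
    return sum(weights[name] * value for name, value in factors.items())

# The defendant may know the inputs (values here are made up)...
factors = {"age": 0.3, "prior_arrests": 0.9, "education": 0.2}

# ...but these weights are the vendor's trade secret. The same inputs can
# yield wildly different scores depending on numbers no one outside sees.
hidden_weights = {"age": 1.0, "prior_arrests": 5.0, "education": -2.0}

print(round(risk_score(factors, hidden_weights), 2))  # 4.4
```

With these made-up weights, prior arrests dominate the score, which is exactly the scenario described below: frequent arrests, even without convictions, could drive the number up, and nobody reviewing the score would know that.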

So, if a person is repeatedly being harassed by the police because of their race and frequently arrested (even if the charges are dropped), the COMPAS program may look at the frequent arrests and determine there’s a high likelihood of recidivism and that the person poses a danger, whereas a human equipped with the facts might take that information as a reflection of the neighborhood and the society in which the person lived.

Moreover, the algorithm may end up becoming a self-fulfilling prophecy. If Loomis is sentenced to prison because the algorithm thinks he is dangerous, when he is released he will have a harder time finding a job—and a greater likelihood of resorting to crime. Algorithm vindicated! That’s the thing about algorithms—they may be great; they may not be. What they cannot be is unchallengeable. And making these algorithms that control our lives unchallengeable (and undiscoverable) trade secrets will have real consequences.

Similarly, Google’s (I mean, Alphabet’s) self-driving car division, Waymo, is attempting to prevent the California DMV from releasing information about how its cars work and how they have been involved in crashes, alleging that the way the self-driving car algorithm works is a proprietary trade secret of Alphabet/Waymo.

In Waymo LLC v. California Department of Motor Vehicles, Dkt. No. 34-2022-80003805-CU-WM-GDS, filed in California Superior Court in Sacramento County on January 21, 2022, Waymo seeks a writ of mandate whereby the court would order the DMV not to disclose what Waymo calls trade secrets: details of the design of the self-driving car, how it is deployed, how the algorithms work, how it avoids obstacles and the like. This is significant because of what can be called the AI trolley problem: when faced with a choice between killing or injuring the driver (that is, the Waymo customer) and a pedestrian (presumably not a Waymo customer), the algorithm must be programmed to make that ethical choice. The trolley problem. By embedding that choice in the algorithm, the programmers have made a choice. But neither the driver nor the pedestrian knows what that choice is. And the algorithm makes millions of these choices, some by design, some by experience. Without access to information about how the algorithm works, we won’t know the difference between Optimus Prime and the Terminator movie’s hunter-killer tank.
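How does an ethical choice get "embedded" in a planner? Here is a hypothetical sketch, emphatically not Waymo's code: a minimal planner that picks the maneuver with the lowest weighted risk. The two weight constants are the ethical choice; they are invisible in the car's behavior unless someone can inspect the code.

```python
# Hypothetical sketch, not Waymo's algorithm. Shows how a trolley-problem
# trade-off can be baked into a planner as numeric cost weights.
from dataclasses import dataclass


@dataclass
class Maneuver:
    name: str
    occupant_risk: float    # assumed estimate of harm to the car's occupant
    pedestrian_risk: float  # assumed estimate of harm to a pedestrian


# These two constants ARE the ethical choice. Whoever sets them has decided
# whose safety counts for how much -- and nothing visible reveals the values.
OCCUPANT_WEIGHT = 1.0
PEDESTRIAN_WEIGHT = 1.0


def cost(m: Maneuver) -> float:
    """Weighted total risk of a maneuver under the hidden weights."""
    return OCCUPANT_WEIGHT * m.occupant_risk + PEDESTRIAN_WEIGHT * m.pedestrian_risk


def choose(maneuvers: list[Maneuver]) -> Maneuver:
    """Pick the maneuver with the lowest weighted risk."""
    return min(maneuvers, key=cost)


options = [
    Maneuver("swerve_into_barrier", occupant_risk=0.6, pedestrian_risk=0.0),
    Maneuver("brake_straight", occupant_risk=0.1, pedestrian_risk=0.4),
]
print(choose(options).name)  # brake_straight
```

With equal weights the car brakes straight (total risk 0.5 versus 0.6). But change PEDESTRIAN_WEIGHT to 2.0 and the same code swerves into the barrier instead. The behavior flips on one hidden constant, which is exactly why access to how the algorithm works matters.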

We need to strike a balance between providing protection for the work and intellectual property of developers and knowing what the algorithms are doing and why. Remember, Skynet became self-aware on August 29, 1997. If we wait for these cars to be ubiquitous and on the road before we determine how they work and whether they are safe, we may all have to face a judgment day.


Mark Rasch

Mark Rasch is a lawyer and computer security and privacy expert in Bethesda, Maryland, where he helps develop strategy and messaging for information security teams. Rasch’s career spans more than 35 years of corporate and government cybersecurity, computer privacy, regulatory compliance, computer forensics and incident response. He is trained as a lawyer and was the Chief Security Evangelist for Verizon Enterprise Solutions (VES). He is a recognized author of numerous security- and privacy-related articles. Prior to joining Verizon, he taught courses in cybersecurity, law, policy and technology at various colleges and universities, including the University of Maryland, George Mason University, Georgetown University and the American University School of Law, and was active with the American Bar Association’s Privacy and Cybersecurity Committees and the Computers, Freedom and Privacy Conference. Rasch has worked as cyberlaw editor for SecurityCurrent.com, as Chief Privacy Officer for SAIC, and as Director or Managing Director at various information security consulting companies, including CSC, FTI Consulting, Solutionary, Predictive Systems and Global Integrity Corp. Earlier in his career, Rasch was with the U.S. Department of Justice, where he led the department’s efforts to investigate and prosecute cyber and high-technology crime, starting the computer crime unit within the Criminal Division’s Fraud Section, efforts which eventually led to the creation of the Computer Crime and Intellectual Property Section of the Criminal Division. He was responsible for various high-profile computer crime prosecutions, including those of Kevin Mitnick, Kevin Poulsen and Robert Tappan Morris. He has been a frequent commentator in the media on issues related to information security, appearing on or in the BBC, CBC, Fox News, CNN, NBC News, ABC News, the New York Times, the Wall Street Journal and many other outlets.