Face Off: Privacy Issues Not Confined to FaceApp

The internet and the security community are up in arms and shocked, shocked to see that a web developer is collecting data that you share with it and processing that data in the cloud. In this case, the app is FaceApp, an application developed by an entity in Russia that takes photos you upload and uses an algorithm in the cloud to “age” these pictures to show what you would look like if you were—well, if you were as old as me.

As with everything else, it’s all fun and games until someone gets hurt. In the case of FaceApp, the problem seems to be that the app’s privacy policy does not tell people that their photos (either the ones that they share or the ones that might be accessed by the app) are taken off the phone, processed in the cloud and potentially maintained by the developer. The developer reportedly indicated that the photos are processed in-country (not in Russia), that only those photos shared are processed and that almost all of the photographs (hmmm, almost all?) are deleted from the cloud server after processing.

In response to the criticism, FaceApp now contains the warning, “Cloud Photo Processing: Each photo you select for editing will be uploaded to our servers for imaging processing and face transformation.” So now you click first, and the exact same thing happens.

Cloud Smoud

Boiling down the controversy, it seems the issue was that users did not realize their pictures were being processed in the cloud as opposed to on some other server or on their own devices. Right. Like users of Siri, OK Google and Alexa don’t realize that their voice is processed in the cloud. The cloud can be good, not good or downright evil, from both a privacy and security standpoint. But it’s just another computer (or computer network). From a privacy standpoint, the issue is use—who “owns” the photo and what rights does the app developer have to use the photo? In this regard, FaceApp’s privacy policy, while by no means a model of clarity, seems to suggest that the company doesn’t sell, share or otherwise use the picture except for internal use. Whatever that is. Considering that Facebook stores, scans, indexes and uses facial recognition not only on pictures of me that I upload but also on pictures of me that others upload, FaceApp’s privacy policy—if it says what I think—ain’t terrible.

Security is another issue. Anytime you access a file, transfer it, process it, store it and re-transfer it, there are opportunities for abuse. So the fact that the processing is “in the cloud” could be an issue. Or not. We will see if people are dissuaded from using FaceApp and aging themselves ‘cause they are aging in the cloud.

Consent

The real problem here is that, at least in the United States, your privacy and security are not dictated by some standard of reasonableness, but primarily (except for certain defined data) by contract. In Europe and other places, there are standards for data collection, storage, processing and use—data must be collected for a “lawful purpose” and used that way.

In the U.S., it’s primarily a question of what the consumer agreed to. And I use the term “agreed to” loosely. If a Terms of Use, Terms of Service, Software License Agreement or Privacy Policy puts you on notice that, by using the site, you have agreed that your data can be—I don’t know—sold to Russian hackers, well, then expect it to be sold to Russian hackers. It’s not my fault that you didn’t read the fine print.

There are many problems with the contract model, not the least of which is the fact that most people can’t read, can’t understand and can’t negotiate these agreements. They just want to use the site, the app or the data. As Ellen Barkin’s Beth Schreiber said in Diner, “Shrevie, who cares about what’s on the flip side about the record?” They just want to listen to the music.

Years ago spyware developers got wise to this and started putting Software License Agreements in their spyware, getting unwitting downloaders to agree to the installation of malicious code and to the collection of data. The contract model would permit such a bargain.

In the end, whether your face is processed on the cloud, in the air, on a server or on a device, the processing should be secure, and the use of your pictures and data should be reasonable, irrespective of what the fine print says. Some rights are too precious to be signed away with a click.

Security Boulevard
Mark Rasch

Mark Rasch is a lawyer and computer security and privacy expert in Bethesda, Maryland, where he helps develop strategy and messaging for information security teams. Rasch’s career spans more than 35 years of corporate and government cybersecurity, computer privacy, regulatory compliance, computer forensics and incident response. He is trained as a lawyer and was the Chief Security Evangelist for Verizon Enterprise Solutions (VES). He is a recognized author of numerous security- and privacy-related articles. Prior to joining Verizon, he taught courses in cybersecurity, law, policy and technology at various colleges and universities, including the University of Maryland, George Mason University, Georgetown University and the American University School of Law, and was active with the American Bar Association’s Privacy and Cybersecurity Committees and the Computers, Freedom and Privacy Conference. Rasch has worked as cyberlaw editor for SecurityCurrent.com, as Chief Privacy Officer for SAIC, and as Director or Managing Director at various information security consulting companies, including CSC, FTI Consulting, Solutionary, Predictive Systems and Global Integrity Corp. Earlier in his career, Rasch was with the U.S. Department of Justice, where he led the department’s efforts to investigate and prosecute cyber and high-technology crime, starting the computer crime unit within the Criminal Division’s Fraud Section—efforts that eventually led to the creation of the Computer Crime and Intellectual Property Section of the Criminal Division. He was responsible for various high-profile computer crime prosecutions, including those of Kevin Mitnick, Kevin Poulsen and Robert Tappan Morris. Prior to joining Verizon, Mark was a frequent commentator in the media on issues related to information security, appearing on BBC, CBC, Fox News, CNN, NBC News and ABC News and in the New York Times, the Wall Street Journal and many other outlets.
