The High Cost of Privacy By Default

In the ongoing “war” between Facebook and Apple over privacy, Apple’s new operating system, iOS 14.5, contains a feature that most people incorrectly assumed was already part of the operating system: the ability to choose which apps collect and share personal information about them.

For a long time, Apple OS users have been able to tweak their privacy settings to decide, for example, whether a particular app can track their location or collect certain information, but they have not had enough granularity to shut off an app’s data collection entirely. The new version of Apple’s OS has a feature called App Tracking Transparency (ATT), which differs from other “privacy” features in that it forces applications to specifically ask users to consent to the collection of data across apps and platforms for iPhone and iPad apps and services. It’s not quite “privacy by default,” but it is similar.
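From a developer’s perspective, the mechanism is a single system call. The sketch below, using Apple’s real AppTrackingTransparency framework, shows roughly how an iOS 14.5+ app must ask before it can track a user across other companies’ apps; the function name `requestTrackingConsent` and the log messages are illustrative, not part of Apple’s API.

```swift
import AppTrackingTransparency
import AdSupport

// Illustrative sketch: the app asks, the OS shows the consent prompt,
// and the app only learns the user's answer. It cannot skip the prompt.
func requestTrackingConsent() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // User opted in: the cross-app advertising identifier (IDFA)
            // becomes available to the app.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Tracking allowed, IDFA: \(idfa)")
        case .denied, .restricted, .notDetermined:
            // The default outcome: no identifier, no cross-app tracking.
            print("Tracking not allowed")
        @unknown default:
            print("Tracking not allowed")
        }
    }
}
```

Note that the app must also declare an `NSUserTrackingUsageDescription` string in its Info.plist; apart from that one developer-supplied sentence, the wording of the consent prompt is fixed by Apple, which is what makes the “ask first” behavior uniform across apps.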

For an app like Facebook to collect data from non-affiliated apps (other than, say, Instagram), the new version of the Apple OS will ask the user to consent to the data sharing. So an iPhone or iPad user will be asked, “Facebook would like to obtain data from [insert app here; Amazon, eBay, etc.]. Would you like to share this data?”

Now, Facebook has been collecting this type of data for years and has always proclaimed that the collection is in its customers’ interests: that it delivers “targeted” ads to them and, most importantly, that this is precisely what its customers want.

But when the default option is that the data is not collected, and there is only an infinitesimally small burden to opting in to such data sharing, we are learning that people simply don’t opt in. This is despite the fact that Facebook is actively marketing the advantages of opting in and data sharing—namely, that opting in provides “targeted” ads to users, and a “more personal” and “more meaningful” experience. Facebook tells its customers that “sharing your activity from other apps and websites allows Facebook to: show you ads that are more personalized, help keep Facebook free of charge [and] support businesses that rely on ads to reach their customers.”

Apparently, users don’t care. Or, at least, they don’t care that much. Or they are simply lazy. When protecting privacy is the default, and people have to do something—even something trivial—to permit data collection or use, they don’t do it. Fewer than 4% of new iOS users are choosing to opt in to cross-app data collection by Facebook.

In addition, the privacy-enhanced (sort of) communications medium Signal one-upped Facebook, reportedly testing an app that would display to users exactly why Facebook was delivering ads to them, noting things like “You got this ad because you are [a newlywed Pilates instructor] and you’re [cartoon crazy]. The ad used your location to know you are in [La Jolla]. You’re into [parenting blogs] and thinking about [LGBTQ adoption].”

This set off a brouhaha between Signal and Facebook: Signal claimed that Facebook blocked the app, while Facebook both denied that it had blocked the app (except to the extent that it revealed things like sexual identity, which is prohibited under Facebook policies) and alleged that the Signal app was just a publicity stunt.

Right now, it’s hard to know what Facebook “knows” about you, or what it thinks it knows about you. Some years ago, personal data mining companies started to permit users to run a “privacy report” on themselves, much the same way you can run a credit report on yourself. Such a report might state that you are single, live in a high-income inner-city location, have a post-graduate education and an upper-middle-class income, enjoy golf, tennis and sailing, and have a collection of vintage comic books. These services also permitted individuals to “correct” their privacy report, or add to it, noting, for example, that you’re not into sailing but rowing, and that you also enjoy fine wines and are a foodie. The “correcting” of the data report was actually another way of collecting more personal information from and about you, making the report even more valuable to the data broker. So, if people knew what Facebook thought it knew about them, I’m not sure that would be a bad thing for Facebook. While some people might be outraged by the scope and extent of data collection, others might just shrug their shoulders and say, “Yup, that’s me…”

The idea behind privacy is to give people a choice about whether to make disclosures and permit uses. Another problem is that people rarely see the dark side of data collection. While being profiled as an educated, affluent urban professional seems pretty cool, it also means that you are more likely to be willing to pay a bit extra for, say, a tube of toothpaste or a high-end colander. This means that when you log into a website or search engine looking for either product, the prices you see are tailored not only to your ZIP code, but to you personally.

While this may make sense for the economy as a whole, it seems unfair and undemocratic. And it can be discriminatory based on things like race, gender, sexual orientation or proxies for them. “Tailored” ads mean that companies send ads to those most likely to buy their products, based on their profiles and interests, or that the ads for the product have been manipulated to appeal to the targeted individual. We have all experienced the spooky feeling of talking about something unusual, like, maybe, pogo sticks or bass fishing, and suddenly seeing ads related to that topic show up on Facebook or Amazon, as if the computer or your device is listening to you. Isn’t that right, Alexa? Siri? Google?

So, if the new Apple OS change is more protective of privacy and requires a bit of an opt-in, isn’t that a good thing? Maybe. Maybe not.

Privacy and control are good. In fact, in many ways the iOS user controls do not go far enough, relying on a simple opt-in rather than an actual knowing and intelligent waiver or, better yet, a determination that the collection is not only consensual but objectively reasonable: that the use is reasonable, that the data collected is the “minimum necessary” to fulfill a legitimate purpose, that the data is kept for the minimum amount of time necessary to fulfill a legitimate function, and that the data is secured. None of this is required by iOS. Just the ability to tap the words “I agree.”

But there’s a dark side to privacy. We have established an entire economy and infrastructure which is based on invading the privacy of users. It’s why you haven’t recently written a check to Google. Or to Facebook. Or to Instagram. Or to thousands of other apps that you use every day. That’s because, as we all know, if you find a useful service, and you aren’t charged for it, then you are the product. Keep that in mind.

Instead of asking users whether they want to keep their data private or share it with Facebook, ultimately the question may be, “Do you want to keep your data private, or would you rather pay $11.99 a month for Facebook?” If that were the question, you might find more people opting in. Or maybe not. Time may tell.

Mark Rasch

Mark Rasch is a lawyer and computer security and privacy expert in Bethesda, Maryland, where he helps develop strategy and messaging for the Information Security team. Rasch’s career spans more than 35 years of corporate and government cybersecurity, computer privacy, regulatory compliance, computer forensics and incident response. He is trained as a lawyer and was the Chief Security Evangelist for Verizon Enterprise Solutions (VES). He is a recognized author of numerous security- and privacy-related articles. Prior to joining Verizon, he taught courses in cybersecurity, law, policy and technology at various colleges and universities, including the University of Maryland, George Mason University, Georgetown University and the American University School of Law, and was active with the American Bar Association’s Privacy and Cybersecurity Committees and the Computers, Freedom and Privacy Conference. Rasch has worked as cyberlaw editor for SecurityCurrent.com, as Chief Privacy Officer for SAIC, and as Director or Managing Director at various information security consulting companies, including CSC, FTI Consulting, Solutionary, Predictive Systems and Global Integrity Corp. Earlier in his career, Rasch was with the U.S. Department of Justice, where he led the department’s efforts to investigate and prosecute cyber and high-technology crime, starting the computer crime unit within the Criminal Division’s Fraud Section, efforts which eventually led to the creation of the Computer Crime and Intellectual Property Section of the Criminal Division. He was responsible for various high-profile computer crime prosecutions, including those of Kevin Mitnick, Kevin Poulsen and Robert Tappan Morris. Rasch has been a frequent commentator in the media on issues related to information security, appearing on BBC, CBC, Fox News, CNN, NBC News, ABC News, the New York Times, the Wall Street Journal and many other outlets.
