Germany Orders Facebook to Stop Collecting Non-Facebook Data (and Why it Doesn’t Matter)

On Feb. 7, the German Bundeskartellamt (the Federal Cartel Office)—the equivalent of the U.S. Department of Justice’s Antitrust Division—ordered Facebook to change the way it does business, at least in Germany. The order noted that Facebook was a dominant force in both social media and data collection, and therefore imposed special obligations on Facebook’s collection and use of data.

At issue was Facebook’s ability to collect, store, process and effectively “integrate” non-Facebook data and quasi-Facebook data into the algorithms it sells or uses to let advertisers know stuff about users. Essentially, there are three buckets of data. Facebook data—that is, the stuff we do when we are on the Facebook site, including postings, searches, likes, replies, etc.; quasi-Facebook data—that is, data related to our use of sites owned and operated by Facebook but not the “Facebook” site itself, such as WhatsApp and Instagram; and internet information—that is, the information shared with Facebook when you are using the web. That’s the weird, creepy stuff—like you search Amazon for tennis rackets and your Facebook feed is now full of stringed rackets.

The German Cartel Office effectively told Facebook that, if it wants to use either internet information or WhatsApp or Instagram data as part of its Facebook analytics, pursuant to the GDPR, Facebook must have effective consent from users.

What Is Consent?

There’s the rub. Facebook already gets consent. Kinda, sorta. Users already have the option in their settings and features to decide whether to share data such as this. It’s just a few simple clicks away. Um … first, go to Settings—or is it Privacy, then Settings? Wait … it’s Profile, Settings, Privacy, How People Find and Contact You, and then follow the prompt, “Do you want search engines outside of Facebook to link to your profile?” No, wait. That’s not it.

Suffice it to say, it’s far from trivial to figure out whether your non-Facebook conduct is being shared with Facebook, what that information is and how to turn it on or off, both globally (Never share this data with Facebook) and individually (Hey, Facebook, I don’t want you to know about this specific search).

GDPR is all about having a lawful basis for collecting data. Consent is one potential lawful basis for collecting data. When I use Facebook, and I type in a comment or post, I have consented to Facebook posting that data. I set up whether I want that post to be viewable by everyone, just by friends, or under a few other settings. But Facebook’s business model is based on collecting as much data as it can about me and sharing it (directly or indirectly) with advertisers and marketers, who will spend big bucks to know whether I am an urban millennial or an aging boomer. The key to lawful collection is consent.

But Facebook does get consent to collect and use this information. So what’s the problem?

Not so fast, mein Freund.

The German Cartel Office explained:

“Facebook will no longer be allowed to force its users to agree to the practically unrestricted collection and assigning of non-Facebook data to their Facebook user accounts. The combination of data sources substantially contributed to the fact that Facebook was able to build a unique database for each individual user and thus to gain market power. In future, consumers can prevent Facebook from unrestrictedly collecting and using their data. The previous practice of combining all data in a Facebook user account, practically without any restriction, will now be subject to the voluntary consent given by the users. Voluntary consent means that the use of Facebook’s services must not be subject to the users’ consent to their data being collected and combined in this way. If users do not consent, Facebook may not exclude them from its services and must refrain from collecting and merging data from different sources.”

So now Facebook will have to get specific and voluntary consent from its users to collect, combine and use data from non-Facebook sources. And, if you don’t agree, Facebook cannot “punish” you by preventing you from using its services (although the order does not make clear whether Facebook could explain that the utility of those services might be impaired by Facebook’s inability to provide “targeted” services).

No problem. Here’s a typical internet “I agree” box. Or, “By continuing to use WhatsApp, Instagram, Facebook and the internet, I consent to ….”

Here’s where it gets weird. The Cartel Office says that this won’t work, either. The head of the office noted:

“As a dominant company Facebook is subject to special obligations under competition law. In the operation of its business model the company must take into account that Facebook users practically cannot switch to other social networks. In view of Facebook’s superior market power, an obligatory tick on the box to agree to the company’s terms of use is not an adequate basis for such intensive data processing. The only choice the user has is either to accept the comprehensive combination of data or to refrain from using the social network. In such a difficult situation the user’s choice cannot be referred to as voluntary consent.”

The Tick

So the German regulators say that an obligatory tick is not an adequate basis for intensive data processing. Sort of like if you are going to consent to donating an organ (especially if you are still alive), you might want more than a tick box on the driver’s license.

But here’s the problem. If an “obligatory tick” is insufficient to provide effective notice and meaningful consent (and let’s face it, people don’t read privacy policies or terms of use and they certainly don’t understand the implications of that consent), then how exactly is Facebook going to get consent to collect, merge and use the data? Engraved invitations?

The law treats these clickwrap agreements as binding contracts and generally says that a person’s failure to read them does not excuse them from the obligations. And the fact that they are non-negotiable also doesn’t mean that they are not enforceable. These “I agree” boxes are the main means online companies have to put limits on what people can and cannot do on their websites, establish expectations of privacy and, yes, obtain consent to collect and use information. So what’s the alternative for Facebook to the “obligatory tick”?

TL;DR

Facebook could spell out its policy and get express consent to each of its terms. It could say, “Hey, Instagram users: Would you like to share your Instagram data with the Facebook mothership?” And for Facebook users, it could have a pop-up (or maybe a video of Zuckerberg) explaining what the company is doing, with a drop-down menu asking “Would you like to know more?” But if Facebook provides meaningful detail about what it has collected, customers are likely to find it too long to read. So is it even possible to get meaningful “consent” to the collection and use of data online?

I don’t really think so. The problem is that people share data with others for specific reasons without even considering the long-term consequences of such sharing. Yeah, they understand intellectually that someone is marketing toothpaste to them because they clicked on a picture of someone with a wicked bright smile on some internet page, and for the most part they just go on with their merry lives unaware of how they are being poked, prodded, profiled, sliced and diced. It’s the new normal. The GDPR recognizes that consent is just one model for data collection, and not necessarily the best one. The collection of the data must also be, in some communal sense, objectively reasonable. As in, not creepy. And limited—in amount, in duration, in scope and in detail. In fact, by focusing on express consent, the German agency moves away from asking whether the data collection is “reasonable.”

Mother and Child Reunion

The other problem with the German Cartel decision is the artificial demarcation between Facebook, Instagram and WhatsApp for data-collection and -sharing purposes. Clearly these were once separate companies, and people who chose to sign up for Instagram should not have their data automatically shared with Facebook because of a corporate ledger device. Whenever there is a merger, acquisition, dissolution, divestiture or other change that materially impacts the ownership, sharing or use of data, the data subjects’ rights need to be addressed. If I decide to share the fact that I like Reese’s Peanut Butter Cups with my local CVS (in return for receipts that are measured by the kilometer), that doesn’t mean that I want that preference shared with a health insurer or provider because the Woonsocket, Rhode Island, company decided to merge.

On the other hand, what’s the technical difference between Facebook Messenger and WhatsApp? Both are internet messaging services, albeit with different functionalities and client bases. As Zuck explained in his Hill testimony, Facebook as an entity is not just a website, or a messaging service, or a marketplace, or a comments section. It has no direct competition because it is everything and nothing at the same time. So it’s actually difficult to define operationally where Facebook ends and WhatsApp begins. Does the same apply to all of the Alphabet service offerings, or those of AT&T or Amazon?

At the end of the day, Facebook will find a better way of obtaining and demonstrating consent to share data from the internet, WhatsApp, Instagram and other services and combining it into a massive AI platform. The overwhelming majority of people will consent to this structure. And life will go on as before. Ultimately, the German decision will be one more of form than of substance.

Unless I am wrong. In which case, I never said it, and you can’t prove it.

Mark Rasch


Mark Rasch is a lawyer and computer security and privacy expert in Bethesda, Maryland, where he helps develop strategy and messaging for the Information Security team. Rasch’s career spans more than 35 years of corporate and government cybersecurity, computer privacy, regulatory compliance, computer forensics and incident response. He is trained as a lawyer and was the Chief Security Evangelist for Verizon Enterprise Solutions (VES). He is a recognized author of numerous security- and privacy-related articles. Prior to joining Verizon, he taught courses in cybersecurity, law, policy and technology at various colleges and universities, including the University of Maryland, George Mason University, Georgetown University and the American University School of Law, and was active with the American Bar Association’s Privacy and Cybersecurity Committees and the Computers, Freedom and Privacy Conference. Rasch has worked as cyberlaw editor for SecurityCurrent.com, as Chief Privacy Officer for SAIC, and as Director or Managing Director at various information security consulting companies, including CSC, FTI Consulting, Solutionary, Predictive Systems and Global Integrity Corp. Earlier in his career, Rasch was with the U.S. Department of Justice, where he led the department’s efforts to investigate and prosecute cyber and high-technology crime, starting the computer crime unit within the Criminal Division’s Fraud Section, efforts which eventually led to the creation of the Computer Crime and Intellectual Property Section of the Criminal Division. He was responsible for various high-profile computer crime prosecutions, including Kevin Mitnick, Kevin Poulsen and Robert Tappan Morris. He has also been a frequent commentator in the media on issues related to information security, appearing on BBC, CBC, Fox News, CNN, NBC News, ABC News, the New York Times, the Wall Street Journal and many other outlets.
