Dark Patterns: Stealth Ways Companies Collect Personal Data

Security experts and security writers talk a lot about how sophisticated cybercriminals are at getting the information they want. But what we don't talk about enough is how good legitimate companies are at tricking consumers into revealing their personal information. New research from the Norwegian Consumer Council found that companies including Facebook and Google are using a technique known as dark patterns to push users into revealing personally identifiable information (PII), raising questions about their compliance with GDPR and other privacy laws.

What Are Dark Patterns?

Dark patterns are designed to prod users into making choices that aren't in their best interest but instead benefit the company. In general terms, Ghostery UX designer John Evans explained to me in an email conversation, dark patterns are tricks employed by designers or developers to make a user do something they otherwise would not want to do. For example, a video streaming website may have a play button that instead opens another tab (usually to an advertising partner) or a pop-up ad telling you that you have won some sort of prize. Another tactic that has become increasingly popular since the EU's GDPR took effect is getting users to click through privacy consent notices. We think that we're protecting ourselves, but instead, we're turning over information.

There are three common dark patterns used by organizations, according to a UX blog post:

  • Persuasive formatting. “Through formatting text fonts, buttons, and color blocks, designers can trigger the desirable action from the user, directly or through learned associations,” the blog stated.
  • Click number and tunneling. Tunneling purposely guides the user through a specific process, either by blocking off other options or by requiring so many clicks that users give up in frustration. Privacy settings on social media sites often rely on this tactic, defaulting users to minimal privacy and forcing them to click through many pages to reach the settings they want.
  • Framing and loss aversion. The organization sets up either a positive outcome while ignoring potential negative outcomes or creates a sense of loss aversion, making the users think they are missing out on something if they don’t follow the directions.
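The "click number and tunneling" tactic above can be sketched as a toy model: the choice the company prefers costs one click, while the privacy-protective choice is routed through extra screens. Everything in this snippet (screen names, click counts, the `clicksRequired` helper) is hypothetical and illustrative, not drawn from any real product's consent flow.

```typescript
// Toy model of "click number and tunneling": opting in is a single
// prominent button, while opting out is buried behind extra screens.
// All screen names and counts here are made up for illustration.

interface Screen {
  label: string;
}

// The choice the company wants: one click on the first dialog.
const optInPath: Screen[] = [{ label: "Accept all" }];

// The privacy-protective choice: several screens deep.
const optOutPath: Screen[] = [
  { label: "Manage options" },
  { label: "Turn off each data category" },
  { label: "Confirm despite 'limited experience' warning" },
];

// A simple measure of friction: how many clicks each path costs.
const clicksRequired = (path: Screen[]): number => path.length;

console.log(clicksRequired(optInPath));  // 1
console.log(clicksRequired(optOutPath)); // 3
```

The asymmetry in click counts is the whole pattern: each added screen sheds some fraction of users, so most never reach the opt-out.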

Dark Patterns in Action

“Companies that rely on consumer data to drive revenue have a long history of using dark patterns to push consumers toward selections limiting control over their own information,” said Evans. “One of the main examples of this is how Facebook and Google require users to opt out of certain privacy options on their products rather than opting in. Opting out takes more effort and time than opting in.”

An additional tactic is to discourage users from choosing options that limit data collection, he continued. Companies attempt to dissuade the user from making such a choice with scary-sounding messages that emphasize the user will have a lesser experience unless they share their data. These messages also purposely fail to mention how the data could be shared and its business value to the company.

To see a dark pattern in action, Evans recommended logging into Google, going to https://myaccount.google.com/activitycontrols and trying to opt out of any of the activity controls.

“When you try,” he said, “you’ll see a second pop-up that requires another click to fully opt out, and only after that are you presented with the downsides of this choice.”

Another dark pattern example asks users to sync contacts, text messages and email across platforms to connect with friends more easily. “What users often don’t know is that these contacts give companies more opportunity to sneakily promote themselves to new users,” said Evans. LinkedIn uses this pattern, which also provides the company valuable data on your connections.

GDPR and Privacy Law Implications

Organizations must understand that they are subject to the GDPR if any of their users are EU citizens, said Evans. “GDPR requires that user consent to data collection be ‘freely given, specific, informed, unambiguous’ with a ‘clear affirmative action,’” he added. “Many dark patterns violate this definition of consent and thereby put companies that use them at risk of a GDPR violation.”

And, of course, it isn’t just GDPR that organizations have to worry about. California, Colorado, Canada and others are passing privacy laws to protect consumers' personal data. Google and Facebook are just two companies being called out for their dark pattern practices, and it won’t be long before other, smaller businesses have to answer for this stealthy approach to data collection.

At the same time, employees and consumers should be vigilant about protecting their own privacy. Companies should consider deploying privacy tools that control trackers and cookie collection. Dark patterns should also be included as part of security education, so that everyone is more alert to the choices they make online. That harmless click on a video, or not bothering to go through the steps to set up privacy, could end up revealing much more than you ever intended.
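One common way such privacy tools control trackers is by matching the host of each outgoing request against a blocklist of known tracker domains. The sketch below is a minimal, hypothetical version of that check; the domains, blocklist and `shouldBlock` function are made up for illustration and don't reflect any specific product's list or API.

```typescript
// Minimal sketch of tracker blocking: decide whether to block a request
// by matching its hostname against a blocklist of tracker domains.
// The blocklist entries below are illustrative placeholders.

const TRACKER_BLOCKLIST = ["tracker.example", "ads.example"];

function shouldBlock(requestUrl: string): boolean {
  const host = new URL(requestUrl).hostname;
  // Block the listed domain itself and any of its subdomains.
  return TRACKER_BLOCKLIST.some(
    (domain) => host === domain || host.endsWith("." + domain)
  );
}

console.log(shouldBlock("https://ads.example/pixel.gif"));        // true
console.log(shouldBlock("https://cdn.tracker.example/t.js"));     // true
console.log(shouldBlock("https://news.example/article.html"));    // false
```

Real blockers layer far more on top of this (shared filter lists, heuristics, cookie partitioning), but host matching is the basic mechanism.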

Sue Poremba


Sue Poremba is a freelance writer based in central Pennsylvania. She's been writing about cybersecurity and technology trends since 2008.
