Sale of Grindr Data Illuminates Privacy Blindspots

Was anyone truly shocked to learn that user data from Grindr, a social networking app for gay, bi, trans and queer men, had been collected and sold on ad networks for many years? Perhaps ‘disappointed’ would be a more appropriate description—disappointed not only that the data was sold but that freely collecting and selling it has become something consumers have normalized and even come to expect.

“We all have come to enjoy having mobile apps to provide us with instant information on our smartphones—the current weather, traffic patterns, news, sports results and so on. However, it is important to note that not all apps are comparable when it comes to protecting our privacy,” said Nasser Fattah, North America steering committee chair at Shared Assessments. “In fact, some apps, in order to best operate and provide the best value, do ask permission to access personal information, including your contacts and location.”

It may be easy to dismiss these intrusive scenarios as simply the cost of working and living in an increasingly interconnected world where data flows freely. But privacy lapses or blindspots raise security issues.

“Rightfully so, many people fear that hackers may steal their PII from their service provider, but who needs a data breach when the app provider themselves is selling users’ personal information in the interest of corporate profits?” noted Token CEO John Gunn.

“Privacy is a proactive first step for security, as threat avoidance is much easier and less costly than mitigation and remediation,” said Rajiv Pimplaskar, CEO at Dispersive Holdings, Inc. “Most sensitive information has a long tail, and Grindr user data changing hands across third parties can be subject to misuse and be dangerous.”

In Grindr’s case, the data came from a digital advertising network and had been for sale for at least five years—despite Grindr moving to prevent location data from going to ad networks two years ago, the WSJ reported.

As the WSJ found, the Grindr data was used in a demonstration to U.S. government agencies to show the intelligence risks inherent in commercially available information. While names and phone numbers were not part of the data collected and sold, the information was in some cases specific enough to infer “… things like romantic encounters between specific users based on their device’s proximity to one another, as well as identify clues to people’s identities such as their workplaces and home addresses based on their patterns, habits and routines, people familiar with the data said,” the WSJ reported.

“It is worth remembering that if one is granting access permission to an app, it’s important to know if the app is sharing any information with third parties,” said Fattah. “Thus, for starters, before installing any app, users should make sure they understand what information is being collected and shared with third parties; information that is often available in the privacy policy of the app. They should limit app permissions to only data that makes sense, and make sure to download apps from official app stores.”

The Grindr report is a sad reminder that U.S. privacy expectations and regulations have lagged far behind those of Europe. “In America, we haven’t even touched the level of data privacy and data governance that the Europeans have with their sweeping GDPR mandate of 2016,” said Garret Grajek, CEO of YouAttest. “California has enacted the CCPA and CPRA, and a few other states are making or looking to make similar moves, but the U.S. does not have a national law guiding data privacy. Industry will demand it to help navigate their own data privacy practices and to insure against lawsuits.”

It’s time for U.S. companies and all organizations to up their game. “These challenges are analogous to those in the data communications world today. For decades, we have been complacent, relying on basic encryption to keep our sensitive communications private and secure,” said Pimplaskar. “New threat actors are playing a long game of ‘steal now, decrypt later’ (SNDL), which can be especially deadly with the rise of nation-state threat actors and the imminent arrival of quantum computing.”

He noted that “intelligence, military and forensic techniques focus on managed attribution which obfuscates source and destination relationships and makes traffic patterns hard to identify,” making such communications “very hard to detect in the first place and thereby making them harder to intercept and decrypt, enhancing security as a whole.”

Teri Robinson

From the time she was 10 years old and her father gave her an electric typewriter for Christmas, Teri Robinson knew she wanted to be a writer. What she didn’t know is how the path from graduate school at LSU, where she earned a Masters degree in Journalism, would lead her on a decades-long journey from her native Louisiana to Washington, D.C. and eventually to New York City where she established a thriving practice as a writer, editor, content specialist and consultant, covering cybersecurity, business and technology, finance, regulatory, policy and customer service, among other topics; contributed to a book on the first year of motherhood; penned award-winning screenplays; and filmed a series of short movies. Most recently, as the executive editor of SC Media, Teri helped transform a 30-year-old, well-respected brand into a digital powerhouse that delivers thought leadership, high-impact journalism and the most relevant, actionable information to an audience of cybersecurity professionals, policymakers and practitioners.
