Facebook, Instagram Threaten Kids’ Digital Privacy

Social media has changed how we communicate, but it has also transformed the meaning of digital privacy. Because of social media, internet users today have more ways than ever to present themselves online. But at the same time, their personal information is almost constantly tracked, collected, packaged and sold by social media companies.

Earlier this year, when Facebook announced that a new Instagram for kids under 13 was in development, privacy experts, child welfare groups and parents were rightly incensed. Even adults can find it difficult to trust Facebook as they navigate confusing privacy policies and settings to protect their identities online from hackers and malicious actors. How are kids expected to do any better? And why should we trust Facebook to be well-intentioned?

Social media is a lucrative business. And there may be no more lucrative demographic than teens. Today’s tech-savvy teens are some of the most avid consumers of social media: fully 76% of teens between the ages of 13 and 17 use Instagram, while 75% use Snapchat and 66% use Facebook.

A lot has changed in the intervening months, however. Instagram has assured the world it’s going to do things differently for kids under 13, and has even released new privacy and security features for teens as a sign that it’s willing to take action to protect young people.

But here’s the thing: Instagram’s new features, like making all accounts for teens private by default, simply aren’t enough. That’s because the biggest threat to user privacy and security doesn’t come from other users; it comes from the platform itself. Social media companies make money by exploiting the personal data of their users. Kids will be no exception, and that’s precisely the problem.

It’s no secret that social media networks rely on advertising for revenue. But ads just aren’t what they used to be. Social media companies today employ data scientists and software engineers to develop incredibly effective algorithms to deliver highly targeted ads to users. And these targeted ads work because they use vast amounts of personally identifying information, behavioral data and usage data to market products and services to precisely the right customer base.

But that’s not all. These same companies purposefully design their apps to be as addictive as possible. They don’t just want your data; they want as much of it as they can get, and they will get you hooked on their apps to make sure you keep handing it over.

Small wonder, then, that social media use is associated with a host of poor mental health outcomes, especially in teens. Addiction doesn’t make you feel good. And it’s all the more sinister when addiction is being used to turn a profit at your expense.

For social media users, the situation is complicated by the fact that social media companies are notoriously opaque about the risks that data retention, sharing and marketing pose to users of their platforms. It’s incredibly difficult, if not impossible, for most ordinary users to determine exactly when, where and how their data is being collected by the social media sites they use. Even deleting your account isn’t a surefire way to stop your data from being taken, especially if that data has already been spread across the web to third parties.

People are right to be concerned about their privacy when they use social media. And it’s probably in response to these genuine privacy concerns that Instagram has vowed to limit how advertisers can target teens on the app.

But until Instagram also limits how it collects data and becomes far more transparent with young people about the privacy downsides of using the app, simply limiting the data advertisers can use isn’t enough. Children and teens will still be exposed, from a very young age, to enormous privacy risk.

What’s more, there’s still no good way for Instagram to prevent kids from lying about their age and circumventing the new privacy and safety features entirely. There’s talk about using sophisticated AI to snoop on certain Instagram accounts to determine if kids are lying. But that just means even further compromising the privacy of younger users.

Facebook’s move to roll out an Instagram for kids, despite all the pushback, should be a wake-up call. Social media companies have a business model built around exploiting user data for profit. We need more than just quick fixes and a few new security features; we need comprehensive privacy protections, and we need a legal framework for data privacy in America that respects every American’s fundamental data rights.

Otherwise, we’ll continue to see our data privacy eroded as more and more of our lives are lived online at ever younger ages.

Tom Kelly

Tom Kelly is president and CEO of IDX, a Portland, Oregon-based provider of identity protection and privacy services such as IDX Privacy. He is a Silicon Valley serial entrepreneur and an expert in cybersecurity technologies.
