Privacy still eroding on National Data Privacy Day

On National Data Privacy Day, we find little has changed in what numerous privacy advocates and experts have called “the golden age of surveillance.”

Last year was supposed to be the year that the civilized world would start—however tentatively and incrementally—to resurrect consumer privacy from the dead.

After all, by National Data Privacy Day in January 2019, the European Union’s General Data Protection Regulation (GDPR) had been in effect for eight months, carrying the threat of crippling fines against companies that compromised or misused their customers’ private data.

A steady stream of U.S. states were introducing their own privacy bills. The most prominent, the California Consumer Privacy Act (CCPA), took effect this month. Two others, in Maine and Nevada, have been signed into law, and more than a dozen more have been proposed or are pending.

And there were some big—or at least big-sounding—penalties in 2019 for data privacy violations. In July, the Federal Trade Commission (FTC) hit Facebook with a $5 billion fine in connection with the Cambridge Analytica scandal, in which data on some 87 million of the social media giant’s users was harvested and passed to the now-defunct British data analytics firm.

The U.K.’s data protection authority fined hospitality giant Marriott $123 million for a data breach that compromised the personal data of as many as 383 million guests. And British Airways was ordered to pay a $230 million fine for violations of the GDPR, after a hack that compromised the data of about 500,000 customers.

There were some big—or at least big-sounding—penalties in 2019 for data privacy violations.

Consumer privacy has also generated some congressional theater, with members of the Senate and House figuratively pounding the table at hearings, in outrage over the collection, “sharing” (selling), and lack of security of that data.

An end to the golden age of surveillance?

So, what’s the reality? Are legislation, consumer outrage, and class-action lawsuits starting to erode what numerous privacy advocates and experts have called “the golden age of surveillance” in both the public and private sectors?

Not so much.

First, it still isn’t that painful to violate privacy laws, even when you get caught. The fines levied last year sound eye-popping to average people, but to the corporations, they aren’t. Facebook is worth an estimated $633.5 billion. A $5 billion fine, even a record one, doesn’t even crack 1% of that value: little more than a rounding error on its bottom line, and a (minor) cost of doing business.

And the settlement money doesn’t even go to the Facebook subscribers whose information was abused. It goes to—you guessed it—the U.S. government.

Beyond that, many companies have scrambled to get into technical compliance with GDPR and other privacy laws. Their efforts are most visible to consumers in the pop-ups on web pages that inform users that by continuing to use the site, they have consented to being tracked. But the massive and ubiquitous collection of data seems to have increased, rather than slowed.
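The logic behind those pop-ups can be sketched in a few lines. This is a hypothetical illustration, not any real site’s code: the names (`CONSENT_KEY`, `recordConsent`, `mayTrack`) are invented, and a real implementation would persist the choice in a cookie or localStorage rather than a plain object. The contrast with the “by continuing to use the site, you consent” banners is in the default: a privacy-respecting gate loads trackers only after an explicit opt-in.

```javascript
// Hypothetical sketch of a consent gate for tracking scripts.
// All names are illustrative; a real site would persist the choice
// in a cookie or localStorage instead of a plain object.

const CONSENT_KEY = "tracking_consent";

// Record the user's explicit choice.
function recordConsent(store, granted) {
  store[CONSENT_KEY] = granted ? "granted" : "denied";
}

// Privacy-respecting gate: trackers load only after an explicit opt-in.
// Note the default: no recorded choice means no tracking, unlike banners
// that treat continued browsing as consent.
function mayTrack(store) {
  return store[CONSENT_KEY] === "granted";
}

// Example flow
const store = {};
console.log(mayTrack(store)); // false: no choice recorded yet
recordConsent(store, true);
console.log(mayTrack(store)); // true: user explicitly opted in
```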

Consumer exploitation ‘out of control’

Earlier this month, the Norwegian Consumer Council (NCC) issued a 185-page report on mobile app data collection titled Out of Control: How Consumers Are Exploited by the Online Advertising Industry.

As the report put it, “every time we use our phones, a large number of shadowy entities that are virtually unknown to consumers are receiving personal data about our interests, habits, and behavior.”

Those “shadowy entities” are the “digital marketing and adtech industry,” which NCC said uses the collected data to “create comprehensive profiles about individual consumers … [that] can be used to personalize and target advertising, but also for other purposes such as discrimination, manipulation, and exploitation.”

“Every time we use our phones, a large number of shadowy entities that are virtually unknown to consumers are receiving personal data about our interests, habits, and behavior.”

The NCC’s study focused on 10 popular mobile apps, including those used for dating and for women to track their periods, which would, as the report noted, “generate highly personal data about sexuality, drug use, political views, and more,” not to mention location, age, and gender.

Surveillance being normalized

Then there was the Washington Post story just last month on how colleges and universities are surveilling their students, ostensibly for benign reasons like helping them become more responsible and raise their GPAs.

But some students and professors note that it is also likely to train them “to see surveillance as a normal part of living, whether they like it or not.”

One student observed on a message board, “Building technology was a lot more fun before it went all 1984.”

Which was, of course, 36 years ago. But it looks increasingly like George Orwell was right—he was just off by a few decades.

Students are being trained “to see surveillance as a normal part of living, whether they like it or not.”

‘Good’ companies still under fire

Even Apple, the corporate computer giant that has sought to make privacy one of its main selling points (and which got some laudatory press this month for designing phones that even the company can’t unlock, never mind law enforcement), is getting heat from global privacy advocates.

Apple came in at No. 6 on Slate’s list of the 30 most evil companies in tech. Blogger and activist Cory Doctorow, who participated in the survey, noted that “Apple won’t spy on you for ads, but they’ll help the Chinese government spy on its citizens to keep its supply chain intact.”

“Apple won’t spy on you for ads, but they’ll help the Chinese government spy on its citizens to keep its supply chain intact.”

Federal data privacy initiatives stalled

Meanwhile, there are intermittent initiatives at the federal level to restrain the world of surveillance. But so far, they aren’t going anywhere.

Sens. Mark Warner, D-Va., and Josh Hawley, R-Mo., introduced the DASHBOARD Act (Designing Accounting Safeguards to Help Broaden Oversight And Regulations on Data Act) last June. It was introduced and referred to committee on June 24. Since then, nothing.

Sen. Ron Wyden, D-Ore., introduced the Mind Your Own Business Act of 2019 on Oct. 17. It was referred to committee. Since then, nothing.

And while those bills are going nowhere, data collection continues to get more sophisticated and more intrusive.

The end of anonymity

The New York Times reported just this past week on a “tiny” company, Clearview AI, that has created a facial recognition tool that more than 600 law enforcement and intelligence agencies are using that “could end your ability to walk down the street anonymously.”

The way it works: “You take a picture of a person, upload it and get to see public photos of that person, along with links to where those photos appeared,” thanks to a database of “more than three billion images that Clearview claims to have scraped from Facebook, YouTube, Venmo and millions of other websites.”

Law enforcement officials said they have used the tool to solve crimes, including child sexual exploitation. But of course, “the tool could identify activists at a protest or an attractive stranger on the subway, revealing not just their names but where they lived, what they did and whom they knew,” the Times said.

Clearview AI has created a facial recognition tool that “could end your ability to walk down the street anonymously.”

Why is data privacy still in decline?

Isn’t this the kind of stuff that privacy laws were supposed to prevent?

Perhaps, but the reality is that so far, they don’t. Sammy Migues, principal scientist at Synopsys, said Clearview AI and other examples demonstrate that data collection and surveillance are not simply getting worse; they’re getting “way, way worse. Privacy is not improving, even incrementally, and there is no expectation that it will,” he said.

“It’s almost game over for privacy. This [Clearview facial recognition] has been possible for years, and almost certainly implemented—this is simply the first time we’re hearing about it.”

Why, with so much attention and legislation focused on more privacy, is the trend line still toward less privacy?

Consumers don’t understand—or don’t care?

Perhaps part of the reason is that while consumers tell pollsters they care about data privacy, they don’t seem to demonstrate it. Data privacy doesn’t generate the kind of massive marches on Washington that other issues do.

A recent survey by customer data orchestration firm Tealium, titled Trust Is Golden: How Brands Can Prioritize Privacy in the Age of Data, reported that 91% of respondents said they wanted strict privacy regulation, and nearly two-thirds said potential privacy regulations would affect the way they vote.

Yet nearly two-thirds said they didn’t know of any regulatory changes, current or upcoming, that would help protect online privacy. Nearly 70% said they hadn’t heard of the CCPA or GDPR, and only 10% had heard of both.

In other words, most respondents admitted they hadn’t bothered to learn anything about something they professed to care deeply about, including the name of the biggest international privacy protection law in history.

Do consumers really care about data privacy?

Julian Llorente Perdigones, director of product, data privacy, at Tealium, said he thinks the disconnect is because consumers “may not understand the inherent value of the data they’re giving to companies.”

And he agreed that consumers should educate themselves about both the value of their data and their rights under privacy laws. But he said companies bear responsibility as well, to analyze whether users are interacting with consent pop-ups.

If not, “that’s your opportunity to address it with a cross-departmental privacy team comprised of marketing, legal, and developers to create an experience that puts privacy as an asset, not a barrier to service,” he said.

Legislative progress is promising

John Verdi, vice president of policy at the Future of Privacy Forum, still sees room for optimism. He points to current and pending privacy laws that he said are creating incentives through “large penalties.”

“Other aspects of the law have brought greater transparency—leading companies kicked off 2020 by posting scads of CCPA ‘do not sell my personal info’ links—most of them work inside and outside California,” he said.

He acknowledged that the U.S. still lacks a “common-sense, baseline, comprehensive, federal privacy law,” but said he believes “a bipartisan group of lawmakers can come together” and get it done.

“The multiple federal privacy bills are signs of a healthy, active legislative process,” he said. “It is common for many bills to be introduced on a topic, and then for one or two leading candidates to emerge that incorporate elements of multiple bills.”

But beware a possible data monopoly

Migues sees a more dystopian future. While he does predict a backlash against surveillance, he thinks it will be generational. Older people who, he said, “avoided putting all our pictures and lives on social media have a much greater degree of privacy, but it’s still just a small difference once you consider that Google will have access to millions of health records, Apple has access to everything you do on mobile, ISPs have access to everything you do online, and so on.”

Even if you don't use social media out of data privacy concerns, Google will have access to millions of health records.

“The big story is which data hoarder will be the first to get such a monopoly that it can blackmail or otherwise intimidate all the other data hoarders,” he said. “That tipping point will be reached soon, and even people who think the book ‘1984’ was a blueprint for progress will be awed by the power it brings. That company will effectively be a new entity that is neither company nor country nor government, but will have more power and influence than all three.”

Learn more about security solutions to help you achieve privacy compliance

*** This is a Security Bloggers Network syndicated blog from Software Integrity Blog authored by Taylor Armerding. Read the original post at: