Photo App Pivots to Violating Its Users’ Privacy

Ever AI is accused of playing fast and loose with user privacy. An investigation alleges it’s been using billions of private photos from millions of users to train an AI facial-recognition product—aimed at enterprises, police forces and the military.

The app, formerly known as EverRoll, doesn’t get informed consent from its users, say critics. Since the story broke, the company has updated its privacy policy a little, but that’s hardly the point.

On the face of it, this isn’t a good look for Ever. In today’s SB Blogwatch, we go live in a cave, forever.

Your humble blogwatcher curated these bloggy bits for your entertainment. Not to mention: Puddles and Bigfoot.


MFW I Learned: WTF?

What’s the craic? Olivia Solon and Cyrus Farivar report, “Millions of people uploaded photos”:

 “Make memories”: That’s the slogan on the website for the photo storage app Ever. … Everything about Ever’s branding is warm and fuzzy.

What isn’t obvious … is that the photos people share are used to train the company’s facial recognition system, [which Ever sells] to private companies, law enforcement and the military. … Use of photos of unsuspecting people has raised growing concerns from privacy experts and civil rights advocates.

Ever AI promises prospective military clients that it can “enhance surveillance capabilities” and “identify and act on threats.” It offers law enforcement the ability to identify faces in body-cam recordings or live video feeds.

Doug Aley, Ever’s CEO … who joined Ever in 2016, said … the company decided to explore facial recognition … when he and other company leaders realized that a free photo app … “wasn’t going to be a venture-scale business.” [He] said that having … over 13 billion images was incredibly valuable.

In the previous privacy policy, the only indication that the photos would be used for another purpose was a single line: “Your files may be used to help improve and train our products and these technologies.” … When asked if the company could do a better job of explaining to Ever users that the app’s technology powers Ever AI, Aley said no.

When [we] told Aley that some of Ever’s customers did not understand that their photos were being used to develop … technology that eventually could wind up in the government’s hands, he said he had never heard any complaints.

Uh, perhaps that’s because they didn’t know? Katyanna Quach asks why they “didn’t read the 2,566-word privacy policy?”:

 Some unsuspecting customers, who might not have read the whole privacy policy document, are shocked and concerned that their data is being used to train facial recognition systems for unknown applications. … The software doesn’t just do facial recognition or face matching, it can also, apparently, detect emotions, predict people’s ages, gender and ethnicity too.

Doug Aley, CEO of Ever AI, told [me] “No user images [nor] information derived from those images, such as vectors or mathematical representations of the images, are provided to our enterprise customers.” Instead, it looks like the data taken from Ever is being used to train Ever AI’s facial recognition models and this software is then sold to its enterprise customers.

Wait. Pause. Carrie Mihalcik and Alfred Ng ask whether the “Ever app trained facial recognition tech on users’ photos”:

 CEO Doug Aley said the … report is inaccurate and that the company isn’t taking images collected via the Ever app and using them to train facial recognition on the Ever AI side.

Except that’s exactly what your privacy policy says you do do, no? And weren’t you just boasting how the “corpus” was “valuable”?

To summarize, Graham Cluley quotes the website come-on—“Free, unlimited private backup of all your life’s memories”:

 Sound too good to be true? Well, you might be right about that. … Ever isn’t being completely altruistic.

Ever decided two-and-a-half years ago to switch its business strategy – by embracing facial recognition and exploiting the 13 billion images its users had entrusted it with. But what it doesn’t seem to have done is clearly communicate that change of path to its millions of users, or give them the choice as to whether they wished to opt in.

Its use of the private photos of customers who have not given their explicit, informed consent to augment its dataset and improve its algorithms is really heinous behaviour. … When you pay nothing, the company couldn’t care less about you.

Ouch. And this Anonymous Coward alleges an allegation:

 In other words, their terms of service was lying to people for years. … Companies will lie about anything and everything with no punishments and no recourse.

We’re way past “Don’t store things in the cloud.”

Cue: much outrage and indignation. crunchygranola is not a lawyer:

 Looks Like Ever Owes Millions of People $$

So Ever appropriated other people’s private data (they didn’t agree to provide it for AI training) to create a commercial product. Seems like they owe all those people a major cut from their revenue.

Say, $2 to be dispersed among the users for every $1 pocketed by the company. A lawsuit seems in order.

Hold on, didn’t Ever say it obeys “all applicable laws”? Pascal Monett doesn’t seem reassured:

 Facebook is also in compliance with all applicable laws, look how well that train wreck of a privacy violator is going.

Ever wants to make its money from facial recognition, so it creates a photo-storage space and suckers people in with a free app to create a database it can use to train its statistical analysis machine. That is smart, no doubt there.

And of course, people have flocked to the thing like the sheep they are.

Baa. But Gavagai80 smells a rodent:

 This is like getting a big 20th century photo album out of your closet, handing it to someone asking them to re-organize the photos for you … and then complaining when you find out they actually looked at the photos and remember some of them. Perhaps they even gasp learned things about the world from your vacation photos which help them. Time to demand they either pay up or go into the chair for a memory wipe!

The faux-outrage is presumably just somebody’s attempt to cash in: a company made money, so now everybody wants a cut.

But Ainsley Harris—@ainsleyoc—thinks of the children:

 Journalists get permission before photographing children. But startups think it’s okay to use pictures of kids to train facial recognition tech, without bothering to ask?

Meanwhile, Grant Kruger—@gkroog1—waxes reductive:

 If you give a soulless modern company the opportunity to make money at your expense, don’t be surprised when they do.

And Finally:

I don’t even … what?


You have been reading SB Blogwatch by Richi Jennings. Richi curates the best bloggy bits, finest forums, and weirdest websites… so you don’t have to. Hatemail may be directed to @RiCHi or sbbw@richi.uk. Ask your doctor before reading. Your mileage may vary. E&OE.

Image source: Gerd Altmann (Pixabay)


Richi Jennings

Richi is a foolish independent industry analyst, editor, writer, and fan of the Oxford comma. He’s previously written or edited for Computerworld, Petri, Microsoft, HP, Cyren, Webroot, Micro Focus, Osterman Research, Ferris Research, NetApp on Forbes, and CIO.com. His work has won awards from the American Society of Business Publication Editors, ABM/Jesse H. Neal, and B2B Magazine.
