
Farcical Recognition

It was bound to happen – welcome to the future! 

A mom took her daughter to see a show. AI facial recognition software recognized her, and she was unceremoniously escorted out by security.

Her offence? 

Her employer, a huge law firm (not her personally), is in protracted litigation with the venue's owner, MSG Entertainment, and MSG has a policy that precludes attorneys pursuing active litigation against the company from attending events at its venues.

This is rather unfair in the case of Kelly Conlon, who has never practiced law in New York, nor has she personally been involved in litigation against MSG Entertainment.

Of course, this isn’t something new – the use of facial recognition that is. Many airports are implementing facial recognition at their boarding gates, so you don’t need to scan your boarding pass or anything – just smile and you’re in. 

While that does make for a convenient process, it raises many questions as to who owns the database, how the information is stored and processed, and whether it will be shared with third-party advertisers, allowing them to target you with ads for melatonin once you land.

Personally, I think a pharmacy in the air would be a good business. I’d call it Dope Air. 

But I digress. 

Many event organisers are looking towards facial recognition. Ticketmaster has invested in Blink Identity, a startup that claims its sensors can identify people walking at full speed in about half a second, with the aspiration of removing tickets altogether. It remains unclear who owns the pictures of event-goers and how long the images will be kept on file.

In 2018, Taylor Swift stirred up some bad blood when it was revealed her security team was using facial recognition at concerts to identify potential stalkers. If you’re paying for security, they should be able to shake off stalkers without resorting to intrusive surveillance tech. 

Facial / farcical recognition is slowly spreading, and we don't know what kinds of applications will be created, or the long-term implications for individuals' privacy. Some uses will be fun or novel; others, not so much.

Dries Depoorter created The Follower, which uses open cameras and AI to find out how an Instagram photo was taken.
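
To give a sense of how little "a bit of time and coding" this kind of matching can take, here's a minimal sketch using the open-source face_recognition and opencv-python libraries. The photo filename and camera stream URL are placeholders for illustration, not anything from The Follower itself, and this is only the flavour of the approach, not Depoorter's actual code.

```python
# Minimal sketch: match a posted photo against frames from a public camera feed.
# Assumes the open-source face_recognition and opencv-python packages;
# the filename and stream URL below are placeholders, not from The Follower.
import cv2
import face_recognition

# Encode the face in the reference photo (e.g. a downloaded Instagram post).
reference = face_recognition.load_image_file("posted_photo.jpg")
reference_encoding = face_recognition.face_encodings(reference)[0]

# Open a publicly accessible camera stream (placeholder URL).
stream = cv2.VideoCapture("http://example.com/open-camera/stream.mjpg")

while True:
    ok, frame = stream.read()
    if not ok:
        break
    # face_recognition expects RGB; OpenCV frames are BGR.
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    for encoding in face_recognition.face_encodings(rgb):
        # True if the face in this frame matches the face in the posted photo.
        if face_recognition.compare_faces([reference_encoding], encoding)[0]:
            print("Possible match: this camera may show where the photo was taken.")

stream.release()
```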

If that’s something someone can pull together themselves with a bit of time and coding, imagine how much information could be captured by governments. 

But governments may be the least of our worries. 

As the case of Kelly Conlon highlights – if you think governments getting their hands on AI and facial recognition is a dangerous thing … just wait until private corporations roll them out!

*** This is a Security Bloggers Network syndicated blog from Javvad Malik authored by j4vv4d. Read the original post at: https://javvadmalik.com/2022/12/28/farcical-recognition/