It's Too Late to Regain Control of Your Image

The best way to starve the surveillance state is to stop feeding it

The only thing worse than one person trying to stuff the genie back in the bottle is a committee doing it. They’re all clambering on top of one another, trying to jam this giant blue guy back into an itty-bitty space, thinking they have the advantage because, well, they’re a committee. Meanwhile, the genie crosses his massive arms, winks, and tells them, “It’s too late.”

The committee, or commission in this case, is a pair of U.S. senators who want to put a moratorium on federal agencies using facial recognition until they and a group of their smartest friends figure out how to protect citizen privacy and avoid turning America into a surveillance state.

As I said. Too late.

Privacy is dead. We are already being watched 24/7—and we willingly give up our own personal data every single day.

Leave My Face Alone

This latest facial recognition freak-out started when The New York Times revealed that Clearview AI scraped billions of photos off public social media accounts (and other sources) to feed its image-based crime-fighting system. Basically, law enforcement feeds an image into the Clearview AI system, and it immediately identifies the face and matches it with personal information about that person.

Facebook, Twitter, and others sent cease-and-desist letters to stop Clearview AI’s data scraping.

Perhaps in response to this blowback, Clearview AI tries on its website to make its intentions crystal clear. It says, in a nutshell:

  • It uses public information only
  • It’s focused on search, not surveillance
  • The system is designed to stop criminals and protect the innocent
  • All its data is independently verified for accuracy
  • It’s in full compliance with the law

If that all reads like “trying too hard,” you’re probably right. It’s also a bit of Clearview AI trying to get ahead of what’s coming.

News that Clearview AI is working with the FBI and with law enforcement agencies across, and even outside, the U.S., along with concerns over law enforcement’s use of Amazon’s Rekognition facial recognition system, has prompted calls for investigations. It also spurred Senators Cory Booker and Jeff Merkley to draft a bill that recommends a moratorium on the use of facial recognition until their new commission drafts guidelines and limits for its use.

Your Part in This

There was a time when it was hard to collect data and build databases. Back in the '90s, I remember trying to create databases for various product categories including appliances, technology, beauty products, and travel destinations. Most didn’t exist, and we had to painstakingly build them on our own.

Today, by contrast, personal information for billions of people, including names, addresses, ages, relationships, schools attended, and, most important, what they, their families, and their friends look like, is shockingly easy to gather and access.

By one measure, people post 350 million photos to Facebook each day. They also post at least 95 million photos a day to Instagram. Most of them are posted publicly. Of Instagram’s roughly 1 billion users, only a fraction keep their accounts private (one 2016 study estimated 300 million private accounts).

We have been sharing photos on social media for almost two decades. The rise of facial recognition technology coincided with our growing social media obsession. Starting in 2000, the National Institute of Standards and Technology launched its Face Recognition Vendor Test program. These early tests helped law enforcement and government agencies assess the efficacy and best uses of facial recognition. Four years later, Facebook launched. Two years after that, Twitter.

In 2014, The New York Times reported that the National Security Agency (NSA) was harvesting millions of images for its own facial recognition systems. Many of the photos came from, among other sources, social media.

As consumers, we appear to be operating on two separate but competing tracks. On one, there’s the unstoppable desire to share every part of our lives. On the other, our concerns about the abuse of our privacy and private data continue to grow. But like the rails of a track circling the globe, the two never intersect. Dire stories about our loss of privacy and living in a mass surveillance state have not, in any measurable way, slowed or impeded our sharing habits.

As I see it, those who deeply fear Big Brother tend to stay off social media and other services like Google, which use our data to drive ad targeting. These people are clearly in the minority.

However, when politicians pick up the cudgel of personal privacy, they give the distinct impression of doing something.

Booker and Merkley’s “Ethical Use of Facial Recognition Act” is 15 pages of rhetoric and concern for our privacy, a lot of good questions about the use of facial recognition, and a careful attempt not to actually impede law enforcement’s ability to do its job.

So What

Perhaps Booker and Merkley will succeed in at least keeping this facial recognition tool out of the hands of the FBI and other government agencies. It will do nothing to stop billions of people around the world from sharing images of themselves, their families, and their friends. Facebook will still scan and recognize everyone in those images, and many, if not most, users will tag their family and friends, potentially creating more data for Clearview AI and law enforcement.

If consumers were truly worried about their privacy, they’d leave social media today, demand that local governments shut down their CCTVs, and ask their neighbors to turn off their video doorbells. I don’t see that happening. For all the complaints about Facebook and other social media, people still like (even love) and use them every day, and there are those who appreciate what law enforcement can do with a quick match between a fuzzy image and a criminal.

It’s not much solace for those whose identities are stolen or who are unfairly targeted by law enforcement agencies, but I have a feeling most people are still willing to make the trade-off.

The genie is never going back in the bottle, so you may want to redirect your committee efforts.
