Why Apple’s Image-Scanning Tech Isn’t at All Private

User privacy is at risk

Key Takeaways

  • Apple’s new policy against child sexual abuse material has caused controversy among users and privacy experts. 
  • The technology works by scanning images in iCloud for CSAM and using machine learning to identify explicit photos in Messages. 
  • Experts say that no matter how private Apple claims its scanning technology is, it ultimately opens a back door that could be abused.

Apple recently introduced a new technology to spot child sexual abuse material (CSAM), but it’s getting more criticism than praise from the privacy community. 

Although Apple has previously been hailed as one of the few Big Tech companies that actually cares about user privacy, the new CSAM-scanning technology introduced last week is throwing a wrench into that reputation. Experts say that even though Apple promises user privacy, the technology will ultimately put all Apple users at risk. 

"Apple is taking its step down a very slippery slope; they have fleshed out a tool which is at risk for government back doors and misuse by bad actors," Farah Sattar, the founder and security researcher at DCRYPTD, told Lifewire in an email interview. 

Apple’s Plan Isn’t Private 

The new technology works in two ways: first, by scanning an image before it is backed up to iCloud—if the image matches known CSAM, Apple receives a cryptographic safety voucher containing the match data. The second part uses on-device machine learning to identify and blur sexually explicit images children receive through Messages. 
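Apple has not published its matching code, but the pre-upload check it describes can be illustrated with a toy sketch: hash each image and produce a "voucher" only when that hash appears in a list of known material. (Apple’s real system uses a perceptual hash called NeuralHash plus threshold cryptography rather than an exact cryptographic hash; the `KNOWN_HASHES` set and `scan_before_upload` function below are illustrative stand-ins, not Apple’s API.)

```python
import hashlib
from typing import Optional

# Hypothetical stand-in for the undisclosed database of known-image
# hashes that Apple says would be distributed to devices.
KNOWN_HASHES = {
    hashlib.sha256(b"known-flagged-image-bytes").hexdigest(),
}

def scan_before_upload(image_bytes: bytes) -> Optional[dict]:
    """Hash an image before upload; only a match against the known
    list produces a report. Non-matching images yield nothing."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    if digest in KNOWN_HASHES:
        # In Apple's described design this would be an encrypted
        # safety voucher; a plain dict stands in for it here.
        return {"voucher": digest}
    return None
```

The privacy debate centers on who controls that hash list: the device owner never sees its contents, so the same mechanism could match any category of image the list is expanded to include.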


Experts are apprehensive about the Messages feature since it would effectively end the end-to-end encryption (E2EE) that Apple has championed. 

"Apple’s introduction of client-side scanning is an invasion of privacy as this effectively breaks E2EE," Sattar said. 

"The purpose of E2EE is to render a message unreadable to any party excluding the sender and recipient, but client-side scanning will allow third parties to access content in the event of a match. This sets the precedent that your data is E2EE…until it’s not."

In a recently published FAQ page addressing concerns over the new policy, Apple said it won’t change the privacy assurances of Messages and won’t gain access to communications, but organizations are still wary of the company’s promises. 

"Since the detection of a ‘sexually explicit image’ will be using on-device machine learning to scan the contents of messages, Apple will no longer be able to honestly call iMessage ‘end-to-end encrypted,’" the Electronic Frontier Foundation (EFF) wrote in response to Apple’s policy. 

"Apple and its proponents may argue that scanning before or after a message is encrypted or decrypted keeps the ‘end-to-end’ promise intact, but that would be semantic maneuvering to cover up a tectonic shift in the company’s stance toward strong encryption."


Potential for Misuse 

The primary worry of many experts is the existence of a backdoor that, no matter what Apple may claim, is still open to potential misuse.

"Though this policy is meant to only be applicable to users under 13, this tool is also ripe for misuse as there is no guarantee that the user is actually under 13. Such an initiative poses a risk for LGBTQ+ youth and individuals in abusive relationships as it may exist as a form of stalkerware," Sattar said. 

EFF said that the slightest bit of external pressure (particularly from the government) would open the door for abuse and pointed to instances of it already happening. For example, EFF said technologies built initially to scan and hash CSAM have been repurposed to create a database of "terrorist" content that companies can contribute to and access to ban such content.

"All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts," EFF said. 

Edward Snowden even condemned Apple’s new technology as a "national security issue" and "disastrous," and his organization, Freedom of the Press Foundation, is one of many that have signed a new letter calling on Apple to end this policy before it even begins.

The letter has been signed by more than 7,400 security and privacy organizations and experts, calling on Apple to halt this technology immediately and issue a statement reaffirming the company’s commitment to end-to-end encryption and user privacy.

"Apple's current path threatens to undermine decades of work by technologists, academics, and policy advocates towards strong privacy-preserving measures being the norm across a majority of consumer electronic devices and use cases," the letter reads. 

Time will tell how Apple plans to implement this technology despite the massive controversy surrounding it, but the company’s reputation for prioritizing privacy will almost certainly never be the same. 
