Apple Addresses Concerns Over New Anti-Child Abuse Measures

A new FAQ page clears some things up

Apple is explaining more about how its new anti-child abuse measures work.

The tech giant announced a new policy last week that uses technology to spot potential child abuse imagery in iCloud and Messages. The Verge reports that Apple released an FAQ page in recent days that explains how the technology is used and what privacy protections are in place, after people voiced concerns over the new measures.


Apple said its technology is specifically limited to detecting child sexual abuse material (CSAM) and cannot be turned into surveillance tools. 

"One of the significant challenges in this space is protecting children while also preserving the privacy of users," Apple wrote on the new FAQ page. 

"With this new technology, Apple will learn about known CSAM photos being stored in iCloud Photos where the account is storing a collection of known CSAM. Apple will not learn anything about other data stored solely on device."

The technology works by scanning an image on the device before it is uploaded to iCloud Photos, comparing a hash of the image against a database of known CSAM hashes. If an image matches, Apple receives the data in the image's cryptographic safety voucher.
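The specifics of Apple's system are proprietary, but the general shape described above can be sketched in a few lines of Python. The sketch below is purely illustrative, not Apple's implementation: every name in it (SafetyVoucher, KNOWN_HASHES, check_before_upload) is hypothetical, and an ordinary cryptographic hash stands in for the perceptual hash a real system would use to survive resizing and re-encoding.

import hashlib
from dataclasses import dataclass
from typing import Optional

@dataclass
class SafetyVoucher:
    """Hypothetical record uploaded alongside a matching image."""
    image_id: str
    matched_hash: str

# Hypothetical database of known-CSAM image hashes. In practice these
# would be opaque digests supplied by a child-safety organization.
KNOWN_HASHES = {"<known-hash-1>", "<known-hash-2>"}

def image_hash(image_bytes: bytes) -> str:
    # A real system would use a perceptual hash; SHA-256 is used here
    # only to keep the sketch self-contained and runnable.
    return hashlib.sha256(image_bytes).hexdigest()

def check_before_upload(image_id: str, image_bytes: bytes) -> Optional[SafetyVoucher]:
    """Hash the image on device; emit a voucher only on a match."""
    digest = image_hash(image_bytes)
    if digest in KNOWN_HASHES:
        return SafetyVoucher(image_id=image_id, matched_hash=digest)
    return None  # non-matching images produce no signal at all

In Apple's published design, the vouchers are additionally protected by threshold secret sharing, so their contents become readable to Apple only after an account accumulates a set number of matches; that layer is omitted from the sketch for brevity.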

Groups such as the Electronic Frontier Foundation voiced their concerns about the technology last week, saying that the tech could be "repurposed to create a database of 'terrorist' content that companies can contribute to and access for the purpose of banning such content."


However, Apple’s detailed FAQ page addresses some of these concerns by laying out that the tech will not scan all the photos stored on a device, will not break end-to-end encryption in Messages, and will not falsely flag innocent people to law enforcement.

The Verge notes, however, that Apple’s FAQ does not address concerns about the technology being used to scan Messages, nor does it explain how the company ensures that scanning focuses only on CSAM.
