Apple Delays Anti-Child Abuse Technology

The tech was plagued by concerns from privacy experts and organizations

By Allison Murray | Updated on September 3, 2021, 02:15 PM EDT | Fact checked by Rich Scherr

After a lot of pushback from critics and users alike, Apple is delaying its anti-child abuse measures.

In August, the tech giant announced a new policy that uses technology to spot potential child abuse imagery in iCloud and Messages, but concerns quickly followed. Experts warned that even though Apple promised user privacy, the technology would ultimately put all Apple users at risk.

(Image credit: Justin Sullivan / Getty Images)

On Friday, Apple said it would delay the rollout of the technology altogether to make improvements and fully ensure user privacy.

"Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," Apple said in an updated statement on its website.

The child sexual abuse material (CSAM) detection technology was supposed to become available later this year as part of the iOS 15 rollout, but it's now unclear when, or if, the feature will debut.

The new technology would work in two ways: first, by scanning an image before it is backed up to iCloud. If that image matches known CSAM, Apple would receive that data. The other part of the technology would use machine learning to identify and blur sexually explicit images children receive through Messages.

However, after the new policy was announced, privacy advocates and groups said that Apple was essentially opening a back door that bad actors could misuse.

To address these concerns, Apple released an FAQ page shortly after announcing the CSAM technology. Apple explained that the tech would not scan all the photos stored on a device, would not break end-to-end encryption in Messages, and would not falsely flag innocent people to law enforcement.
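To make the first of those two steps more concrete, the sketch below (in Swift, purely illustrative) shows the general shape of "check a photo's fingerprint against a list of known hashes before it is queued for iCloud backup." The imageFingerprint function, the knownFingerprints set, and the flagging logic are all hypothetical stand-ins; Apple's actual design relies on its NeuralHash algorithm and cryptographic private set intersection rather than a plain on-device lookup like this.

```swift
import Foundation

// Hypothetical stand-in for a perceptual hash. A real system would produce a
// fingerprint that is stable across resizing and re-encoding; here we simply
// hash the raw bytes to keep the example self-contained.
func imageFingerprint(_ data: Data) -> Int {
    return data.hashValue
}

// Hypothetical list of known fingerprints. In Apple's design this database is
// supplied in blinded form by child-safety organizations, not stored in the clear.
let knownFingerprints: Set<Int> = []

// Illustrative check run before a photo is queued for iCloud backup.
func shouldFlagBeforeUpload(_ imageData: Data) -> Bool {
    return knownFingerprints.contains(imageFingerprint(imageData))
}

// Example usage with a tiny placeholder "photo".
let photo = Data([0x01, 0x02, 0x03])
print(shouldFlagBeforeUpload(photo) ? "flag for review" : "queue for iCloud backup")
```

Again, this is only a sketch of the matching idea the article describes; the privacy debate centers precisely on how (and where) such matching happens, which a simple lookup like this does not capture.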