iOS 15.2 Nudity Detection Will Be On-Device and Opt-In

Parents will be able to activate the feature on their kids' iOS devices

Apple will be pushing ahead with its nudity-detecting, child protection feature in the Messages app for iOS 15.2, but parents will have to turn it on.

When Apple first revealed its child protection features, they were met with a fairly critical response, resulting in a delay of the planned roll-out. The biggest privacy concern—Apple scanning iCloud photos for Child Sexual Abuse Material (CSAM)—is still on hold, but according to Bloomberg, the Messages update is slated for release with iOS 15.2. Apple says it won't be on by default, however, and that image analysis will happen on-device, so Apple itself won't have access to potentially sensitive material.


According to Apple, once enabled, the feature will use on-device machine learning to detect whether photos sent or received in Messages contain explicit material. Potentially explicit incoming images will be blurred, and the child will be warned before viewing them, or before sending an image that might be explicit.
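Apple hasn't published the classifier or the API behind this feature, but the general shape of on-device image analysis on iOS is well established. The sketch below is a hypothetical illustration using the Vision and Core ML frameworks, with ExplicitContentClassifier standing in for an imagined model bundled with the app; it is not Apple's implementation, only a picture of what "analysis happens on the device" can look like in practice.

```swift
import Vision
import CoreML

// Hypothetical sketch: "ExplicitContentClassifier" is a stand-in for any
// Core ML image classifier shipped inside the app. Everything below runs
// locally; the image never leaves the device.
func imageLooksExplicit(_ image: CGImage, threshold: Float = 0.9) throws -> Bool {
    let coreMLModel = try ExplicitContentClassifier(configuration: MLModelConfiguration()).model
    let visionModel = try VNCoreMLModel(for: coreMLModel)

    var isExplicit = false
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let results = request.results as? [VNClassificationObservation] else { return }
        // Flag the image only when the classifier is confident about the "explicit" label.
        if let match = results.first(where: { $0.identifier == "explicit" }),
           match.confidence >= threshold {
            isExplicit = true
        }
    }

    // perform(_:) runs the request synchronously on this device.
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
    return isExplicit
}
```

A flagged image would then be rendered blurred in the conversation until the child taps through the warning, which is a UI decision rather than anything the classifier itself handles.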

In both cases, the child will also have the option to contact a parent and tell them what's going on. In a list of Frequently Asked Questions, Apple states that for child accounts ages 12 and under, the child will be warned that a parent will be contacted if they view or send explicit material. For child accounts ages 13 to 17, the child is warned of the potential risk, but parents will not be contacted.
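Apple describes this behavior only at the policy level. The following is a hypothetical sketch of how that age-tiered decision could be expressed in code; it is not how Screen Time actually implements the rules.

```swift
// Hypothetical model of the age-tiered warning policy described above.
enum SensitiveImageWarning {
    case warnAndNotifyParent   // child accounts ages 12 and under
    case warnOnly              // child accounts ages 13 to 17
    case none                  // feature disabled, or not a child account

    static func policy(forChildAge age: Int?, featureEnabled: Bool) -> SensitiveImageWarning {
        guard featureEnabled, let age = age else { return .none }
        switch age {
        case ...12:   return .warnAndNotifyParent
        case 13...17: return .warnOnly
        default:      return .none
        }
    }
}
```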

Child protection warnings in Messages. Image: Apple

In the same FAQ, Apple insists that none of the information will be shared with outside parties, including Apple, law enforcement, or the NCMEC (National Center for Missing & Exploited Children).

These new child safety options for Messages should be available in the upcoming iOS 15.2 update, which is expected to roll out sometime this month, according to Macworld.
