Apple Removes Mention of CSAM Detection on Child Safety Page, Code Remains in iOS


Redditor u/AsuharietYgvar spotted a change on Apple’s page that lists child safety features. The company has removed any mention of its controversial CSAM detection plans in iCloud Photos.

Update: In response to The Verge, Apple spokesperson Shane Bauer said that the company’s position hasn’t changed since September. Apple plans to move forward with the detection feature and eventually release it.

iOS 15 Child Safety

Apple announced the move in August as a feature coming in iOS 15. It would detect images of child sexual abuse material (CSAM) as they are uploaded to iCloud Photos. Detection required certain conditions to be met, such as the image's hash matching one in a database maintained by the National Center for Missing and Exploited Children (NCMEC).
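In broad terms, this kind of system checks an image's hash against a database of known hashes. The sketch below is a heavily simplified illustration of that idea only: Apple's actual system used NeuralHash, a perceptual hash designed to tolerate minor image changes, combined with on-device cryptographic matching, whereas this example uses an exact cryptographic hash for clarity. All names and data here are hypothetical.

```python
# Simplified sketch of hash-database matching. NOT Apple's NeuralHash:
# a perceptual hash tolerates resizing/re-encoding; SHA-256 is exact-match
# only, and is used here purely to illustrate the lookup step.
import hashlib

# Hypothetical stand-in for a database of known-image hashes (in the real
# system, derived from NCMEC-provided material).
known_hashes = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def matches_known_database(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the known-hash set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in known_hashes

print(matches_known_database(b"known-image-bytes"))  # True
print(matches_known_database(b"harmless-photo"))     # False
```

A set lookup like this is O(1) per image, which is why hash matching scales to photo-library sizes; the hard parts of Apple's design were the perceptual hashing and the privacy-preserving matching protocol, both omitted here.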

The web page had previously said:

Update as of September 3, 2021: Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.

Communication safety in Messages did still launch with iOS 15.2. The Messages app includes tools to warn children when receiving or sending photos that contain nudity. These features are not enabled by default. If parents opt in, the warnings are turned on for the child accounts in their Family Sharing plan.

Redditor u/AsuharietYgvar also claimed to have extracted the NeuralHash algorithm used for CSAM detection, and says the code is still present as of iOS 15.2. It remains unknown whether Apple plans to abandon the plan entirely or release it in a future version of its operating systems.


2 Comments
John Kheit

Apple did the right thing on what no one is properly reporting on. For the nude photos in iPhone feature, it gave parents a check box to opt in (or not) for looking after their kids. It gives parents the power to do this. Well done on this part:

https://support.apple.com/en-us/HT212850

Now let's hope they kill their Orwellian nightmare CSAM fiasco.

W. Abdullah Brooks, MD

Andrew: The global legislative fixation on encryption backdoors as an aid to law enforcement is not going away anytime soon, and child related sex-trafficking, predation and pornography will continue to serve as a cudgel against Big Tech to provide that backdoor at pain of penalty. Whatever Apple's response to this threat to user privacy, a principal feature of their platform, its success in preserving user privacy is heightened by it being proactive, where Apple can define the terms of that solution, rather than being reactive to terms set by a fragmented and mutually contradictory body of world legislators – a…