Apple Cancels Plans for CSAM Detection Tool for iCloud Photos

The Dangers of the CSAM Detection Scanning Tool

On Wednesday, Apple announced that it won’t be moving forward with its previously proposed child sexual abuse material (CSAM) detection tool for iCloud Photos. The controversial tool would have allowed Apple to scan iPhones, iPads, and iCloud Photos for CSAM. But critics objected to the feature because of its serious implications for user privacy.

Apple Not Moving Forward With CSAM Detection Tool for iCloud Photos

In a statement sent to Wired, Apple said that there are other ways of protecting children that won’t require companies to comb through personal data. Cupertino gave assurances that it would continue working with governments, child advocates, and other companies to help protect young people while preserving their right to privacy. In doing so, Apple hopes to make the internet a safer place not only for children but for everyone.

Looking back, when Apple first announced the CSAM tool for iCloud Photos in 2021, it hoped to help combat child exploitation and promote safety, urgent issues that the tech community badly wanted to address. However, after receiving a wave of criticism from various privacy advocates, Apple halted the tool’s rollout. According to Apple, it would “take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Apple Is Still Concerned About CSAM

Wednesday’s announcement indicates Apple is abandoning the implementation of the tool, at least for now. However, that’s not to say that Cupertino is completely neglecting safety advocacy. Instead of the CSAM tool, Apple is refocusing on its Communication Safety feature, first made available in December 2021. Communication Safety is an opt-in parental control feature that warns minors and their parents when incoming or sent attachments in iMessage are sexually explicit, and blurs such images.

Apple’s recent statement about its decision not to move forward with the CSAM tool for iCloud came alongside its announcement of several new security features. More specifically, Apple said that it will bring end-to-end encryption to iCloud data, including backups, photos, notes, chat histories, and other services. The feature, called Advanced Data Protection, will allow users to keep that data secure from hackers, governments, and spies.

Apple’s Advanced Data Protection, however, has also drawn negative reactions from government agencies, particularly the FBI, which expressed deep concern about the new feature.
