Apple Delays Its CSAM Scanning ‘Features’

It seems the outcry against Apple’s announced Child Safety Features has not gone unnoticed in Cupertino. Apple has quietly confirmed that feedback on its plan to scan iCloud Photos libraries for Child Sexual Abuse Material (CSAM) has led the tech giant to delay the rollout.

CSAM Scanning Never Sounded Like a Good Idea

When Cupertino first made the announcement, the features of its Child Safety Plan came under fire. A wide range of people and organizations criticized the proposal: the Electronic Frontier Foundation, Facebook’s former security chief, and university researchers all voiced major concerns. Even a large number of Apple employees argued against the plan. Adblocking company AdGuard went so far as to say it had already begun investigating how to block the CSAM features using its DNS technology (the sketch below illustrates the general idea).
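For the curious, here is a minimal sketch of what DNS-level blocking of this kind could look like. It illustrates only the general sinkhole technique, not AdGuard’s actual implementation, and the blocked hostname is a hypothetical placeholder; AdGuard has not published which endpoints its filter would target.

    # Minimal sketch of the DNS sinkhole technique, for illustration only.
    # The blocked hostname is a hypothetical placeholder; AdGuard has not
    # said which endpoints its filter would actually target.
    import socket
    from typing import Optional

    BLOCKLIST = {
        "csam-upload.hypothetical-endpoint.example",  # placeholder, not a real Apple host
    }

    def resolve(hostname: str) -> Optional[str]:
        """Resolve a hostname, returning None (in effect, NXDOMAIN) for blocked names."""
        # Block exact matches and any subdomain of a blocked name.
        if hostname in BLOCKLIST or any(
            hostname.endswith("." + blocked) for blocked in BLOCKLIST
        ):
            return None  # the client never learns the real address
        return socket.gethostbyname(hostname)  # normal resolution

    if __name__ == "__main__":
        print(resolve("csam-upload.hypothetical-endpoint.example"))  # None (blocked)
        print(resolve("example.com"))  # resolves normally

A real DNS filter sits between the device and its upstream resolver and answers queries for blocked names with an empty or sinkhole response, so the device simply cannot reach the service.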

For its part, Cupertino offered several responses. The company published FAQs and supporting documents, and made executives available for interviews. Often, though, one piece of communication seemed to contradict another. Apple planned to roll out the features in the US in iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.

Apple’s latest statement does not give a date for the final rollout of the features. Quietly, Cupertino added an update at the top of its Child Safety page acknowledging the delay. The statement only says that it is going to take “additional time over the coming months to collect input and make improvements before releasing these critically important child safety features”.

Considering the dangers already identified with this technology, we can only hope that Cupertino takes this feedback seriously. Yes, child sexual abuse is among the gravest evils imaginable. However, we must address that critical concern thoughtfully, with an eye to how bad actors could abuse such measures.

If Cupertino is going to backtrack on its privacy stance and scan photos for CSAM, the iPhone maker absolutely needs to be 100 percent sure that its technology cannot be modified and used for other purposes.

2 thoughts on “Apple Delays Its CSAM Scanning ‘Features’”

  • Delay is a way to push it out of people’s minds. Apple needs to get this OFF YOUR DEVICE. It’s fine if they scan on iCloud, on their server, but they MUST GET IT OFF YOUR DEVICE.

  • It has astounded me how BADLY Apple handled this. It was bungled from day one. The design was wrong. The announcement was wrong. The clarifications were wrong. If there is a course in how NOT to handle corporate communications, this should be one of the examples. The idea of clamping down on inappropriate and even criminal images and material is laudable, but this was totally botched from day one. For starters, whoever decided it would be a good idea to prescan images on our devices before they are uploaded to Apple’s servers was an idiot. There is no way that would not have caused the uproar that followed.
