What Might Be Next for Apple's CSAM Proposals?

· Charlotte Henry · Link

Dangers of CSAM Scanning

Apple’s proposals to tackle Child Sexual Abuse Material (CSAM) have, you may have noticed, attracted some controversy. With the rollout of new features now delayed, Wired had a look at what might come next.

It’s unlikely the company can win over or please everyone with what follows – and the fallout from its plans has created an almighty mess. The technical complexities of Apple’s proposals have reduced some public discussions to blunt, for-or-against statements, and explosive language has, in some instances, polarised the debate.

The fallout comes as the European Commission prepares child protection legislation that could make it mandatory for technology companies to scan for CSAM. “The move [for Apple] to do some kind of content review was long overdue,” says Victoria Baines, a cybersecurity expert who has worked at both Facebook and Europol on child safety investigations. Technology companies are required by US law to report any CSAM they find online to the National Center for Missing and Exploited Children (NCMEC), a US non-profit child safety organisation, but Apple has historically lagged behind its competitors.

The CSAM Question No One Is Asking Apple

· John Kheit · Devil's Advocate

Apple's Spying Eye

Why is everyone afraid to ask Apple a simple question? Since Apple contends it is only scanning your iPhone for photos you choose to upload to iCloud, why does Apple need to scan anything on your device? Apple can scan the photos once they are on its servers. The most obvious possibility is dystopian.
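To make the on-device versus server-side distinction concrete, here is a minimal sketch of what "scanning" means in this debate, assuming a simple hash-set lookup. The function names and the SHA-256 stand-in are illustrative only: Apple's actual pipeline uses a perceptual NeuralHash matched via private set intersection, not a plain cryptographic digest.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for the known-CSAM hash database that, in Apple's
// design, ships inside the OS in blinded form. Placeholder values only.
let knownBadDigests: Set<String> = ["aabb...", "ccdd..."]

/// Runs on the device, before the photo is transmitted to iCloud --
/// the step the column is questioning.
func flaggedBeforeUpload(_ photo: Data) -> Bool {
    let hex = SHA256.hash(data: photo)
        .map { String(format: "%02x", $0) }
        .joined()
    return knownBadDigests.contains(hex)
}

/// The alternative the column argues for: the same comparison, run
/// server-side after the photo has already reached iCloud.
func flaggedOnServer(_ uploadedPhoto: Data) -> Bool {
    // Identical lookup; only the location of the computation differs.
    return flaggedBeforeUpload(uploadedPhoto)
}
```

The point of the sketch is that the comparison itself is location-agnostic: nothing in the lookup requires it to run on your iPhone rather than on Apple's servers, which is precisely the question being asked.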