Facebook has a plan to help fight CSAM and revenge porn. TMO writer Nick deCourville joins Ken with more info. Plus – The European Commission drops its objection to Apple’s In-App Purchase requirements, but has a message on messaging.
More Communications Safety, Updates and Tweaks, and Let's Get Inked
Apple’s child protection push pushes further, updates and tweaks for Apple services, and let’s all go get tattoos!
Apple Cancels Plans for CSAM Detection Tool for iCloud Photos
Apple is abandoning its planned CSAM detection tool for iCloud Photos but will refocus on its Communication Safety feature instead.
CSAM Law Can Force Messaging Apps Including Apple to Implement Client-Side Scanning of Messages
A proposed amendment to the U.K. Online Safety Bill could force tech companies, including Apple, to deploy client-side scanning for CSAM.
Apple Rolls Out Communication Safety in Messages to the UK and Other Countries
Apple’s Communication Safety in Messages feature has been rolled out to the U.K., Canada, Australia, and New Zealand.
EU Proposes New Regulation Requiring Apple and Other Tech Giants to Detect Child Sexual Abuse Material
The EU proposes new legislation that would require Apple and other tech giants to detect and report child sexual abuse material.
What Might Be Next For Apple's CSAM Proposals?
Apple’s proposals to tackle Child Sexual Abuse Material (CSAM) have, you may have noticed, attracted some controversy. With the rollout of new features now delayed, Wired had a look at what might come next.
It’s unlikely the company can win over or please everyone with what follows – and the fallout from its plans has created an almighty mess. The technical complexities of Apple’s proposals have reduced some public discussions to blunt, for-or-against statements, and explosive language has, in some instances, polarised the debate. The fallout comes as the European Commission prepares child protection legislation that could make it mandatory for technology companies to scan for CSAM. “The move [for Apple] to do some kind of content review was long overdue,” says Victoria Baines, a cybersecurity expert who has worked at both Facebook and Europol on child safety investigations. Technology companies are required by US law to report any CSAM they find online to the National Center for Missing and Exploited Children (NCMEC), a US non-profit child safety organisation, but Apple has historically lagged behind its competitors.
Apple Delays Its CSAM Scanning ‘Features’
The controversy surrounding Apple’s proposed CSAM scanning features has been noticed. The iPhone maker announced it’s delaying the rollout.
The CSAM Question No One Is Asking Apple
Why is everyone afraid to ask Apple a simple question? Since Apple contends it is only scanning your iPhone for photos you choose to upload to iCloud, why does Apple need to scan anything on your device? Apple can scan the photos once they are on its servers. The most obvious possibility is dystopian.
Apple Is Already Scanning iCloud Mail for CSAM
Think your iCloud Mail is safe from snooping? You should think again, because since 2019, Apple has been scanning iCloud Mail for CSAM.