Apple’s Communication Safety in Messages feature has been rolled out to the U.K., Canada, Australia and New Zealand.
EU Proposes New Regulation Requiring Apple and Other Tech Giants to Detect Child Sexual Abuse Material
EU proposes new legislation that would require Apple and other tech giants to detect and report child sexual abuse material.
What Might be Next For Apple's CSAM Proposals?
Apple’s proposals to tackle Child Sexual Abuse Material (CSAM) have, you may have noticed, attracted some controversy. With the rollout of the new features now delayed, Wired took a look at what might come next.
It’s unlikely the company can win over or please everyone with what follows – and the fallout from its plans has created an almighty mess. The technical complexities of Apple’s proposals have reduced some public discussions to blunt, for-or-against statements, and explosive language has, in some instances, polarised the debate. The fallout comes as the European Commission prepares child protection legislation that could make it mandatory for technology companies to scan for CSAM. “The move [for Apple] to do some kind of content review was long overdue,” says Victoria Baines, a cybersecurity expert who has worked at both Facebook and Europol on child safety investigations. Technology companies are required by US law to report any CSAM they find online to the National Center for Missing and Exploited Children (NCMEC), a US non-profit child safety organisation, but Apple has historically lagged behind its competitors.
Apple Delays Its CSAM Scanning ‘Features’
The controversy surrounding Apple’s proposed CSAM scanning features has not gone unnoticed. The iPhone maker announced it’s delaying the rollout.
The CSAM Question No One Is Asking Apple
Why is everyone afraid to ask Apple a simple question? Since Apple contends it is only scanning your iPhone for photos you choose to upload to iCloud, why does Apple need to scan anything on your device? Apple can scan the photos once they are on its servers. The most obvious possibility is dystopian.
Apple Is Already Scanning iCloud Mail for CSAM
Think your iCloud Mail is safe from snooping? You should think again, because since 2019, Apple has been scanning iCloud Mail for CSAM.
Dangers of CSAM Scanning: Princeton Researchers Already Warned Us
If you believe that worries over the dangers of CSAM scanning are rooted in misunderstanding, Princeton researchers have some news for you.
Apple’s On-iCloud or Is It On-Device CSAM Scan?
John Kheit digs into where Apple’s CSAM scanning is taking place, arguing that where Apple reads your files is super important for privacy.
Apple, You Broke Your Privacy Promises and Our Hearts
John sees Apple reversing its commitment to privacy, which he feels has broken the hearts of many Apple fans, including his own.