A proposed amendment to the U.K. Online Safety Bill could force tech companies, including Apple, to deploy client-side scanning for CSAM.
Apple’s Communication Safety in Messages feature has been rolled out to the U.K., Canada, Australia and New Zealand.
The EU has proposed new legislation that would require Apple and other tech giants to detect, report, and remove child sexual abuse material.
Apple’s proposals to tackle Child Sexual Abuse Material (CSAM) have, you may have noticed, attracted some controversy. With the rollout of the new features now delayed, Wired had a look at what might come next.
It’s unlikely the company can win over or please everyone with what follows – and the fallout from its plans has created an almighty mess. The technical complexities of Apple’s proposals have reduced some public discussions to blunt, for-or-against statements, and explosive language has, in some instances, polarised the debate. The fallout comes as the European Commission prepares child protection legislation that could make it mandatory for technology companies to scan for CSAM. “The move [for Apple] to do some kind of content review was long overdue,” says Victoria Baines, a cybersecurity expert who has worked at both Facebook and Europol on child safety investigations. Technology companies are required by US law to report any CSAM they find online to the National Center for Missing and Exploited Children (NCMEC), a US non-profit child safety organisation, but Apple has historically lagged behind its competitors.
The controversy surrounding Apple’s proposed CSAM scanning features has not gone unnoticed: the iPhone maker announced it is delaying the rollout.
Why is everyone afraid to ask Apple a simple question? Since Apple contends it is only scanning your iPhone for photos you choose to upload to iCloud, why does Apple need to scan anything on your device at all? Apple can scan the photos once they are on its servers. The most obvious possibility is dystopian.
Think your iCloud Mail is safe from snooping? Think again: Apple has been scanning iCloud Mail for CSAM since 2019.
If you believe that worries over the dangers of CSAM scanning are rooted in misunderstanding, Princeton researchers have some news for you.
John Kheit digs into where Apple’s CSAM scanning is taking place, arguing that where Apple reads your files is super important for privacy.
John sees Apple reversing its commitment to privacy, which he feels has broken the hearts of many Apple fans, including his own.