What Might be Next For Apple’s CSAM Proposals?

Dangers of a CSAM detection scanning tool

Apple’s proposals to tackle Child Sexual Abuse Material (CSAM) have, you may have noticed, attracted some controversy. With the rollout of the new features now delayed, Wired had a look at what might come next.

It’s unlikely the company can win over or please everyone with what follows – and the fallout from its plans has created an almighty mess. The technical complexities of Apple’s proposals have reduced some public discussions to blunt, for-or-against statements, and explosive language has, in some instances, polarised the debate. The fallout comes as the European Commission prepares child protection legislation that could make it mandatory for technology companies to scan for CSAM. “The move [for Apple] to do some kind of content review was long overdue,” says Victoria Baines, a cybersecurity expert who has worked at both Facebook and Europol on child safety investigations. Technology companies are required by US law to report any CSAM they find online to the National Center for Missing and Exploited Children (NCMEC), a US non-profit child safety organisation, but Apple has historically lagged behind its competitors.

Check It Out: What Might be Next For Apple’s CSAM Proposals?

3 thoughts on “What Might be Next For Apple’s CSAM Proposals?”

  • Charlotte:

    Excellent article.

I like Wired’s nuanced approach, citing different points of view.

    At this point, as the article states, there probably is nothing that Apple can do that will satisfy everyone, whether customers, law enforcement or legislators. Threading a camel through a needle’s eye would be child’s play by comparison. And despite the outcry, expectations remain high all around.

While I concur with Matthew Green’s (JHU) point that Apple needs to consult a wider circle of expert opinion, a point that has been made more broadly in academic circles, and with Turrecha‘s (Santa Clara U) point about the need for transparency in the CSAM system’s behaviour and findings, as well as Green’s further point about public discussion of what Victoria Baines (cybersecurity expert) described as ‘complicated’, with which I also concur, the latter may be too late and a missed opportunity.

As we have seen in real time with the pandemic and its mitigation measures, the introduction of any new technology based on complex science (novel vaccines, exposure tracking) comes with risk, defined or speculative, however great or small; and once trust is compromised, by whatever means, no amount of expert opinion, fact or empirical observation can re-cork the genie of mistrust, fear and rage.

    Whatever Apple’s next move, and there will be one, given the conflicting interests of customers, child protective services, law enforcement and legislators, it will be consequential.

Apple will need to make that move more thoughtfully and deliberately than any it has made to date.

The article is a lark and tries to put forth the issue as if there really is some tension and nuance. There is not. Apple is free to scan anything that hits iCloud servers; no one would care. It’s what everyone else is doing.

Apple went a step further, invading the privacy of YOUR device without YOUR permission or knowledge. Apple simply has to not do on-device scanning, and anything that hits its servers, it may scan. It really is that simple.

The only reason it seems complicated is that Apple has been obfuscating what it’s doing through bogus ‘explanations’ meant to mislead.
