CSAM Law Can Force Messaging Apps Including Apple to Implement Client-Side Scanning of Messages


A proposed amendment to the U.K. Online Safety Bill could affect tech companies, including Apple. The amendment would require tech companies to remove child sexual abuse material (CSAM) even from end-to-end encrypted messages, which could force Apple and other tech companies to deploy client-side scanning of messages.

Proposed Amendment to the UK Online Safety Bill Highlights CSAM

The Guardian reported that a proposed amendment to the U.K. Online Safety Bill has been submitted. The amendment would oblige tech companies to detect CSAM even in end-to-end encrypted messages, including messages on WhatsApp, Facebook Messenger, Apple’s Messages app, and even Instagram direct messages.

The news comes as Facebook and Instagram prepare to implement end-to-end encryption despite opposition from the U.K. government. U.K. Home Secretary Priti Patel said the change in the law would balance the need to protect children with maintaining the privacy of online users:

Child sexual abuse is a sickening crime. We must all work to ensure criminals are not allowed to run rampant online and technology companies must play their part and take responsibility for keeping our children safe. Privacy and security are not mutually exclusive – we need both, and we can have both and that is what this amendment delivers.

Proposed Amendment Requires Companies to Find CSAM and Stop It

The proposed amendment would empower Ofcom, the U.K.’s communications regulator, to demand that tech companies deploy or develop new technology to help find abusive material and stop its spread.

The amendment tightens an existing clause in the bill which already gives Ofcom the power to require deployment of “accredited technology”. The change will now require companies to use their “best endeavours” to deploy or develop “new” technology if the existing technology is not suitable for their platform.

Client-Side Scanning Takes the Spotlight

Vetting end-to-end encrypted messages for child abuse material poses a technical difficulty. As it stands, the only known method Ofcom could look to is client-side scanning. Apple initially planned to roll out just such a system: its proposed CSAM-detection feature would have scanned photos on the device before they were uploaded to iCloud.
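In broad strokes, client-side scanning of this kind compares content on the device against a database of fingerprints of known abusive material before anything is uploaded. The sketch below illustrates only that matching idea; the function names and the use of a plain SHA-256 digest are stand-ins for illustration (real systems such as Apple's used a perceptual "NeuralHash" with cryptographic blinding, so that near-duplicate images still match and the database is not readable on-device):

```python
import hashlib

# Hypothetical database of fingerprints of known flagged material.
# Real deployments use perceptual hashes so edited copies still match;
# a cryptographic hash is used here only to keep the sketch simple.
KNOWN_BAD_FINGERPRINTS = {
    "8f434346648f6b96df89dda901c5176b10a6d83961dd3c1ac88b59b2dc327aa4",
}

def fingerprint(data: bytes) -> str:
    """Return a hex digest standing in for a perceptual hash."""
    return hashlib.sha256(data).hexdigest()

def should_flag(attachment: bytes) -> bool:
    """Client-side check run on-device, before the attachment is uploaded."""
    return fingerprint(attachment) in KNOWN_BAD_FINGERPRINTS
```

The key design point, and the source of the privacy debate, is that this check runs on the user's own device against content that would otherwise remain encrypted in transit.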

But Apple delayed the plan after concerns were raised about user privacy. The proposed amendment, if approved, could force other tech companies to implement client-side scanning similar to Apple’s to detect CSAM. Once the bill becomes law, tech companies will have no choice: failure to comply will incur a fine of up to 18 million pounds ($21.5 million) or 10% of a company’s global annual turnover, whichever is higher.
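The "whichever is higher" rule makes the 18 million pound figure a floor: for any company whose global annual turnover exceeds 180 million pounds, the 10% percentage applies instead. A quick sketch of that calculation (the function name is illustrative):

```python
def max_fine_gbp(global_annual_turnover_gbp: float) -> float:
    """Maximum fine under the bill: 18 million pounds or 10% of
    global annual turnover, whichever is higher."""
    return max(18_000_000, 0.10 * global_annual_turnover_gbp)

# Turnover of 100M pounds: 10% would be 10M, so the 18M floor applies.
print(max_fine_gbp(100_000_000))    # 18000000
# Turnover of 1B pounds: 10% is 100M, which exceeds the floor.
print(max_fine_gbp(1_000_000_000))  # 100000000.0
```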

One thought on “CSAM Law Can Force Messaging Apps Including Apple to Implement Client-Side Scanning of Messages”

  • Arnold:

    I suspected that we had not seen the back of this subject, and that the next iteration would involve legislators somewhere insisting on just this type of client-side scanning solution. The UK is likely the vanguard of a global movement. 

    I also suspect that Apple foresaw this inevitability, and rather than fly against strong user-side headwinds, opted to just let this issue reboot with the force of law for the entire industry. 

    Privacy advocates, should they feel that they have constitutionality on their side in their respective countries, will now have to take it up in the courts. 

    This is happening. 
