AdGuard: 'People Should Be Worried About Apple CSAM Detection'

Ad-blocking company AdGuard is the latest to weigh in on Apple's controversial plan to detect CSAM in iCloud Photos. The team is exploring ways to block it using its AdGuard DNS technology.

We are considering preventing the upload of the safety voucher to iCloud and blocking CSAM detection within AdGuard DNS. How can it be done? That depends on how CSAM detection is implemented, and until we understand it in detail, we cannot promise anything specific.

Who knows what this database could turn into if Apple starts cooperating with third parties? The database goes in, the voucher goes out. Either process could be obstructed, but right now we are not ready to say which approach is better, or whether it can be easily incorporated into AdGuard DNS. Research and testing are required.
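AdGuard has not said how such a block would actually work, and Apple has not documented which endpoint would carry the safety vouchers. As a rough illustration of the DNS-level approach the team describes, here is a minimal sketch in Python using the dnslib library: a resolver that answers NXDOMAIN for a blocked hostname and forwards everything else to an ordinary upstream resolver. The blocked hostname below is hypothetical, not a real Apple endpoint.

```python
# Minimal sketch of DNS-level blocking in the spirit of what AdGuard
# describes: refuse to resolve a blocked host, forward everything else.
# The blocked hostname is HYPOTHETICAL -- Apple has not published which
# endpoint would carry the safety vouchers.
from dnslib import DNSRecord, RCODE
from dnslib.server import DNSServer, BaseResolver

BLOCKED = {"voucher-upload.example.icloud.com"}  # hypothetical endpoint
UPSTREAM = "9.9.9.9"  # any ordinary recursive resolver

class BlockingResolver(BaseResolver):
    def resolve(self, request, handler):
        qname = str(request.q.qname).rstrip(".")
        # Block the exact name and any subdomain of it.
        if qname in BLOCKED or any(qname.endswith("." + b) for b in BLOCKED):
            reply = request.reply()
            reply.header.rcode = RCODE.NXDOMAIN  # pretend it doesn't exist
            return reply
        # Not blocked: proxy the query to the upstream resolver verbatim.
        raw = request.send(UPSTREAM, 53, timeout=5)
        return DNSRecord.parse(raw)

if __name__ == "__main__":
    server = DNSServer(BlockingResolver(), address="127.0.0.1", port=5353)
    server.start()  # serve until interrupted
```

A caveat worth noting: because Apple's design uploads the safety voucher alongside the photo itself, a coarse DNS block like this would almost certainly break legitimate iCloud Photos uploads as well, which is presumably part of why AdGuard says research and testing are required before committing to an approach.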

EFF Shares Statement on Apple Scanning for Illegal Content

This week we learned that Apple plans to move its scanning for child sexual abuse material onto users' devices. The move has been widely criticized, and the Electronic Frontier Foundation has shared its statement on the matter.

All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.