Apple has removed any mention of its controversial CSAM detection plans in iCloud Photos, although the code remains in iOS.
Ad-blocking company AdGuard is the latest to weigh in on Apple’s controversial decision to detect CSAM in iCloud Photos. The team is pondering ways to block it using its AdGuard DNS technology.
We are considering blocking CSAM detection within AdGuard DNS by preventing the safety voucher from being uploaded to iCloud. How can it be done? That depends on how CSAM detection is implemented, and until we understand it in detail, we can’t promise anything specific.
Who knows what this database could turn into if Apple starts cooperating with third parties? The database goes in, the voucher goes out. Either process could be obstructed, but right now we are not ready to say which approach is better, or whether it can be easily incorporated into AdGuard DNS. Research and testing are required.
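For readers unfamiliar with how a filtering DNS service like AdGuard DNS could block an upload at all, the general technique is suffix matching: the resolver compares each queried hostname against a blocklist of domain suffixes and refuses to resolve matches, so the client never learns the server’s address. A minimal sketch of that matching logic follows; the blocked hostname is a hypothetical placeholder, not a real Apple endpoint, and whether voucher uploads even use a distinct, blockable hostname is exactly the open question AdGuard raises.

```python
# Sketch of DNS-level blocklist matching (suffix match), the core of
# how a filtering resolver could refuse to resolve an upload endpoint.
# The domain below is a hypothetical placeholder for illustration only.
BLOCKED_SUFFIXES = {"upload-vouchers.example.com"}


def should_block(qname: str) -> bool:
    """Return True if the queried name or any parent domain is blocked."""
    labels = qname.rstrip(".").lower().split(".")
    # Check every suffix of the name: a.b.c -> a.b.c, b.c, c
    return any(
        ".".join(labels[i:]) in BLOCKED_SUFFIXES
        for i in range(len(labels))
    )


print(should_block("foo.upload-vouchers.example.com"))  # True
print(should_block("www.apple.com"))                    # False
```

A resolver that gets a match would typically answer with NXDOMAIN or a sinkhole address instead of the real record; the hard part, as the quote notes, is knowing which names to put on the list and whether blocking them breaks the rest of iCloud.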
Hao Kuo Chi, 40, of La Puente, has agreed to plead guilty to four felonies, including conspiracy to gain unauthorized access to a computer.
Apple’s recent announcement that it will add a CSAM detection system to devices has angered many. Fight For The Future created a petition.
On Tuesday Corellium announced the launch of the Corellium Open Security Initiative. It will support independent public research of mobile security.
With iOS 15, Apple will add a system to detect child sexual abuse material (CSAM) uploaded to iCloud Photos. Apple has now published an FAQ to explain it.
Does the thought of your Apple device scanning your iCloud Photos make you uneasy? Good news! Here are four private alternatives.