Apple and 17 other leaders in the tech industry are working to fight the growing problem of child sexual abuse and exploitation.
A report today from Motherboard details how Facebook and the FBI used a zero-day exploit for the privacy-focused operating system Tails to catch a child predator. I’m linking to it specifically because of this paragraph:
Facebook told Motherboard that it does not specialize in developing hacking exploits and did not want to set the expectation with law enforcement that this is something it would do regularly. Facebook says that it identified the approach that would be used but did not develop the specific exploit, and only pursued the hacking option after exhausting all other options.
That sets up a slippery-slope argument politicians will happily use, much like they do when Apple helps the FBI get into terrorists’ iPhones: “But you helped them before, why not again?” More fuel on the EARN IT fire.
Andrew Orr joins host Kelly Guimont for Security Friday! Hardware flaws, This Week in Who Has Your Data, and the latest in ending encryption.
Introduced by Senators Lindsey Graham and Richard Blumenthal, the EARN IT Act would force companies to “earn” their Section 230 liability protections by fighting online child exploitation.
Though it seems wholly focused on reducing child exploitation, the EARN IT Act has definite implications for encryption. If it became law, companies might not be able to earn their liability exemption while offering end-to-end encrypted services. This would put them in the position of either having to accept liability or remove encryption protections altogether.
My linked teaser from yesterday was separate from the EARN IT Act, but together they show that companies are being pressured on two fronts.
Attorney General William Barr wants tech companies like Apple to do even more to fight online child sexual abuse, this time with “voluntary standards.”
These voluntary principles are built on existing industry efforts to combat these crimes. Some leading companies have dedicated significant resources to develop and deploy tools in the fight to protect children online and to detect, disrupt and identify offenders. Although significant progress has been made, there is much more to be done to strengthen existing efforts and enhance collective action.
First, as I discovered last year, Apple has started scanning iCloud content for child sexual abuse material (CSAM). Many other companies do the same. Second, although encryption wasn’t explicitly mentioned, this is undoubtedly (in my opinion) a new development in the war on encryption. Child predators are one of the scary boogeymen the government uses to erode our privacy even further. I do, of course, support Apple scanning for this content, but it’s not a black-and-white issue.
Andrew wrote that Apple scans uploaded iCloud content for child abuse imagery, and a search warrant reveals it scans emails too.
Microsoft has created an automated tool codenamed Project Artemis that can help detect patterns of communication used by predators to target kids.
Building off the Microsoft patent, the technique is applied to historical text-based chat conversations. It evaluates and “rates” conversation characteristics and assigns an overall probability rating. This rating can then be used as a determiner, set by individual companies implementing the technique, as to when a flagged conversation should be sent to human moderators for review. Human moderators would then be capable of identifying imminent threats for referral to law enforcement…
Microsoft also helped develop PhotoDNA, an automated tool to detect child abuse images. Now it’s moving to text.
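To make the quoted description more concrete, here is a toy sketch of the general shape of such a pipeline: score conversation characteristics, combine them into an overall probability, and let each implementing company pick its own threshold for escalating to human moderators. This is not Microsoft’s actual algorithm; the phrases, weights, and threshold below are invented purely for illustration.

```python
# Toy sketch of an Artemis-style conversation-rating pipeline.
# NOT Microsoft's real technique: the signals and weights are hypothetical.

from dataclasses import dataclass

@dataclass
class Message:
    sender: str
    text: str

# Hypothetical risk signals; a real system would use a trained classifier,
# not a keyword list.
RISK_PHRASES = {
    "keep this secret": 0.4,
    "don't tell your parents": 0.5,
    "how old are you": 0.3,
}

def score_conversation(messages: list[Message]) -> float:
    """Combine per-message signals into a crude 0..1 overall risk rating."""
    score = 0.0
    for msg in messages:
        lowered = msg.text.lower()
        for phrase, weight in RISK_PHRASES.items():
            if phrase in lowered:
                score += weight
    return min(score, 1.0)

def needs_human_review(messages: list[Message], threshold: float = 0.6) -> bool:
    """Each implementing company sets its own threshold for escalation."""
    return score_conversation(messages) >= threshold

if __name__ == "__main__":
    chat = [
        Message("stranger", "How old are you?"),
        Message("stranger", "Keep this secret, ok?"),
    ]
    print(needs_human_review(chat))  # flags this conversation for review
```

The key design point from the excerpt survives even in this toy version: the classifier never makes the final call. It only decides which conversations a human moderator sees, and the sensitivity knob belongs to the deploying company.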
With connected devices and voice assistants becoming more common in our households, children are seeing them as friends. Sometimes, they might even see the device as a trusted confidant. Could that encourage legislators to make the devices report child abuse? Jeff Butts has been thinking hard about that, and suggests it might not be a terrible idea.