Apple and CSAM Updates – TMO Daily Observations 2021-08-09

Today’s show is bursting with guests and info as Bryan Chaffin, Charlotte Henry, AND Andrew Orr join host Kelly Guimont to discuss Apple’s CSAM changes and how they affect current iOS users.

3 thoughts on “Apple and CSAM Updates – TMO Daily Observations 2021-08-09”

  • I don’t think there’s anyone complaining about CSAM measures. Detractors rightly point out that whenever evil is being implemented, it’s always sold as “to protect the children”. I believe it was one of the reasons the FBI wanted backdoors to all iPhones – to protect children.
    Nobody complained when Apple started scanning “encrypted” messages for “illegal” activity.
    Nobody complained when phones gained the ability to read any text in any image, even handwritten text.
    Nobody complained when Apple said, what’s on your phone is private, but anything in iCloud is, at most, a warrant away.
    Nobody complained when Apple said it would comply with local laws.
    [Encryption of any kind is illegal where I live, the evil empire of… Australia. What must Apple be doing to comply with those laws?]
    FINALLY somebody looked at these fascinating features and asked, what could a malicious government do with these at the flick of a pen?
    Apple -could- make a phone that gave us location data without passing it on to anyone else. It chooses not to.
    I’m about done with tech that could serve -us-, but helps itself to more detail about our lives as every year passes.
    I think Bryan was coming to his senses toward the end of the discussion. This is not a healthy development. And the right people are sounding warnings. Take heed.

  • I’m not entirely in agreement here; I see no difference between scanning and reporting on a location/Bluetooth intersection history library and doing the same on a photo history library.
    Photos already scans our libraries with AI/ML models and we love the feature; Apple have just added a “known CSAM” model alongside the “fruit” and “car” models.
    I’m a fan of the child-user alert notification; to me this is the implicit content rating we should have, augmenting content-producer metadata. I’d be very happy with this filter covering any sexually explicit images, not just CSAM.
    Personally, I’m OK with the idea of reporting CSAM images to the authorities. I understand the matching is against a database of known CSAM images, presumably to curb the distributors of the images rather than the creators; a rough sketch of that kind of known-hash matching appears after these comments.

  • Kelly et al:

    Great discussion. I sense that Bryan and Charlotte were discussing two different issues that mightn’t share the same triggers, hence the discrepant responses.

    There might be an upcoming discussion of this topic in a future column (not saying by whom), given that there is as much to say about what Apple have done as there is about how the community reacted; that reaction is simultaneously heartening and disturbing, a curious sensation to be sure, and a social scientist’s dream. And there has been no discussion yet of potential mitigations and safeguards. So much material, so little time.

    Thanks, guys!
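
For readers curious about the mechanism the second comment describes, here is a minimal, hypothetical sketch of checking images against a set of known hashes. It is not Apple’s implementation: Apple’s announced system uses NeuralHash (a perceptual hash) together with on-device threshold cryptography, whereas this toy example uses a plain SHA-256 digest and made-up names (knownBadDigests, matchesKnownImage) purely to show that detection is a lookup against previously known material rather than a classifier judging new content.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for a database of known-image digests.
// The value below is a placeholder, not a real entry.
let knownBadDigests: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

// Compute a SHA-256 digest of raw image bytes. Apple's announced system
// uses NeuralHash, a perceptual hash that tolerates resizing and
// re-encoding; an exact cryptographic hash does not, so this is only a
// conceptual illustration of matching against a known-hash set.
func digest(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// The point the comment makes: detection is a lookup against known
// material, not an ML model judging brand-new content.
func matchesKnownImage(_ imageData: Data) -> Bool {
    knownBadDigests.contains(digest(of: imageData))
}

let sample = Data("example image bytes".utf8)
print(matchesKnownImage(sample))  // false for arbitrary data
```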
