Apple’s recent proposals to help tackle Child Sexual Abuse Material (CSAM) and new safety features in Siri and Messages have caused quite the stir, particularly the first, which involves on-device scanning to find known CSAM. Interestingly, the general outcry was significantly smaller when Apple partnered with Google to create a framework for COVID-19 contact-tracing apps that track users’ proximity to one another via Bluetooth.

To be clear, I (obviously) appreciate that the technology used is not the same, nor are the issues it is trying to counter. However, there are enough similarities between the two cases to make a valid comparison, not least that we can surely all agree that both CSAM and COVID-19 are, to put it mildly, bad things we would like to reduce or eliminate. Furthermore, technology is likely to be a key part of tackling these issues. The technology deployed so far does, though, in some ways infringe on participating users’ privacy. You may think that in one or both instances the loss of privacy is worth it, but it is happening.

Apple’s Role in Tackling COVID-19 and CSAM

I should also add that my view of the significance of the COVID-19 exposure notification technology is absolutely influenced by being in England. The app here, built on the back of Apple and Google’s collaboration, uses Bluetooth to detect whether someone has been in the vicinity of another person who has tested positive. If so, it warns them and asks them to self-isolate for up to 10 days. In this country, over 2.1 million self-isolation alerts were sent during July alone. The app has taken on a key role here, even though it has been confirmed that the alerts are not legally enforceable, unlike other parts of the NHS Test and Trace system, which are.
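To make the mechanism concrete, here is a minimal, purely illustrative Swift sketch of the idea behind the Apple–Google approach: devices broadcast rotating random identifiers over Bluetooth, keep a local log of identifiers they have heard, and work out exposure on the device once identifiers derived from a positive case’s keys are published. The names here (RollingIdentifier, ExposureLog) are hypothetical; real apps use Apple’s ExposureNotification framework rather than anything like this.

```swift
import Foundation

// Illustrative sketch only: hypothetical types standing in for the
// Apple/Google exposure notification design, not the real framework.
struct RollingIdentifier: Hashable {
    let bytes: Data   // random identifier broadcast over Bluetooth
    let heardAt: Date // when this device overheard it
}

final class ExposureLog {
    // Identifiers overheard from nearby devices, kept on this device only.
    private var heard: [RollingIdentifier] = []

    func record(_ identifier: RollingIdentifier) {
        heard.append(identifier)
    }

    // Called when identifiers derived from a positive case's published keys
    // become available; matching happens locally, without uploading the log.
    func isExposed(to publishedIdentifiers: Set<Data>) -> Bool {
        heard.contains { publishedIdentifiers.contains($0.bytes) }
    }
}
```

The salient design point, mirrored in the sketch, is that the log of encounters never leaves the phone; only anonymous identifiers from confirmed cases are distributed for local comparison.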

As for the new CSAM provisions, Apple will look to identify known CSAM in iCloud Photos during the upload process, using a scanning algorithm and a database of known-image hashes installed on the device. The matching is done locally and only involves images headed for iCloud Photos. The COVID-19 exposure logs are likewise stored locally (there was a row about uploading them in England, as Apple said doing so violated its rules).
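For illustration only, the sketch below shows the general shape of on-device matching against a database of known-image hashes before upload. It uses an ordinary SHA-256 digest and hypothetical names (UploadScanner, knownHashes); Apple’s actual system relies on a perceptual NeuralHash, blinded hash tables, and encrypted safety vouchers, none of which this toy example reproduces.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only: on-device comparison of an image queued for
// iCloud Photos upload against a locally stored set of known-image hashes.
struct UploadScanner {
    // Hypothetical database of known-image hashes shipped to the device.
    let knownHashes: Set<Data>

    // Returns true if the image's digest appears in the local database.
    func matchesKnownImage(_ imageData: Data) -> Bool {
        let digest = SHA256.hash(data: imageData)
        return knownHashes.contains(Data(digest))
    }
}
```

A real perceptual hash would tolerate resizing and recompression, which a cryptographic digest does not; the point here is simply that the comparison happens on the device, and only for images headed to iCloud Photos.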

While some people did raise concerns and questions about the COVID-19 API released by Apple and Google, and about some of the apps built on it, the initiatives were broadly accepted. It is arguable that this made Apple’s other recent changes easier for the company to implement.

Consistently Inconsistent

We live in an age where massive tech firms like Apple influence policy as much as any lawmaker. The responses to both COVID-19 and CSAM show this, with tech firms helping to provide the tools and infrastructure that support the work of public health and law enforcement agencies. While we welcome other privacy tools such as App Tracking Transparency and Mail Privacy Protection, our response to Apple’s involvement in major issues would benefit from being more consistent.

1 Comment
W. Abdullah Brooks, MD

Charlotte: Thank you for a thoughtful analysis and discussion. And you are correct, there are inconsistencies in the public response to both instances of what many will view as an invasion of privacy. It is important to note, however, that notions of entitled privacy vary by culture and society, with many more communal societies defining those boundaries quite differently than more individualistic Western societies. Privacy is often conflated with notions of individual liberty, so I am going to use them together. Indeed, many Asian and so-called traditional societies view elements that in the West would be seen as individual liberty…