Our Consistently Inconsistent Response to Apple and Privacy, Highlighted by COVID-19 and CSAM


Apple’s recent proposals to help tackle Child Sexual Abuse Material (CSAM), alongside new safety features in Siri and Messages, have caused quite a stir, particularly the CSAM proposal, which involves on-device scanning to find known material. Interestingly, the general outcry was significantly quieter when Apple partnered with Google to help create a framework for COVID-19 contact tracing apps, which track proximity between devices.

To be clear, I obviously appreciate that the technology used is not the same, nor are the issues it is trying to counter. However, there are enough similarities between the two cases to make a valid comparison, not least that surely we can all agree that both CSAM and COVID-19 are, to put it mildly, bad things we would like to reduce or eliminate. Furthermore, technology is likely to be a key part of tackling both. Yet what has been deployed so far does, in some ways, infringe on participating users’ privacy. You may think that in one or both instances the loss of privacy is worth it, but it is happening.

Apple’s Role in Tackling COVID-19 and CSAM

I should also add that my view of the significance of the COVID-19 exposure notification technology is absolutely influenced by being in England. The app here, built on the back of Apple and Google’s collaboration, uses Bluetooth to detect whether someone has been in the vicinity of another person who later tests positive. If so, it warns them and asks them to self-isolate for up to 10 days. In this country, over 2.1 million self-isolation alerts were sent during July alone. The app has taken on a key role here, even though it has been confirmed that the alerts are not legally enforceable, unlike other parts of the NHS Test and Trace system, which are.
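The app’s core loop can be sketched very loosely in code. This is a simplified model of my own, not the actual Apple/Google Exposure Notification protocol: here devices simply store the rotating identifiers they overhear and later compare that local log against identifiers published by users who test positive. The real framework derives rolling identifiers from daily keys and scores exposures by duration and signal attenuation, none of which is modelled; the names `Contact` and `check_exposure` are illustrative inventions.

```python
from dataclasses import dataclass


@dataclass
class Contact:
    """One overheard broadcast, stored locally on the device."""
    rolling_id: str   # rotating random identifier heard from a nearby phone
    day: int          # day number on which the contact occurred


def check_exposure(local_log: list[Contact],
                   positive_ids: set[str],
                   window_days: int = 14) -> bool:
    """Return True if any recent contact matches a published positive ID.

    The matching happens entirely against the locally stored log,
    mirroring the point that exposure logs stay on the device.
    """
    latest = max((c.day for c in local_log), default=0)
    return any(
        c.rolling_id in positive_ids and latest - c.day <= window_days
        for c in local_log
    )


# Hypothetical data: two contacts, one of whom later tests positive.
log = [Contact("aa11", day=3), Contact("bb22", day=10)]
exposed = check_exposure(log, positive_ids={"bb22"})
```

In this toy model, `exposed` is true and the app would raise a self-isolation alert; with an empty `positive_ids` set it would stay silent.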

As for the new CSAM provisions, Apple will look to identify known CSAM in iCloud Photos during the upload process, using a scanning algorithm and a hash database installed on the device. This happens locally and applies only to images headed for iCloud Photos. The COVID-19 exposure logs are also stored locally (there was a row in England about uploading them, as Apple said doing so violated its rules).
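The shape of that on-device check can be illustrated with a sketch. To be clear about the assumptions: Apple’s actual system uses a perceptual “NeuralHash” and a blinded private-set-intersection scheme so the device never learns the database contents; the version below substitutes plain SHA-256 exact matching purely to show the flow, and `scan_before_upload` is a made-up name for illustration.

```python
import hashlib


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash (here: an exact cryptographic hash)."""
    return hashlib.sha256(image_bytes).hexdigest()


def scan_before_upload(images: list[bytes],
                       known_hashes: set[str]) -> list[int]:
    """Return indices of queued images whose hash matches the database.

    Only images about to be uploaded are checked, mirroring the point
    that scanning applies solely to photos headed for iCloud Photos.
    """
    return [i for i, img in enumerate(images)
            if image_hash(img) in known_hashes]


# Hypothetical data: a database containing one known image's hash.
flagged = b"known-bad-image"
database = {image_hash(flagged)}
upload_queue = [b"holiday-photo", flagged, b"cat-photo"]
matches = scan_before_upload(upload_queue, database)
```

A cryptographic hash only matches byte-identical files, whereas a perceptual hash is designed to survive resizing and recompression; that difference is one reason the real design is considerably more involved than this sketch.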

While some people did raise concerns and questions about the COVID-19 API released by Apple and Google, and about some of the apps built on it, the initiatives were broadly accepted. It is arguable that this made Apple’s other recent changes easier for the company to implement.

Consistently Inconsistent

We live in an age where massive tech firms like Apple influence policy as much as any lawmaker could wish to. The responses to both COVID-19 and CSAM show this, with tech firms helping provide the tools and infrastructure to support the work of public health and law enforcement agencies. While we welcome other privacy tools like App Tracking Transparency and Mail Privacy Protection, our response to Apple’s involvement in major issues would benefit from being more consistent.

One thought on “Our Consistently Inconsistent Response to Apple and Privacy, Highlighted by COVID-19 and CSAM”

  • Charlotte:

    Thank you for a thoughtful analysis and discussion. And, you are correct, there are inconsistencies in the public response to both instances of what many will view as an invasion of privacy. It is important to note, however, that notions of entitled privacy vary by culture and society; with many more communal societies defining those boundaries quite differently than more individualistic Western societies. Privacy is often conflated with notions of individual liberty, so I am going to use them together. Indeed, many Asian and so-called traditional societies view elements that in the West would be seen as individual liberty and privacy as antisocial, an unwillingness to carry one’s share or do one’s part for the community, unpatriotic or simply arrogant (catchwords often used in the local press). As a case in point, in one location where I worked, when I initially brought my own lunch to work and took it in my office so that I could have some quiet time, my colleagues were deeply offended. Meals were to be taken together by all at the same time, in the common mess, irrespective of how close we were. A Westerner might view this as an oppressive imposition, whereas in this setting, it was viewed as simply being respectful of your neighbour and workmates as a function of one’s social obligations. Similarly, wedding invitations and other social events are simply not refused. Every invitee attends. Full stop. The point being that there is not a common privacy or liberty standard. For those whose activities are global, this can be a challenge.

    And herein lies, at least in part, the reason for our inconsistent behaviour. Even in highly individualistic Western societies, the boundaries we set for discretionary participation or abstention vary according to our perceptions of shared peril; a situation which imposes a common bond such that the actions of one have a direct and unacceptably adverse effect upon another. In some cases, the perception of adverse effect is cultural (eg perceived social loss of face if one’s colleague fails to attend a wedding party); in other cases, it may be universal (the obligatory sharing of food amongst stranded survivors awaiting rescue; obeying traffic lights). In the UK, many of the same persons who might feel their privacy or freedom of choice unacceptably invaded by the COVID-19 exposure tracking app would not hesitate to be tracked (both themselves and their loved ones) for the purpose of evacuation due to pending disaster. The same persons who might feel it is their individual choice and a private matter to take a vaccine would likely not hesitate to turn off their lights, as did their parents’ and grandparents’ generation, in the face of night-time aerial assault, and would take umbrage with any neighbour who failed to do so, thereby exposing the entire neighbourhood to peril. Individualism yields to the sense of shared peril or harm.

    It is when we do not share a sense of peril or harm, or the peril applies to a limited group with whom we do not identify (eg a sub-population at high risk of a communicable disease, risk factors that we do not have), that we might feel imposed upon. Gun owners in the US, for example, who feel competent to safeguard their firearms and use an assault rifle responsibly might feel their individual liberty and safety is impinged if their assault rifle is banned, and may not identify with shooting victims, whom they perceive as simply not knowing how to protect themselves as well as they do.

    The notion of shared peril, where all are equally vulnerable, whether physical or social, and how that notion is shaped by culture and circumstance, strongly influences our perception of privacy invasion and infraction upon individual liberty. Hence, the inconsistent response. This is not objective. It is subjective. 

    Finally, in no society or culture do people appreciate having no choice, unless it is something that they are socialised to accept as the norm (eg obeying the law, paying taxes, grooming and dress). It is human nature to prefer to do things by consent, even if the choice is a foregone conclusion and a mere formality. Big tech should always default to ‘opt-in’ for anything that impinges on our privacy and sense of liberty to choose, allowing our consent to be the gatekeeper to any and all personal matters, but especially our data. And if there is to be an exception, then they should explain why, and justify it in terms of our shared peril. Failing that, there will be resentment and pushback, even if not universally.
