The tech world has recently had its share of controversies and unflattering revelations about products and corporate decisions. These range from the health of Facebook’s platform, to reports linking Apple, Google, and social media to slavery, to the Facebook/Ray-Ban glasses, Instagram for Kids, and Apple’s CSAM prevention technology. Though each is different, what they have in common is either an adverse impact on users and society (Facebook’s platform, slavery) or a potential violation of expectations (CSAM). The net effect of these stories is to raise concerns about the tech we use and how it might harm both ourselves and others. Where there are differences of opinion, or a lack of information, clarity is essential if we are to protect ourselves.
In public health, we encounter controversies around mitigation measures and strategies for containing a pandemic, often against a backdrop of honest differences of opinion, disinformation, and outright lies and conspiracy theories. The public theater in which these play out is a global population, the majority of whom lack the educational and technical background to critically assess the merits of these controversies. This is less the case in the tech world, at sites like TMO, where most readers are engaged, tech-savvy consumers. Nonetheless, even in these venues it is important to perform a health check and discuss best practices in information hygiene: getting the facts, using those facts to assess benefits and risks, and determining what we can do to minimize risk. This is as true of tech products and services as it is of pandemics and plagues.
With respect to tech products and services, we need perspective on three distinct but inter-related subjects: the producing corporation, the benefit/risk ratio associated with the product or service, and what we can do to maximize benefits and minimize risks.
Let’s take those in order.
The Corporation and Managing Expectations
All corporations have a track record, which covers corporate philosophy, business model, partnerships, product range and focus, user satisfaction and social impact. Some of this is subjective, but much of it is either objective or consensus derived. Most of us will already have a sense of that corporation, based on prior experience, mindful that this can change over time. There is a tendency, particularly in the Apple community, to assign aspirational value, beyond the products, to the corporation itself. These aspirations can be unrealistic when applied to any corporation, particularly regarding sociopolitical issues.
For Apple, these expectations extend beyond privacy, a stance to which Apple has publicly committed, to human and civil rights. Though laudable, we need to remember that Apple (or any other company) is still a corporate entity under pressure to remain competitive, and will ultimately make decisions that protect its competitiveness, viability, and growth. There is a constant tension between profit and social good, and different corporations and their boards will find different balance points. One should not be surprised when a company adopts a solution that spares its market viability rather than a less compromising one. We should expect this, rather than social activism.
The Nature of Tech and the Balance of Benefit to Risk
Products come with different margins of safety. One needs to assess, based on one’s own practices and needs, whether the benefits of a product outweigh the risks, or more to the point, whether those risks are acceptable or unacceptable. This will vary by user.
One simplistic means of scoring risk is to categorize products by their required exposure to the internet, a proxy for exposure to other humans. This is a measure of the controllable versus the uncontrollable human factor: being able to manage one’s own risk versus being at the mercy of the risk-taking behavior of others.
The benefit/risk ratio is highest for products where no exposure is required, such as a word processing app on a personal device. The ratio decreases when we must collaborate with a team on an app or platform, since we have little to no control over other team members’ security practices (or honesty); in that case, our interaction with the device or service may be no more secure than that of its least reliable participant. The ratio may decrease further still on a social media app or professional service, where unknown persons or entities might have access to our activities or data, or even to the host server and its data stores. In short, the less personal control we have over the product or service, and specifically over our data, the greater the risk.
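The exposure-based scoring above can be sketched as a toy model. This is purely illustrative: the tier names, numeric scores, and the `benefit_risk_ratio` function are invented for demonstration, not a scale proposed in the article.

```python
# Hypothetical exposure tiers: the more required internet/human exposure,
# the higher the risk score. Values here are arbitrary illustrations.
EXPOSURE_RISK = {
    "local_only": 1,          # e.g. a word processor on a personal device
    "team_collaboration": 2,  # shared platform; depends on teammates' practices
    "public_social": 3,       # social media; unknown parties, hosted data stores
}

def benefit_risk_ratio(benefit: float, exposure: str) -> float:
    """Crude benefit/risk ratio: perceived benefit (say, on a 1-10 scale)
    divided by the risk score of the product's required exposure tier."""
    return benefit / EXPOSURE_RISK[exposure]

# The same perceived benefit yields a worse ratio as required exposure grows.
for tier in ("local_only", "team_collaboration", "public_social"):
    print(f"{tier}: {benefit_risk_ratio(8, tier):.2f}")
```

The point of the sketch is only the monotonic relationship: holding benefit constant, each step up in required exposure lowers the benefit/risk ratio, matching the ordering described above.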
Next: Tech Risk Assessment: A Three Dimensional Threat Matrix