1. Crises and exploration for new resources have historically fueled innovation, and do so now.
2. Most innovations have benefited a privileged few before enabling mass benefit. This will continue.
3. Our hardware is reasonably mature; new major gains will come from AI/AR-enabled software.
4. The next ‘big things’ will affect our sensory perceptions of our bodies (health) and the world around us, including remote interaction in real time.
Following the two recent private-sector launches into sub-orbital space, one from Virgin Galactic and the other from Blue Origin, criticism has followed, not simply of 'billionaires taking joy-rides' but of spaceflight remaining out of reach for the masses and being tone-deaf to other, more pressing needs, ranging from the environment and climate change to fair wages and working conditions. Much of this criticism is thematically similar to that voiced by an earlier generation against the 'Space Race' writ large and the Apollo Lunar missions specifically.
And lest Big Tech feel left out, similar criticisms have been leveled at the major companies during the pandemic, notably Apple, for continuing to generate products and services that people still had to purchase at a premium price, at a time when so many had lost their sources of income.
While these arguments have merit, when weighed against the evidence of the circumstances surrounding our greatest periods of innovation and growth, they fail the balance test of truth.
The Fallacy of Waiting for the Right Moment
Our collective history, across cultures, is testimony to the time-honoured saying that 'necessity is the mother of invention,' a rendering from Jowett's translation of Plato's Republic, written in the 4th Century BCE and itself a study in human nature and governance.
We can find no period of profound growth in any culture or society that was not facing a crisis of famine, war, pestilence or plague, or other existential threat, or an imperative to extend its power (often to forestall the threat of one of the above) in order to expand its assets and overcome an identified vulnerability, such as building roads, bridges and aqueducts to new resources.
And specifically, when it comes to exploration, no society at any time in history has ever opted to wait until they had solved longstanding domestic problems before embarking upon exploration. In fact, the expectation of benefit has been a principal driver of exploration across recorded history. It is innovation and discovery that drive us forward out of crisis.
Crisis and Opportunity – The Crucible of Innovation
Tangible imperatives, whether in the form of threat or aspiration, drive humanity, whether as individuals or as a species, to ‘think outside the box’ and to innovate. As Samuel Johnson so memorably put it, ‘Depend upon it, sir, when a man knows he is to be hanged in a fortnight, it concentrates his mind wonderfully’.
An example of threat response is our centuries-old innovation to survive smallpox, and other lesser infectious diseases like cutaneous leishmaniasis, through controlled inoculation or 'vaccination.' Another, more recent example is radar, developed in Britain in the late 1930s against the threat of German aerial assault, and now repurposed to guide passenger and commercial transport safely from point to point.
An example of aspiration is the exploration of our planet through migration and settlement and, more recently, human and robotic space exploration in quest of knowledge that improves our understanding and management of our home planet. Another is the repurposing of rocketry and satellites, both designed for military purposes, for space exploration, global broadcasting and communications, and planetary science, to the benefit of all humanity.
More than any other incentive, however, it has been threat mitigation that has inspired some of our most widely adopted innovations, such as agriculture and animal husbandry, and with the expansion in size of human settlement, grain storage and accounting, in order to forestall starvation in lean times. Most of these, we now take for granted as standard practices, but they were innovations dearly purchased with the coin of human life.
And then there was iPhone
More recently, we have addressed illiteracy and its impact on information access through what early critics dismissed as an over-priced, transient toy: the iPhone cum 'smartphone,' which was initially, like so many innovations, directed at an affluent audience with disposable income. Though adopted at first as a mere adjunct to 'computer technology' (it was initially tethered to the then 'Mac ecosystem'), it became the supercomputer of the masses, thanks to its Mac-inspired graphical user interface.
In no time, and much to Steve Jobs's chagrin, free-market competitive forces and economies of scale took over, generic product prices decreased, and soon even poor rickshaw pullers in South Asia were carrying them about, tucked away in their lungis, and, courtesy of that GUI, could operate them to conduct their affairs and access information despite being illiterate. This was a society-transforming leveller that put new tools into the hands of the previously marginalised: banking, remote medical assistance, information, communication and entertainment (often pirated, but we digress).
We must accept an ugly truth: we have never been driven to solve the problems of the poorest or most vulnerable amongst us in the absence of collective and personal peril. Importantly, many of the tools we require for one problem were developed in response to a separate problem, or to an exploratory quest for new resources, and were then, through imagination, repurposed to solve longstanding, unrelated challenges.
By now, we should appreciate that this is the consistent pattern, driven by human nature. The difference today is that, given the relative maturity of our tech hardware, many new tech offerings face less market-force inertia and are rolled out in ever shorter intervals from conception to mass production.
Next: It’s about the Software, and The Next Big Thing(s)