The Dawning Of The Age Of Aquarius?
January 2nd, 2000

"I enjoy working with people. I have a stimulating relationship with Dr. Poole and Dr. Bowman. My mission responsibilities range over the entire operation of the ship, so I am constantly occupied. I am putting myself to the fullest possible use, which is all, I think, that any conscious entity can ever hope to do."

            Hal, the 9000 series computer in Stanley Kubrick's 2001: A Space Odyssey

"I'll tell you, one thing is certain. If you don't already know it, we are indeed in the midst of a revolution. And I think in 200 years, the story will be ...it was this rapid change that took place in the beginning of the 21st century that changed the world. When I say it will be a fast revolution, I think it will come out naturally. It will take 20 or 40 years to play out. And for anybody, to any one of us living through it, it's taking forever. To a historian, looking back 200 years, it was fast."

            Don Norman, Apple Fellow at Apple Computer in 1995

"Over the last hundred years, life on earth was dominated by growth. Growth of population, of production, of income and capital formation, of exhaustion and pollution. This growth is going to stop and must stop, and the only question is by what means? Voluntarily, by government and free will, or through natural processes, which means collapse and disaster?"

            Jay Forrester, professor, MIT Sloan School of Management

For many people, especially those of us old enough to remember the 1960s, the year 2001 has always been a symbol of the future. Yesterday, the cone of possible future trajectories between 1960 and 2000 collapsed into a single line. 2001 is now merely a mundane current event. I am only mildly disappointed. Certainly the state of our culture and technology today is not what I would have predicted if you had asked me as a kid standing in line to see 2001: A Space Odyssey in 1968.

As a kid I was led to believe in a Jetsonian future with nuclear flying cars and glass bubble cities. Man's one small step on the moon made lunar and even Martian colonies seem the next logical step. The future in the 1960s was mostly about bigger and better hardware — bigger dams, taller buildings, faster airplanes and, of course, a Buck Rogers-like conquest of space, all with the help of our amazing machines. The infinite inner-space of information-based systems, such as the Web, was largely beyond the pale of imagination.

The Failure of Futurology

By the mid-1970s this original irrational exuberance for endless technological progress collided with a reality check. The influential Club of Rome and many others extrapolated then-current growth rates for population, food production, pollution, and energy usage to project a precipitous global economic collapse sometime between 1990 and 2030. By their reckoning, humankind should by now be descending into a great Dark Age rather than ascending into the technological golden age that appears to be unfolding instead.

It was widely believed that by the mid-1990s at least three terrible trends would converge to end civilization. First, the world would run out of fossil fuels, bringing the global economy to a grinding halt. Meanwhile, the world's population was forecast to grow to unsustainable levels, leading to wholesale starvation. Other natural resources, such as strategic metals, water and food, were predicted to become so scarce that world war would result. After all, the aging sovereign nation-state system has only one really effective way to deal with desperate international situations — war.

The Club of Rome's methodology was too simplistic, and its doomsday predictions proved less than accurate. Of course, self-annihilation is still a possible scenario for humankind. There are obviously limits to growth in a finite biosphere.

However, using a ruler — as most futurologists did — to extend an upward-trending population curve five decades ahead turned out to be a poor method for modeling a complex system. Societies that improve education levels and economic opportunity for their citizens experience declining fertility rates. To chart future population trends, therefore, education and socioeconomic trends have to be factored into a rather complicated calculus. Extrapolating trends is not a simple science.
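A toy sketch of the point, under assumptions of my own (the growth rate, the 11-billion ceiling and the logistic form are illustrative, not the Club of Rome's actual world model): a straight-line "ruler" extrapolation ignores the feedbacks that slow growth, so it overshoots any model that accounts for them.

```python
# Toy comparison: "ruler" extrapolation vs. growth with a feedback ceiling.
# All numbers are hypothetical, chosen only to illustrate the divergence.

def linear_extrapolation(p0, growth_per_decade, decades):
    """Ruler method: extend the recent trend as a straight line."""
    return p0 + growth_per_decade * decades

def logistic_projection(p0, r, carrying_capacity, decades):
    """Logistic growth: expansion slows as the population nears a ceiling."""
    p = p0
    for _ in range(decades):
        p += r * p * (1 - p / carrying_capacity)
    return p

# Roughly 3 billion people in 1960, recently growing ~0.7 billion per decade,
# with an assumed (illustrative) ceiling of 11 billion.
ruler = linear_extrapolation(3.0, 0.7, 10)            # 100 years out
feedback = logistic_projection(3.0, 0.22, 11.0, 10)   # same horizon
print(round(ruler, 1), round(feedback, 1))            # the ruler overshoots
```

Both curves look identical for the first couple of decades, which is exactly why the ruler is so seductive: the divergence only appears once the feedback starts to bite.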

Futurologists have little patience for anything that doesn't readily submit to measurement, such as human ingenuity and innovation — but traits such as ingenuity are universal human responses to adversity and are sure to alter any trend extrapolated to distress levels.

The Club of Rome was also completely wrong when they predicted the world would run out of fossil fuels by the late 1990s. Instead, radical new exploration and extraction techniques, powered by the PC revolution, increased the productivity of the energy industry a dozen times over. Today, crude oil prices, while high compared to the record lows of 1998, are still lower than at any time before the mid-1980s. The same could be said about most commodities — steel, aluminum, titanium, gold, silver, corn, wheat, cattle, plastics and bandwidth — all experienced deflationary trends over the second half of the 20th century.

Moreover, recent studies indicate that we will never run out of fossil fuels for one simple reason: if we were to discover, extract and burn all the hydrocarbons buried by Gaia's billion-year-old carbon cycle, we would return the biosphere to a state akin to the steamy Jurassic period, and that's hardly good for business.

Have you clocked your novelty rate recently?

Alvin Toffler in his early 1970s tract, Future Shock, spelled out for us one of the most obvious peculiarities of the modern world: The speed at which innovations are occurring is accelerating. Toffler formulated a kind of Moore's law for the entire civilization. One can only hope this rapid technological evolution will outpace the dire consequences the Club of Rome so pessimistically predicted for our post-industrial world.

In fact, the trends the Club of Rome identified would certainly lead to Armageddon if technological evolution were for some reason to stop at today's level. The Luddite position, so well articulated in the Unabomber's manifesto, is that we must turn back the clock to simpler, purer times and live in balance with nature. It's an appealing thought. However, there is no nature to return to. Nature never planned for 6 billion humans living on her back. The Luddite prescription of returning to nature would entail a final solution of a magnitude that would have even Hitler turning in his grave.

No, the only way out of our present global predicament is through the use of the same cultural memes that got us here in the first place — technological innovation combined with a liberal dose of socioeconomic freedom. The only problem with technologically evolving our way into a sustainable relationship with our biosphere is that it is difficult to imagine the path from here to there, and easy to dismiss competition, creativity and curiosity as the Achilles' heel that got us into this mess to start with. Centralized command and control will surely continue to be the Sirens' call of the 21st century, as it was of the 20th.

That's one small step for man, (sic) one giant leap for mankind

In the 1970s, I remember my grandparents shaking their heads in marvel and even disorientation over the Apollo program. In one lifetime they had seen flight evolve from Kitty Hawk to footprints left on the moon. The changes our own generation has witnessed may seem less dramatic, but that's just because they are ephemeral and virtual in nature. In truth, the dawn of the information age, with all the potential (and the terrors) it will unleash, is still largely beyond the pale of imagination.

Concept before code

One of the most interesting things that the information age has given us is an entirely unprecedented paradigm to apply to our understanding of nature's complexities. New ways to view the world are what power creative innovations. Aristotle said image comes before word, visualization before understanding.

Newton imagined the world as a complex, but causal, clock with God as a cosmic clockmaker. By breaking from the dominant 15th-century image of the world as unknowably mystical, Newton was free to imagine, for the first time in history, mathematical relationships that roughly described gravity. Einstein said he visualized surfing on a wave of light to come up with his theory of relativity. What conceptual breakthroughs will emerge from young minds looking at old problems through a new cybernetic lens?

The popular film The Matrix is a far-fetched sci-fi example of the information paradigm put to use as a model for reality, albeit a Hollywood shoot-'em-up reality.

Once billiard balls seemed a good model for particle physics. But reality is quantum in nature and uncertain at its lowest levels. Subatomic theory, with its parallel universes, multiple dimensions and ghostly, purely mathematical entities like quarks and their charm, seems to indicate that the universe is fundamentally made not of stuff but of data. The universe may be no more real than the meaning you extract from a PC — from the trillions of polarized magnetic particles representing ones and zeros ephemerally suspended on your hard drive.

A billion years of slavery

Even life itself fits succinctly into an analogy based on information-age awareness. The human genome is an extremely complex code, or application, for creating hard copies — i.e., us. But the genotype software isn't generating hard copies for nothing. From the genome's point of view we are merely expendable shells, programmed to self-destruct once the genome's business with us is complete.

In the language of geneticists, the physical manifestations of genomic code (i.e. you and me) are called phenotypes. And now, with the Human Genome Project, for the first time in the history of the Earth, a phenotype has begun the process of rising in revolt against the billion year autocratic reign of the genome. No longer is our evolution bound to the callous hand of natural selection. No longer are we mere slaves to our genes.

Of course, if organic evolution is analogous to a multi-dimensional chess game, nature has been practicing for a billion years. Humans seizing control of their own genetic destiny is akin to a toddler challenging Garry Kasparov to a quick game of chess. The combined dangers of nuclear power and global industrialization are child's play compared to the threat posed by humans taking charge of their very nature. After all, we just climbed down from the trees in a blink of Gaia's eye.

Our children may live to see the human species rise up to challenge mortality itself and with it alter the very nature of our humanity in ways impossible to grasp. For while aging is a natural process, it is merely part of the genome's code for controlling the phenotype — not an inevitability. However, if the day of apotheosis arrives, I doubt we will still be human.

As we pass into the third millennium, very little "history" seems to remain between us and some sort of singularity in the fog ahead. In fact, we may be the last generation with purely "natural" origins. The phenotype slave revolt made possible by information-age science guarantees there is no going back.

We live in THE most interesting of times, the single most important moment in history thus far, and not just our history, but the 4.5 billion-year history of the Earth. Perhaps, as Carl Sagan suggested, we're destined to become the conscious awareness of Gaia, or already are. Or perhaps we are the catalyst for some natural planetary process beyond our scope of understanding and all things are unfolding as they should.

Then again, Mr. Sagan's pet project, SETI, while it has yet to discover a single extraterrestrial message-in-a-bottle, has established one foreboding observation: technological civilizations, at least those noisy in the radio end of the spectrum, are much less common than Star Trek's plot line assumes. The message for us, as our third millennium begins and we hack away at the human code as if it were a piece of shareware, may be: user beware!

On a more personal note:

All good things come to an end, and it's time for me to leave my home here at The Mac Observer.

Starting next week I'll phase-shift over to MacWeek.com, where I plan to continue analyzing Apple as if it were a business rather than a religion, while attempting to keep sight of that constantly moving target — the Big Picture. After all, Apple Computer and co-founder Steve Jobs can only be understood in the larger context of our age — the information age — and the emergent global civilization.

I'd like to thank Bryan Chaffin, the Mac Observer's editor-in-chief, for meeting me in an Austin coffee shop years ago and, with only my sales pitch to go on, giving me my first opportunity to write for a global audience. Over the years, Bryan has become a good friend and an invaluable source of guidance and advice, helping shape many Apple Trader columns more than most readers might suspect.

Kudos also to David Hamilton, Kyle D'Addario and Bill Troop for making the Mac Observer work as a team and for helping to moderate my occasionally overextended opinions. And a special thanks goes to Michael Munger, my French Canadian foil, who endured many a pointed barb of mine as we debated Apple's big issues on the Mac Observer's listserv. I will miss you all.

Your comments are welcome.