Alien Technology Required to Beat the iPad

Hidden Dimensions

“I think the surest sign that there is intelligent life out there in the universe is that none of it has tried to contact us.” — Calvin and Hobbes

Apple’s iPad competitors are having a tough time right now, both in the marketplace and in the courts. It’s going to require, essentially, alien technology to surpass the iPad, not merely copy it.

Technology development is not just a matter of R&D, trial and error, and the learning curve. It’s also a function of how well the current level of technology can implement our cultural and technical norms. Here’s what I mean.

Flying Saucer

When Apple was first developing the Lisa and the Macintosh, the Xerox Star (technically the Xerox 8010 Information System) set the technology standard. And it wasn’t an accident — a lot of research at Xerox went into how people could better use a computer. The Star was the first commercial computer to combine the technologies we still use today: a bit-mapped display, windows, icons, menus, a mouse and pointer (the WIMP interface). The user interface metaphor was a crude virtual desktop, and Apple further refined that metaphor, commercially, in the Lisa and Macintosh.

The state of the art in graphics technology allowed for the desktop metaphor, something that was well conceived at the time. By that, I mean that the metaphor had a close correlation to how people really work in an office, and that metaphor was so strong that Apple actually dismissed the color graphics and games available at the time. The Lisa was a business tool.

The desktop metaphor was so primal, so appealing that Microsoft wanted very much to copy it. The thinking wasn’t so much “Let’s go beyond DOS” as it was “The Mac is the next Big Thing. We need our own equivalent.”

Of course Microsoft could have been original. They could have conceived of a virtual lifting crane to move documents around. They could have had documents tossed into a virtual office bonfire for destruction. But the thing is, we didn’t have cranes in our offices, and we didn’t start bonfires. We had desks, and we moved things around on our desks with our hands. When we were ready to get rid of a document, we crumpled it and tossed it into a trash can. The metaphor was so compelling that Microsoft was basically obliged to copy the Mac.

Innovation According to Tim Cook

On July 19th, Apple’s Tim Cook said the following:

“We have a very simple view here, and that view is we love competition. We think it is great for us and for everyone. But we want people to invent their own stuff. We’re going to make sure that we defend our portfolio appropriately.”

It was one of those Jobsian reality distortion field statements that sounds really spot-on but has a hidden agenda. It’s really hard to innovate once another company nails the technology-driven, state-of-the-art metaphor first.

Once again, extending the Xerox-Mac-desktop example, one could conceive of a new way of turning the page in an eBook to compete with the iPad. One could, say, blow on the page and have sensors pick that up. But that’s a difficult technology to implement with precision. Moreover, we typically turn a page in a real book by swiping. We may have to wet our finger or wear one of those rubber thimbles, but that’s the basic motion. (Another is the more complex act of carving out a single page at the top with our index finger and pulling it. But that’s imprecise and hard to simulate.)

The point here is that once Apple gets out front and implements the common, human gesture in software, trying to come up with a creative alternative is like Windows putting a bonfire on the desktop. Sure, it’s different, but it’s wholly unsatisfying and risks being a marketplace failure. Remember Microsoft Bob? Lesson learned.
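
To make that concrete, here is a minimal sketch of what the common, human page-turn gesture looks like once it is reduced to code, using Apple’s public UIKit gesture recognizers in modern Swift (this column predates Swift, so take it as illustrative, not as any shipping reader’s implementation; the ReaderViewController class and its page-turn methods are invented for the example):

    import UIKit

    // A hypothetical e-book reader view controller. The point is how little
    // code the "natural" gesture requires once the platform supports it.
    class ReaderViewController: UIViewController {

        override func viewDidLoad() {
            super.viewDidLoad()

            // Swipe right-to-left: advance a page, mirroring how a reader
            // flicks a physical page.
            let forward = UISwipeGestureRecognizer(target: self,
                                                   action: #selector(turnPageForward))
            forward.direction = .left
            view.addGestureRecognizer(forward)

            // Swipe left-to-right: go back a page.
            let backward = UISwipeGestureRecognizer(target: self,
                                                    action: #selector(turnPageBackward))
            backward.direction = .right
            view.addGestureRecognizer(backward)
        }

        @objc private func turnPageForward() {
            // Advance the page model and animate the page curl here.
        }

        @objc private func turnPageBackward() {
            // Step the page model back and animate accordingly.
        }
    }

A blow-on-the-page alternative, by contrast, would need microphone or sensor heuristics with no platform support and no precedent in how people actually handle books.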

So Tim Cook’s statement is really a wolf in sheep’s clothing. Sure. Go ahead and innovate. Innovate yourself right into oblivion.

Alien Technology

I have fun thinking about a turning point in time in the past when the iPad would be considered extraterrestrial technology. For example, if you took an iPad back in time to 1991, 20 years ago, the astute Intel employee, having heard about the Apple Newton, would say, “I don’t know how they did that, and we don’t have the graphics technology to duplicate it, but I fundamentally understand what’s going on. Give me time and money.”

E.T.

On the other hand, if you took the iPad back to the U.S. Department of Defense in 1951, 60 years ago, the reaction would be that this device couldn’t even hold one vacuum tube, let alone several hundred million (transistor equivalents). As a result, it would be declared extraterrestrial, and global thermonuclear war could well have broken out over the physical possession of such a device — and its secrets.

Accordingly, we have to believe that, at some point in the future, the metaphor will change again — supported by correspondingly advanced hardware. We’ll have gone from command lines to windows and mice to tablets to… whatever the next major metaphor will be. Perhaps all we’ll have to do is think about turning the book page, and it will simply turn! We’re far away from a commercial implementation of that, but it could happen some day. And when it does, the whole metaphor will shift again. But it can’t be forced. Again, it’ll take time.

Stuck In the Present

Meanwhile, the driving cultural and technical norm, the gesture set that today’s hardware can implement, is the one- or multi-finger swipe, the two-finger pinch, the tap and hold, and so on. These are the closest analogs to how we treat physical objects, just as the desktop and pointer were the analogs to how we handled documents on a physical desk. Again, we’re limited by the hardware technology.
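
As a further hedged illustration, here is roughly how the rest of that gesture vocabulary is wired up with UIKit’s stock recognizers, again in modern Swift; the CanvasViewController and its handler logic are invented for the sketch:

    import UIKit

    // A hypothetical view controller demonstrating the pinch and
    // tap-and-hold gestures the column describes.
    class CanvasViewController: UIViewController {

        override func viewDidLoad() {
            super.viewDidLoad()

            // Two-finger pinch: scale content, the analog of pulling a
            // photo closer or pushing it away.
            let pinch = UIPinchGestureRecognizer(target: self,
                                                 action: #selector(handlePinch(_:)))
            view.addGestureRecognizer(pinch)

            // Tap and hold: "pick up" an object before moving or inspecting it.
            let hold = UILongPressGestureRecognizer(target: self,
                                                    action: #selector(handleHold(_:)))
            hold.minimumPressDuration = 0.5
            view.addGestureRecognizer(hold)
        }

        @objc private func handlePinch(_ gesture: UIPinchGestureRecognizer) {
            guard let content = gesture.view else { return }
            // Apply the live pinch scale, then reset it so each callback
            // contributes only its incremental delta.
            content.transform = content.transform.scaledBy(x: gesture.scale,
                                                           y: gesture.scale)
            gesture.scale = 1.0
        }

        @objc private func handleHold(_ gesture: UILongPressGestureRecognizer) {
            if gesture.state == .began {
                // The press has registered: visually "lift" the object here.
            }
        }
    }

Each recognizer maps a physical-world habit onto a screen with almost no ceremony, which is exactly why arbitrary alternatives feel forced.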

It’s going to take a lot of brain power and technology to supplant these original gestures that Apple conceived of and patented. They are timely, human and state of the art. For a company to heed Mr. Cook’s advice above, to re-conceive the tablet interface and build something radically different merely for the sake of being different, is to consign itself to failure. Strong, arbitrary variations on the common metaphor of the day keep a device from being compelling, magical and more than the sum of its parts. Designers know that. That’s why the WIMP interface survived and flourished.

As a result of all this, Apple will be fighting its competitors in the courts for the foreseeable future. There’s just no substitute for going with the technology du jour unless you have extraterrestrial technology at your disposal. That’s what it’ll take to beat Apple now. Surviving and picking up crumbs is the practical alternative.

Tim Cook must have had a good laugh with Steve Jobs after that earnings call.

________

Thanks to iStockphoto for the images.

Comments

Lee Dronick

Sticking with the Julius Caesar theme, I must say “worthy cogitations,” John.

Whatever the metaphor will be in the future, I can’t even imagine it at this time. It would have to be something that most of us could grasp quickly, or else it wouldn’t take off.

Gareth Harris

Good observations, John. A lot of success in evolution has to do with finding a niche, exploiting it and defending it - whether bacteria, mammals or computer companies. Sometimes you can see these territories forming; sometimes we miss the opportunities or someone else beats us there.

For example, I built an interactive spreadsheet for CRT terminals on a CDC6600 in the early 70s, but did not see carrying it over to a micro, like VisiCalc on the Apple. Missed that one. On the other hand, when we did a lot of work on big, expensive CRT graphics displays with light pens [CDC 250], everybody laughed when I said that this had to be made affordable by switching from vector to raster graphics.

Now I believe the NEXT BIG THING on the horizon is SPEECH - voice recognition. Everybody has been working on this for a long time and it is beginning to become realistic. I can envision some kid walking up to one of us fossils working on a keyboard and saying: “What’s that?”

Dogzilla

So the argument here seems to be that it’s hard to innovate, and that once someone has picked the low-hanging fruit, it gets even harder? Isn’t that breathtakingly obvious?

And all this seems to miss a couple pretty huge points:

1) You make it sound as if creating a device like the iPad or iPhone was as simple as being hit on the head by an apple. Nothing could be further from the truth - tablets were around for at least a decade before the iPad, and no one even *attempted* anything like what iOS delivered. Instead, the goal appeared to be to stuff a desktop OS into a new form factor. It really doesn’t take a leap of genius to see this - it just takes huge balls to dump your existing investments and metaphors and develop something new…which no company but Apple was willing to take on.

2) The examples you put forth seem cherry-picked. While iOS gives us direct manipulation of elements with real-world analogs, the overwhelming majority of user interaction is with elements that have no real-world analogs. For that matter, the same is true of the windowed user interface: there are no cursors, drop-down menus or windows in the real world that I’m aware of. Documents, folders and trash - yes, but while those are certainly important parts of the system, it’s pretty clear that Apple’s current move is actually deprecating those elements with real-world analogs. It’s even more extreme on the iOS side: beyond book-reading apps, calendaring apps and games, what interface elements can really claim to have real-world analogs? And yet 3-year-olds can intuitively operate the interface. This isn’t a case of simply looking around your desk and replicating what you see in code - it’s a result of a huge number of hours spent at the intersection of computer science, human interface design and artistic sensibilities.

There’s no reason another company couldn’t make something better, but it certainly won’t come from Microsoft, Google or Hewlett-Packard. Only three guys in a garage would have the drive and fearlessness to develop something so radically new that it would be an improvement on what Apple has created.

Lee Dronick

Now I believe the NEXT BIG THING on the horizon is SPEECH - voice recognition. Everybody has been working on this for a long time and it is beginning to become realistic. I can envision some kid walking up to one of us fossils working on a keyboard and saying: “What’s that?”

Or someone saying “What the ....?” :)

Remember the scene in the movie Star Trek IV: The Voyage Home where they came back to our time in San Francisco? Scotty is trying to communicate with a Mac: he issues a voice command and it doesn’t respond. Someone tells him to use the mouse, so he picks it up and speaks into it as if it were a microphone. Then they tell him to use the keyboard: “A keyboard, how quaint.”

Anyway, getting back to the joke: in a public place there would have to be some way for the computer to recognize only the voice associated with the login account.

Could we Bluetooth thought waves to a Mac?

jadams

Actually, along the lines of this argument, Microsoft has huge potential with their Kinect. But the question is: do they know that, and can they actually deliver on that potential…?

Nemo

John:  This isn’t as hard as you make out.  A competitor need only get a lawyer with the skills of Darrow, Lincoln, and Sir Thomas More to figure out a way to infringe on Apple’s IP with impunity.  Or, if that isn’t possible or advisable, simply get another Steve Jobs and have him put together another Apple to come up with the new metaphor.  It is that simple; no alien technology is required.

Quit making a mountain out of a molehill.

1stplacemacuser

Now I believe the NEXT BIG THING on the horizon is SPEECH - voice recognition. Everybody has been working on this for a long time and it is beginning to become realistic. I can envision some kid walking up to one of us fossils working on a keyboard and saying: “What’s that?”

Speech recognition for data input:  not a chance.  I can’t string five words together verbally without throwing in an “eh”, “er”, “uh”, or some other catchall phrase everyone has.

Also, when communicating verbally, a lot of the communication is in the non-verbal part.  The length of the pause, the intonation, the imperativeness of the voice… all those impart information to the listener, none of which is pertinent to a computer, nor would we want to have it pertinent.  We don’t want the computer to mistake our pause the wrong way.  Maybe we’re pausing because we’re thinking, or we’re interrupted.  Or, we want to make an impact with the pause.  There’s no way for the computer to determine that without additional clues.

Lastly, I can type much faster than I can talk, and for a longer period of time.  Unless one is a professional speaker, speaking for anything longer than a few seconds is difficult.

Gareth Harris

Two additional comments:
1 - Sir Harry Flashman: Now that you mention it, I do remember that scene. How quaint - but funny.

2 - 1stplacemacuser: I agree with you on some of the many problems. But we are making progress. Word recognition is better and we are beginning to get reasonable translations. We have come a long way since I programmed for the Linguistics Research Center at UT Austin on the CDC6600 in the late 60s, making one of the first database systems. Some of the results were hilarious. But the parts are gathering.

A real voice interface will involve the many things you mention, plus making IBM’s Watson achievement part of the bargain as well. Now to get Watson into your iPhone.

No matter how daunting, one can see it coming.

Dorje Sylas

Actually, along the lines of this argument, Microsoft has huge potential with their Kinect. But the question is: do they know that, and can they actually deliver on that potential…?

Actually, I think they are going to blunder into it rather than have a direct corporate drive. They did the critical thing, which was getting the drivers officially into Windows and an SDK out to researchers, developers, and tinkerers. They almost blew it by trying to fight these hacks, but someone in corporate must have been in the head at just the right time.

While MS itself fumbles and stumbles all over a touch interface they’ve never been able to master, other people will do their work for them. The digitized and responsive house is closer in their hands than in Apple’s. While Apple may have stuff integrating all over, it doesn’t interact with the user much. A Kinect Windows PC really could make a difference, especially with face recognition (something Apple seems very intent on doing).

mhikl

I agree. Once the first toilet was invented, what with gravity, an alternative was out of the question.

This is why Google’s Android and its twisted sisters have the fastest facsimilators on the go, with the copy button in the shape of an apple.

And some think “obvious” shouldn’t be patentable. If that were the case what a sorry state our streets would still be in.

mhikl

Dogzilla, I’m not exactly clear on what you are trying to say, but I’ll try with . . .
And it isn’t that it is so obvious that Palm, Microsoft, HP and scores of others continued to poke at screens with a stylus as the only way to interact intuitively.

And the point wasn’t that “obvious” is only obvious after the first one sees it. It’s that someone, Apple, got it right because they had not three but one special one, and a bunch of others, in a building in Cupertino, and presented the obvious.

xmattingly

I have fun thinking about a turning point in time in the past when the iPad would be considered extraterrestrial technology.

I’m blown away by this technology even compared to desktop computers from several years ago. There really is something fundamentally wonderful about using a computer that you hold in your hands and interact with your fingers.

I suppose the difference is that desktops are practical, whereas a tablet is visceral… so I would argue that scientists of the 50’s would certainly be confounded by today’s technology, but equally so by the interactive experience.

ipaqrat

Well, it’s one thing to be proud of Apple’s accomplishments and revel in the glory, but Apple might be screwed for the next UI paradigm. ‘Cuz Microsoft got the Kinect.

I was just thinking about moving the page-turn, zoom, spin, tap-drag… OFF THE SCREEN. Like the Xbox with a Kinect. Perhaps there’s a shred of prior art in eye-controlled weapons and the vertical-acting mouse…

But can you imagine being able to draw on an iPad using only the sensor that recognizes your hand holding a pretend pen? Virtual keyboard, air guitar, piano… Maybe you could advance ebook pages by winking right eye, left eye. Maybe a neural sensor that picks up subvocalizations.

WetcoastBob

There will be no catching-up by competitors.

The MS environment must put on their thinking caps and come up with something new.  I mean Really New!  Think beyond smart phones and tablets.

Andy Capp also made the same comment as C&H some years ago and I have been quoting him ever since. ;)

dhp

Speech recognition for data input: not a chance.

I agree, but for a more simple reason: it’s just not practical for dozens of people in a classroom, library, or crowded subway train (i.e. shared spaces that are unusually quiet or loud) to be speaking everything into their computers.

Dogzilla

I think the comment “It’s always obvious after someone else points it out” sums this up nicely. However, the flip side of this is shown in this article from Electronista ( http://www.electronista.com/articles/11/08/03/taiwan.builders.complain.they.cant.outprice.apple/ ) - apparently manufacturers can’t compete with the MacBook Air because Apple’s manufacturing process gives them economies that competing manufacturers can’t match with their existing production methods. Which begs the obvious question of why they don’t change their manufacturing methods if they want to compete. It’s that incredible aversion to risk and change that will keep Apple’s competitors from competing effectively - they only want to duplicate what Apple’s created instead of risking innovation.

wab95

John:

Very enjoyable read. The most critical part of this piece is your opening quote. Calvin and Hobbes have solved the Fermi Paradox.

True innovation and discovery are neither easy nor simple. Even if at times they occur by accident, an astute mind must still grasp the significance (e.g. penicillin).

It takes real genius to take a three-dimensional function and convert it to a two-dimensional interface that faithfully replicates the experience. It has been successfully done if a three-year-old, a user-naive or even a cognitively impaired person can interact with that interface as intended and reliably, consistently get the result they anticipated (e.g. the iPad). To so simplify such an analog requires a level of understanding that can reduce a function to its simplest truth, then reconstruct it using the crude medium of human invention, faithfully and for a critical mass of people.

This is why it takes deep understanding to communicate complex problems to children and laymen such that they truly comprehend them. This is why it takes true genius to create a work of art (literature, replication, music) that speaks to generations across time. They reduce truth to its elegant simplicity, such that everyone can appreciate it.

It is precisely because it is so simple, people often ask, ‘Why didn’t someone think of this before?’, or ‘Why didn’t I think of that?’. This leads to another point about creativity and the creative mind.

Intelligence, even brilliance, and creativity are not synonymous. Simply because an organisation has a deep bench of brilliant minds does not mean that it will be a hotbed of creativity and innovation. These cognitive capabilities do not even appear to reside in the same brain locus. Creativity requires the ability to think unconventionally, to see the same things that everyone else does, but to see them in a way that no one else has, to make associations between disparate observations or facts that no one else has made, or to borrow a phrase, to think the unthinkable.

It is that rare capacity, then, not only to think what no one else has thought, and then to simplify that idea to its pure notes and core truths, but to do so in a way that inspires spontaneous and timeless appeal, brings audiences to their feet in standing ovation, or leads people to queue up outside of your retail stores and send your revenues to record-breaking highs quarter on quarter.

Alien technology might be able to take on Apple and give it a run for its money as a creative counterweight (though Jeff Goldblum’s success using a Mac against aliens in ID4 suggests otherwise), but only if that alien tech reflects true genius.

Lee Dronick

Intelligence, even brilliance, and creativity are not synonymous.

You, and probably most of the readers here, might find this Salon article interesting: “Why we need more mentally ill leaders; New science shows that, when it comes to people in power, sanity is overrated. An expert explains why.”

wab95

You, and probably most of the readers here, might find this Salon article interesting

Many thanks, Sir Harry. Great article. The book will inspire many people who struggle with milder forms of mental illness.

I infer from this article that Washington is either rife with garden variety sanity or is overrun by the overtly insane.

mhikl

Which begs the obvious question of why they don’t change their manufacturing methods if they want to compete. It’s that incredible aversion to risk and change that will keep Apple’s competitors from competing effectively - they only want to duplicate what Apple’s created instead of risking innovation.

I agree, wholeheartedly: others must innovate to compete.

However, Apple’s advantages are monumentally stupendous.

:) That wad Apple has in its back pocket is catastrophically useful for Apple; not so much for others (understatement).

:) :) Apple plans beyond the next few quarters. Apple seems to have been working on their pad for years, possibly since the retiring of the Newton or around the time of Steve’s resurrection. The iPhone, supposedly, came about when some smarty saw that if the in-progress Apple pad was shrunk and the right stuff added, it would make a cool phone; thus, the pad was put on the back burner for the time being. Not only was the mobile phone industry taken by surprise, Apple may have been too.

:) :) :) The new, blue Reaganomic market structures, based on short-term gains, emphasised immediate returns and de-emphasised long-term planning, investment, people, saving, and natural, progressive development (the old culture). Apple, under the tutelage of its worthy dictator, seems to have been one of the few corporations that didn’t buy into this nonsense. Planning and tinkering is Apple culture and has been from its inception. It’s a dude thing for guys who like to tinker and improve on things most people see as already perfect or good enough. (It’s like an itch that never heals and can never be left alone; not everyone experiences it.)

:) :) :) :) And all this leaves the shareholders fickle and powerful. No one (except I and other Apple-like fanatictoids) will stick around to support and stay invested (monetarily and/or intellectually) in a company that is not bringing in (maximizing) returns. Look at the dizzy dance Apple stock has been in, forever. HP et al. cannot take the risks (you and I know they need to take) to succeed against the risks Apple took to put it where it is today. They are in the Milton Friedman short-term game. Time is not on their side. Time is the Grim Reaper.

(I would never presume to speak for JM, or any other TMO editor/contributor, but from many TMO original articles I get the sense they say, much more eloquently, most of what I have tried to say here.)

Lee Dronick

I infer from this article that Washington is either rife with garden variety sanity or is overrun by the overtly insane.

For most of them I am going to go with the sanity option. Last evening one of my Facebook friends quoted from Catch-22: “Some men are born mediocre, some men achieve mediocrity, and some men have mediocrity thrust upon them.”

I consider myself eccentric, not crazy.

geneking7320

Remember an episode of The Man From U.N.C.L.E. where someone invented a machine that could read minds? That would be the joint!
