What Lies Beyond the Macintosh Desktop Metaphor?

Analysis

The Post-PC era, the era of iPads and other tablets, suggests that it is time to re-evaluate the archaic computer interface developed by Xerox and Apple 30 years ago. However, before we get too excited, there are lots of questions to ask and answer.

The interface we use on our PCs and Macs is generally known as WIMP: windows, icons, menus & pointer. The assumption is that the pointer is driven by some kind of device: a mouse, trackball or trackpad. We move a cursor to an object and perform some kind of click or drag.

Nowadays, we're all very familiar with iOS and how that differs from the OS X interface, and, naturally, questions arise.

  1. Is there a better way to interface with a desktop computer?
  2. Is iOS one of those ways?
  3. If not, what distinctions are important when considering the differences in the tasks and the way we use a tablet and a desktop or notebook computer?
  4. Is the movement to a more modern interface driven by user needs, a natural technical evolution, or developer considerations?
  5. What features of the traditional WIMP interface might be lost in a more modern interface and how important are they?


Like previous generations of interfaces, this too will pass. (Credit: Apple)

The Guardian

In the Star Trek: TOS episode "The City on the Edge of Forever," the Guardian says, "Since before your sun burned hot in space and before your race was born, I have awaited a question."

Actually, I have several. And so, here they are:

1. Is there a better way to interface with a desktop computer? The answer is yes, but not everyone is going to embrace it after using Macs since the beginning of time. Well, since 1984 anyway.

I recall the transition from DOS to Windows and Apple II to Macs. It didn't happen overnight, and not everyone was enthusiastic. Heck, there are still people today using DOS.

Gregg Keizer quotes Patrick Moorhead, principal of Moor Insights & Strategy: "Just as it took 10 years for DOS to get out of everyone's system, only when 'Modern' is completely ready will the desktop disappear. It will take five, six or seven years to bring all the important desktop apps into the Modern UI."

We got over those transitions. We'll get over WIMP.

The key thing to remember is that the original character-based interfaces (UNIX, Apple's SOS, DOS) were all that the CPU and graphics hardware of the day could handle. As hardware improved, it became natural to represent our window (lowercase "w") into the workings of the computer with a virtual window; hence Windows and the Mac OS. Today, with much more amazing hardware at our disposal, it makes sense to marry that technology with new ways of interacting with a computer: Siri, touchscreen interfaces, and perhaps 3D hand motions.

This evolution of the interface, driven by hardware and advanced software, is a natural thing, not something to be dreaded.

2. Is iOS one of those ways? For now, Apple, at least, acknowledges that different tasks are accomplished on desktop and notebook computers than on tablets. Tablets are limited in their size and battery power -- and therefore their CPU/GPU capabilities. So it doesn't make a lot of sense to brute-force a tablet-oriented interface, iOS, onto a much more powerful notebook or desktop computer.

For example, Macs are positioned differently, and so the notion of "gorilla arm" comes into play. We touch our tablets as they rest in our laps, but we don't want to hold an arm out for long periods of time to manipulate a more powerful device's upright display.

At least, that's what Apple is doing. Microsoft and its partners, however, seem to be rushing headlong into a unified touchscreen interface with Windows 8 and the tiled UI common across PCs and tablets like the Microsoft Surface. I asked Microsoft whether the plan is to eventually get rid of the classic Desktop, but the company declined to comment for this article. There is much discussion, for and against, of a vigorous Microsoft vision that deprecates the classic Desktop and moves smartly to the tiled UI exclusively.

It's unlikely that Apple will follow Microsoft. For now, Apple appears to be folding some of the best ideas of iOS into OS X without suggesting that iOS should be the future UI for all Apple computers. That process has been called the iOS-ification of OS X, but that doesn't mean that OS X becomes iOS or gets dumbed down. iOS currently doesn't have the flexibility (legacy apps), facilities (daemons), or hardware support (drivers) needed for serious desktop or notebook use. So one has to ask certain additional questions. Namely:

3. What distinctions are important when considering the differences in the tasks and the way we use a tablet and a desktop or notebook computer?

Steve Jobs once said that desktops are like trucks. We need trucks to do certain kinds of heavy lifting, but they'll become rare compared to the passenger cars most people drive. For example, to conduct certain kinds of system maintenance, advanced users (truck drivers) need a very detailed window into the file system, perhaps down to the UNIX permissions. On the other hand, many users only need to see some basic information, such as when they created a document.

So the real question is: when we decide what kinds of information need to be displayed, like file information in the Finder (indeed, even the data folders and OS files), are we dumbing things down in the eyes of advanced users who have come to expect that detail? The answer is probably yes, and so there will remain a need for apps like Path Finder for the truck drivers.
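To make the gap concrete, here is a minimal sketch (in Python, purely illustrative; the file path is just an example) of the difference between the truck driver's view and the casual view of the same file:

```python
import grp
import os
import pwd
import stat
import time

def truck_driver_view(path):
    """The detailed file-system window a power user expects:
    UNIX permissions, owner, group, and timestamps."""
    st = os.stat(path)
    print("Permissions:", stat.filemode(st.st_mode))  # e.g. -rw-r--r--
    print("Owner/group:", pwd.getpwuid(st.st_uid).pw_name,
          "/", grp.getgrgid(st.st_gid).gr_name)
    print("Modified:   ", time.ctime(st.st_mtime))

def casual_view(path):
    """The same file, reduced to what most users actually need."""
    print("Last changed:", time.ctime(os.stat(path).st_mtime))

truck_driver_view("/etc/hosts")  # example path present on any Mac
casual_view("/etc/hosts")
```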

On the other hand, what facility for the creation of content could be lost? The answer should be: none at all. So the real question relates to factoring apart technical information about the system from the means by which content gets created. iOS does that. When we confuse the two, we can become unglued and nervous about the developer's intentions.

Screen sizes come into play strongly. Tablets, so far, are perfect in the 7- to 10-inch range. That dictates the kinds of tasks we tackle. Very large, perhaps even dual 27-inch displays lend themselves to much more ambitious tasks and object manipulations. OS X remains superior there and isn't likely to lose that advantage.

Desktops have access to vast amounts of electrical power. That implies GPUs that are an order of magnitude or more beyond a tablet's. That capability will be invoked to our advantage rather than having a more limited tablet OS try to scale those heights. At least for now.

A next-generation user interface will use advanced hardware to make it easier, more natural and more intuitive -- more human -- to interact with a computer. For example, we don't generally have a strong urge to observe the DNA activity or digestive processes of our cats to enjoy their companionship. (However, we do pray they won't puke on the carpet.)

4. Is the movement to a more modern interface driven by user needs, a natural technical evolution, or developer considerations?

In the end, every computer maker wants to sell more devices and make customers happier. When technological development makes it possible to achieve a higher level of abstraction, that will come as a matter of course, just as WIMP replaced the command line.

I don't see this kind of evolution as some evil plot being perpetrated for the selfish ends of the OS designer. We simply can't move forward in technology without higher levels of abstraction and more human-like interfaces with computers. In fact, no OS developer is going to consciously take away our ability to create content. And if we seem to lose some element of nerdiness, we can always apply for that truck driver's license to hold us over while we size up the next-generation UI.

Heck, after a while, we'll get to like it. It's a natural technical evolution. It may just take 10 years to be able to say, "We knew this was the best idea all along!"

5. What features of the traditional WIMP interface might be lost in a more modern interface and how important are they?

One has to admit that a computer mouse is a crutch. In principle, it's just as unnatural as the mechanical typewriter of old. In both cases, it was the only technological means to achieve an end. The mouse-like pointer is going to disappear.

Windows and icons, on the other hand, have their uses. While the representation of our content may change a little, we can still expect to visualize, in some way, what we've created, even if we don't know (or care) where it's stored. iOS does that nicely. So, we've gone from the traditional desktop view to the iOS view.

[Screenshots: the traditional desktop presentation of content, followed by its iOS counterpart]

The real questions are: Can we keep from losing our work? Do we know where to find it? Can we recognize it when we find it? Can we back it up? If the answers are yes, then fine, and we let the computer take care of the housekeeping.
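As a toy sketch of that housekeeping (Python; the version folder and naming scheme here are invented for illustration, not how OS X actually stores versions), an app might quietly keep timestamped copies so those questions answer themselves:

```python
import shutil
import time
from pathlib import Path

# Hypothetical location; real systems (e.g., OS X's document versions)
# keep this store hidden from the user entirely.
VERSION_STORE = Path.home() / "Library" / "MyApp-Versions"

def autosave(document: Path) -> Path:
    """Quietly keep a timestamped copy so work is never lost."""
    VERSION_STORE.mkdir(parents=True, exist_ok=True)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    backup = VERSION_STORE / f"{document.stem}-{stamp}{document.suffix}"
    shutil.copy2(document, backup)
    return backup

def versions_of(document: Path) -> list:
    """Let the user find and recognize earlier versions on demand."""
    return sorted(VERSION_STORE.glob(f"{document.stem}-*{document.suffix}"))
```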

Touching apps to launch them, touching the screen to manipulate content (or asking Siri to do it), or something similar with a wave of the hand doesn't really make us stupid. It's just a faster, more intuitive way to get the computer to cooperate. The real issue here is that we often don't have a clear idea about how incremental evolution of UI technique will carry us forward; instead, we tend to invoke preconceptions of abruptness and awkwardness that lead to alarm. We jump to conclusions. We shouldn't do that.

On a desktop, the act of creation involves folding in content from lots of sources of different kinds, and so iOS with its data model and sandboxing probably can't handle that. That's why we'll see some minor iOS-ification without damaging the ability of OS X to create content.

In general, in the past we needed to "see" our content in certain ways because we had limited ways of "manipulating" the content. Seeing into the guts meant verification of an action. As we advance in our abilities to interact with the computer, the historical crutch of how we see content will subtly change. But that doesn't mean we won't "see" our content in ways that are useful to the task at hand.

Each year at Apple's Worldwide Developers Conference (WWDC) we get a glimpse of the direction Apple is headed. The changes happen incrementally, year by year. We don't fall off a cliff. We gain facility but don't lose capability in our OSes. But we do encounter change, even as we try not to grumble too much and keep the big picture in mind.

That's how we stumble forward, learning as we go. It's been that way since day one of personal computers.


Comments

Bosco (Brad Hutchings)

"One has to admit that a computer mouse is a crutch."

Sure, until you try to do pixel-precise selection of graphics, stick an insertion point between two letters, etc. Some tasks that are simple with mice and trackpads are a real challenge with touch.

And another place where tablets suck… Reading. You can prop a notebook up on your lap. Same with watching a movie.

I picked up a Chromebook (Samsung) to replace an aging Mac for my 90-year-old grandmother. I’ve been playing with it for a couple of weeks while they were on vacation. Not gonna lie. I’m in love. If it had touch and Android apps, it would be beyond perfect.

Gareth Harris

John,
Just like the difference between a truck and a passenger car, there is a difference in the uses of computing. In the early days, before timesharing, before UNIX and even before one computer per user, we did not require a user to operate a program. Comparing it with the machine shop I grew up in, it’s like the difference between a manual mill and a CNC work center.

I often watched the CDC 6600 under load in the 1960s and 70s, with jobs flowing through the queues on its screens. Some were long running NSF chemistry calculations which paid for the machine, some were classwork for UT Austin and some were tests of the early database system we wrote. Sometimes the phones rang almost immediately if I made a mistake and blew the users out of the water.

I wrote some of the graphics software for our installation and the one interactive workstation we had at first - a large CDC 250 display with a light pen. The huge screen had a monochrome display, but it was vector driven. I often did 36-48 hour stretches in front of it. It was an extremely productive setup costing about $60 million in 2013 dollars. A $500 iPhone is thousands of times faster today.

The point I am driving at is that, as much as I like my laptop, iPhone and iPad today, large displays and fine control of screen interaction are still needed, along with lots of free-running processes not requiring a human operator. Enough power to flip between many screens, fine enough point detail for CAD or massive manipulation of text or code, brute CPU power for multiple simultaneous compiles and links while also operating graphics CPUs - all this is needed for the developer, engineering or scientific user of today.

Using end user gear for this work is like hitching your Prius to a railcar. Laptops and iPhones are OK for small personal use but for real work I don’t want small doodads, I want some power!! If Apple won’t supply it, I will be looking elsewhere soon.

aardman

Touch interfaces can be easily adapted to desktops in a way that eliminates arm weariness; you just need to separate the graphic/visual element from the touch element, i.e., a trackpad that maps pixel for pixel to the monitor. (Jobs already said something to this effect, though he didn’t say anything about pixel mapping.) It would help to have an on-screen graphic indicator (such as fingerprint animations) that shows the exact spot on the screen that corresponds to the areas you are touching on the trackpad. That would aid fine graphic manipulation.
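The mapping aardman describes is simple to sketch (Python; the pad and screen dimensions are made up, and a real version would live in the OS input layer):

```python
def map_touch_to_screen(tx, ty, pad_w, pad_h, screen_w, screen_h):
    """Absolute mapping: each trackpad point corresponds to exactly one
    screen point, unlike the relative mapping a normal trackpad uses."""
    sx = tx / pad_w * screen_w
    sy = ty / pad_h * screen_h
    return round(sx), round(sy)

# A touch at the pad's center lands at the screen's center, where an
# on-screen "fingerprint" indicator would be drawn for the user.
print(map_touch_to_screen(80, 55, 160, 110, 2560, 1440))  # (1280, 720)
```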

John Martellaro

Gareth: One of the reasons I wrote this article was to assure power users that the evolution of our personal computer UI isn’t a drastic, abrupt divorce from our more intimate connection to a computer.  After all, here we are in 2013, and we still have access to the command line in OS X 10.8.x.

The Terminal app is not available in iOS precisely because the fashion in which we use a tablet makes it a waste of time. That means, I hope, that our facility to control our larger, more powerful devices will always account for the power and complexity they bring. Or else a phenomenal new UI technology will make it irrelevant, and we’ll marvel at how shortsighted we were.

In my opinion, we’re not likely to ever have to say that the Mac can no longer serve our creative (and power user) needs. Unless, of course, iOS, a very advanced UI and serendipity eventually surpass the Mac and finally put it to rest.  Either way, we as customers win.

Fear of change makes us believe we’ll lose something important, but the history of PCs & Macs suggests that that simply isn’t so.

wab95

John:

Very nicely articulated. Equally, I appreciate your taking pains to point out that not only does the history of OS evolution suggest that change is not abrupt, but also that two other features are preserved or even enhanced with change: it is not binary (i.e., a feature is either present or absent), and capabilities that are essential, or even simply important, to end users are not lost.

One of my psychiatry profs (the one who covered ‘life events’ and the stresses and challenges - indeed the anxiety - they inevitably provoke) pointed out that with any major event, e.g. marriage, birth, death, job or career change, physical location change, etc., comes destabilisation of prior and even long-standing relationships. Parties struggle with the uncomfortable recognition that they are oftentimes re-evaluating that relationship as it struggles towards a new equilibrium - a process that can be as painful as birth itself (something that, as a man, I can only imagine). The role of the psychiatrist is usually one of assisting the individual to recognise the normality of both change and the anxiety it provokes, to appreciate their own growth and maturation as they go through this life event, and to use these to their advantage rather than be paralysed by them.

I am confident that, were that same professor giving this course today (indeed, he probably still is), he would add ‘change in OS interface and/or feature-set’ to the tally of anxiety-provoking life events that result in stress and the dissolution of relationships - in this case, with a computer and/or its manufacturer. Indeed, I am confident that somewhere a psychiatrist is making a tidy sum by treating end users and IT professionals who’ve suffered the trauma of transformation from Classic Mac OS to OS X, the rise of the iDevices, the iOS-ification of OS X, and the pending fate of the Mac Pro (perhaps even the switch from Windows 7 to 8 and the relative decline in relevance of all things Windows/MS). Perhaps computer interface and hardware disorders will make an entry into the DSM and become a sub-specialty.

The point is, anxiety about the future is normal, including the future of our computing devices which have become essential and deeply personal tools in modern life. It is also normal that the less we comprehend the direction of change, and/or the less control we feel we have over that change, the greater the anxiety we will have about it. As with most life events, however, our apprehensions are generally far greater than warranted, if warranted at all.

I cannot imagine that Apple, or any competent computer company (or electronics company writ large), will knowingly discontinue features and functionality upon which important sectors of their client base depend, unless those sectors are so small that the cost-benefit calculus makes those features non-sustainable - an important caveat, particularly for TMO readers, who are not representative of the majority of OS X/iOS users. I would argue that this is particularly true of a company like Apple that actively monitors how their products and services are used.

Finally, I fully concur, and have argued before, that the user interface of future computers, irrespective of physical form factor, will increasingly permit more natural human interactiveness, which will include not only visual display, as do modern GUIs, but tactile and voice interchangeably, and even exploit visual and gestural communicativeness as we do with each other (I should be able to communicate with my computer with my eyes or even facial expression and have it respond accordingly). This cannot be accomplished within the confines of today’s most modern OS GUIs, and so they will continue to evolve to provide a much richer end user experience. In the end, equilibrium will be achieved (perhaps not for everyone, and for them there will always be comfort boxes sporting yesteryear’s interface - and with it, its limitations). For most of us, the turning point in achieving that new balance will be the enhanced functionality, productivity and, let’s be honest, the sheer joy we will derive from using that new interface.

Lee Dronick

I liked using a mouse, but have since switched over to mostly using the trackpad, which I find much better. I am also using Siri more and more and would like to see it on Macs. I like the “new” Apple chiclet-style keyboards, but wish that the wireless one had a number keypad. Anyway, I could see myself using an iPad type of keyboard to control my Macs. It could change between being a keyboard, a drawing tablet, a microphone for Siri, and other things.

We need choices of input devices and/or methods for our Macs. As Brad mentioned, there is a need for precise control when doing graphics and such; ofttimes in Illustrator I am moving a control point in small increments through the use of the cursor arrow keys.

David

Everyone seems to think the next revolution in computers is going to be via the user interface. I think instead that the next true revolution will be limited artificial intelligence providing a true agent that helps you accomplish tasks. How far are we really from being able to say (or perhaps more realistically, type) into a computer: “Collect information for me on available HD television sets and give me options for purchasing either online or at a nearby store”? The computer, acting as your agent, researches this information and presents you with accurate, up-to-date information tailored to your preferences. We have kludgy ways of doing this now, on Amazon and other sites, but it still takes too much personal time.

Think about the real novice computer users out there who only use a fraction of the capabilities of the platform. How great would it be to have an interface that truly interacted with the user:
COMPUTER: What would you like to do today?
NOVICE: Umm, I need to do a report for school on rainbows.
COMPUTER: I have several programs that could be used for reports. The most popular are MS Word and Apple Pages, which I have installed on me. Or I can download a new application from the App Store. What do you want to do?
NOVICE: How about Pages?
COMPUTER: Starting Pages now. Do you need a tutorial?

Or how about when you research a topic like rainbows…or average start-up costs for 24-hour breakfast-oriented restaurants? This type of search could take days or weeks (or more) of searching on Google, Chamber of Commerce websites, etc., separating irrelevant search returns from relevant ones, and finding the specific information in the mountain of info that is the WWW. When a computer can do this for you, without significant direct interaction, the world will change.

Think JARVIS from Iron Man. Doesn’t have to be nearly that interactive or smart to vastly change the experience for the better.
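The agent David imagines is far beyond a few lines of code, but the skeleton of that dialogue (matching a stated goal to an installed tool) can be sketched; in this Python toy, every name is invented:

```python
# Invented catalog: which installed apps can handle which task.
APPS_FOR_TASK = {
    "report": ["Pages", "MS Word"],
    "spreadsheet": ["Numbers", "Excel"],
}

def agent(request: str) -> str:
    """Match a plainly stated goal to a tool; a real agent would
    need natural-language understanding and planning on top."""
    for task, apps in APPS_FOR_TASK.items():
        if task in request.lower():
            return f"I have several programs for that. Try {' or '.join(apps)}."
    return "I can look for a suitable app in the App Store."

print(agent("Umm, I need to do a report for school on rainbows"))
```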

Gareth Harris

John,
Thanks for your reply. I feel that sometimes we initiate a new planned interface and other times we create one as a response to a need or new doodad [technical term].

My point was that we are now often thinking in a box, envisioning computer use only as laptops, iPhones, iPads, etc., with one person attached. Sometimes a simple change in technology provides a new path, even if it is only to incremental development.

For example, just as smartphones are replacing the point-and-shoot camera, iPods now provide an affordable, encapsulated mobile computer for all sorts of applications on a work floor as scanners and comm devices. And the advent of HDMI interfaces provides large, cheap displays for communication to a group or in public areas.

Sometimes you can change behavior. Most of us have been in work situations where people are supposed to enter data into a computer or keep up with paperwork. The major flaw in these approaches is that people feel it interferes with their work and resist, leading to GIGO. When you reduce the work interference to every employee waving a personal iPod over a QR code and listening for a beep, the data collection is almost painless.
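That workflow amounts to very little code on the device side; a rough sketch (Python; the payload format and IDs are invented):

```python
import time

shop_floor_log = []

def on_scan(employee_id, qr_payload):
    """Handle one QR scan: record who, what, and when, then confirm."""
    shop_floor_log.append(
        {"who": employee_id, "what": qr_payload, "when": time.time()}
    )
    print("\a")  # terminal bell stands in for the confirmation beep

on_scan("emp-042", "station-7/job-1138")
```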

It may be that the interface will morph, not because of GUIs, but because other modalities become affordable, like gestures using “Leap Motion” controllers, or plodding but continual progress in speech recognition.

We can be motivated to create new interfaces to provide more access to non technical users or also to increase the efficiency of power users. There is so much technology on the shelf right now that you can create almost anything you imagine and new doodads show up every day.

We need to expand out of the box and see new uses in new settings as well as for old power geeks like me. The work settings will drive the interfaces. As Einstein said, imagination is the key.

mrmwebmax

While I agree that, yes, the user interface can and should continue to evolve (I admit I was one who had a knee-jerk reaction against the Mac when it first came out), I have to second what Brad and Lee said: for certain kinds of work, the mouse is simply the best tool for the job. I’m working on removing the background from a very complicated photo of industrial equipment right now, and could never dream of trying to do this with touch. It’s frankly hard enough with the mouse! That said, I do enjoy much of the iOS-ification of OS X, and am perfectly comfortable doing heavy lifting in Mountain Lion.

Gareth Harris

Speaking of interfaces, here are a couple of links just for grins, then I will quit. I promise:

“The Macintosh uses an experimental pointing device called a ‘mouse.’ There is no evidence that people want to use these things.”
-John C. Dvorak, SF Examiner, Feb. 1984.

And, the Macintosh mouse history:
http://www.youtube.com/watch?v=PYi36eNoIxs

John Martellaro

Gareth:  That Dvorak comment is one of the classic quotes in the history of personal computing.  I wish I had remembered to include it in the article.  In any case, you’ve made it part of this article for all time.  Thank you.

webjprgm

You mention that separate interfaces like Path Finder will have to exist for power users. One problem with that is when a power user has to fix a normal user’s computer where that power interface is not installed. It makes sense to hide information normally, but the facility to access it must still be baked into the standard system in order for it to be serviceable. My strategy today is Finder + Terminal, with no reliance on third-party apps.

Note that keeping Terminal around means even with the GUI we’ve kept the text-based interface around for power users.  So whatever we do next, we’ll be keeping something from text and 2D window interfaces around.

One more function of a mouse is tracking what the user is interested in, not just touching things. It’s the “hover” feature, whereby buttons can glow, help text can pop up after a delay, and OS X knows which window you want scrolled by that scroll wheel.  So touching the screen, asking Siri, and waving a 3D gesture are not enough alone. A 3D pointing gesture or eye tracking could be used to track what the user is interested in / talking about. (Look at a text area and then wave your hand to scroll it. Look at a button and tell Siri, “Press the button.”)
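A toy event model of that suggestion (Python; every class and command here is hypothetical): gaze takes over the pointer’s interest-tracking role, and a voice command or gesture acts on whatever currently has that focus.

```python
class Widget:
    def __init__(self, name):
        self.name = name

    def press(self):
        print(self.name, "pressed")

    def scroll(self, lines):
        print(self.name, "scrolled", lines, "lines")

class GazeFocus:
    """Stand-in for eye tracking: whatever the user looks at becomes
    the target of the next voice or gesture command."""
    def __init__(self):
        self.target = None

    def look_at(self, widget):
        self.target = widget

focus = GazeFocus()
focus.look_at(Widget("Send button"))
focus.target.press()        # look at a button, tell Siri "Press the button"

focus.look_at(Widget("Text area"))
focus.target.scroll(3)      # look at a text area, wave a hand to scroll it
```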

Note on iOS not having a Terminal: Apple most definitely does have terminal access to iOS devices for development and debugging purposes. End users don’t need it, but pro users do some advanced tasks by connecting to iTunes or something like iPhone Explorer and poking around inside app files.  Also, when a pro user wants to help a normal user there’s something called a “Provisioning Profile” that handles certain advanced settings (like reporting crashes, or just corporate IT policy settings).  So iOS is serviced by an array of external tools not accessible from within the OS / device itself.  This makes it harder to service and limits the power of the device since it can’t stand alone.

KitsuneStudios

“On the other hand, what facility for the creation of content could be lost? The answer should be: none at all. So the real question relates to factoring apart technical information about the system from the means by which content gets created. iOS does that.”

This is the point where I really have to disagree.

There’s an old saying in production: cheap, fast, or good - pick any two. The same can be said of features, complexity, and production speed.

Every new feature needs a control added to the interface. Keeping every control visible makes the interface cluttered and complex, while burying controls in menus increases the time required to reach them - even if the feature still saves time over doing the job without it.

Feature-rich creation software uses the extra buttons of the keyboard and mouse to trade complexity for speed. In exchange for learning many keyboard shortcuts and mouse commands, you can immediately access many commonly used commands without your eyes leaving the screen or your cursor leaving the active area. That production speed is as vital as a fast processor and lots of RAM.
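That trade is easy to see in miniature (Python; the bindings and menu layout are invented): the shortcut table reaches a command in one step that the menu tree reaches in several.

```python
# Invented bindings: the shortcut table trades learning effort for speed.
SHORTCUTS = {
    ("cmd", "d"): "duplicate layer",
    ("cmd", "shift", "e"): "merge visible layers",
}

# The same commands, reachable only by walking menus with eyes and cursor.
MENU_TREE = {
    "Layer": {
        "Duplicate Layer": "duplicate layer",
        "Merge Visible": "merge visible layers",
    },
}

def run_shortcut(keys):
    """One keystroke; eyes never leave the canvas."""
    return SHORTCUTS.get(tuple(keys))

def run_menu(path):
    """Same command, found by traversing the menu hierarchy."""
    node = MENU_TREE
    for step in path:
        node = node[step]
    return node

assert run_shortcut(["cmd", "d"]) == run_menu(["Layer", "Duplicate Layer"])
```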

The trick for MacOS X is not going to be the elimination of complexity, but the management of it. How can Apple make that complexity more intuitive? To that end, touch added to the MacOS X interface can help a great deal, since many commands are quicker and more intuitive with touch than with the keyboard or a pointing device. Actually eliminating the keyboard and mouse requires a system where we can duplicate that complexity on demand.

iJack

“Steve Jobs once said that desktops are like trucks. We need trucks to do certain kinds of heavy lifting, but they’ll become rare compared to passenger cars, used by most people.”

And yet – and without even a perceived ‘need’ – the Ford F-Series and the Chevy Silverado are the top two best-selling vehicles in the US. 

I’m not looking to build my biceps, so until one of Siri’s offspring can read my mind and do what I want, I’ll keep truckin’ with a mouse. And yes, the pun was intended.

Peter

“Steve Jobs once said that desktops are like trucks”

If you live in the US, almost everything around you was delivered by a truck. Delivery by truck is a central organizing feature of our economy. The dissonance occurs when we try to compare truck operators to knowledge workers who create the apps, content, and network infrastructure that power our mobile devices.

While the focus of innovation and growth has shifted to mobile devices, PCs remain a central organizing feature of the mobile economy. I think we’re still struggling to figure out what this means. People who study pointing devices for a living know that the mouse is an incredibly good (per Fitts’s Law) pointing device. It didn’t become so popular by accident. It beat out light pens, tablets, touch-screens, and trackpads due to its superior performance, compactness, and low cost. It’s only when you introduce mobility or multi-touch that the mouse loses out as a pointing device.
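Fitts’s Law, in its common Shannon formulation, makes the point quantitative: the time to acquire a target grows only logarithmically with the ratio of distance to target width, and well-tuned mice achieve a low slope b.

```latex
% Fitts's Law (Shannon formulation): mean time T to acquire a target
% of width W at distance D; a and b are empirical, device-dependent.
T = a + b \log_2\!\left(\frac{D}{W} + 1\right)
```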

Creating a touchscreen desktop without multi-touch software is lame, but that hasn’t stopped PC makers from trying - only to discover that it didn’t sell very well.

Enjoy!

Bazz

Look at what you said and re-explain it in simple terms! Then you will know what gibberish you write.
First, humans have not improved politically or emotionally in 2,000 years.
Julius Caesar was assassinated in 44 BCE just as JFK was in ’63, and the dumbest president - dumber than Ford, who, as LBJ said, could not walk and chew gum - let criminals stuff the USA in the last ten years. And you paid $2,500,000 with your taxes last year to keep him happily stupid in retirement.

I did a communications course: we say stupid things and expect others to understand!

But you expect computers to give us the correct answer to stupid misunderstood questions.

A reliance on computers in the flying industry results in the autopilot throwing in the towel and letting the pilot crash, because finding the cause needs hours, not seconds.

At Three Mile Island, engineers could not envisage steam pressure pushing water out of the emergency dump reservoir onto the ground against gravity, and in the minutes that this was occurring they believed there was no problem, as the meters read full, not empty!

BUT you, John Martellaro, think we are smarter now and know everything! OR at least Apple and Siri do.
iPox is just an inanimate Rainman that will lead us all to Kmart - oops, App Store - for everything, and we will be as upset as Google is at “ogooglebar.” (I like Apple, but I’m not one-eyed.)
John Martellaro, remember: everything has an ogooglebar point.
