Oh, What We Could do With a Mac That’s 1,000 Times Faster

John Martellaro's Blog

Over the decades, personal computers have made enormous gains in speed, but lately the progress has slowed. Meanwhile, the maintenance burden on customers has only grown. That's why customers have moved briskly to the tablet. In a sense, the PC industry failed its original vision, and customers moved on. Now what?

What could we do with a Mac that's a thousand times faster than Macs today? We may never get a chance to find out. The vision of a very, very fast personal computer that can do the things we've always dreamed about is fading fast.

The reason the dream is fading is that while it takes hundreds of people to design a Mac and hundreds more to develop and maintain OS X, market economics dictate that only a small group of developers can make a living selling Mac software.

The Mac App Store has made the problem worse. There was a time when we weren't too unhappy to pay several hundred dollars for the considerable capabilities of the MS Office suite. Nowadays, if an app in the M.A.S. costs more than US$10, we blink. There's no money to be made by exploiting the hardware, only by exploiting the customer.

On the other side of the coin, our Macs are increasingly complex yet enduringly stupid. For example, there's no glimmer of an AI agent in OS X that can monitor the S.M.A.R.T. status of the internal drive. As a result, in the year 2013, what generally happens is that a customer hears the internal drive start to make noises, and soon after that, 2,000 family photos are lost forever. (Unless an automated backup system was in use.)
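To be clear, the raw ingredients for that much are already lying around. Here is a minimal sketch, in Python, of the kind of drive watchdog OS X could have shipped with. It leans on the stock diskutil tool, which prints a "SMART Status" line for drives that support it; the "disk0" identifier and the alert logic are placeholders, not anything Apple actually ships.

    #!/usr/bin/env python3
    # A minimal sketch (not anything Apple ships) of a drive watchdog,
    # using the stock macOS `diskutil` tool, which prints a "SMART Status"
    # line for drives that support it. "disk0" is a placeholder identifier.
    import subprocess

    def smart_status(device="disk0"):
        """Return the SMART status string that diskutil reports for a device."""
        info = subprocess.run(["diskutil", "info", device],
                              capture_output=True, text=True, check=True).stdout
        for line in info.splitlines():
            if "SMART Status" in line:
                return line.split(":", 1)[1].strip()
        return "Unknown"

    if __name__ == "__main__":
        status = smart_status()
        if status not in ("Verified", "Not Supported"):
            # A real agent would speak up, kick off a backup, and explain the risk.
            print("Warning: internal drive reports SMART status '%s'. Back up now." % status)
        else:
            print("SMART status: " + status)

Run hourly by launchd, even a crude check like that would speak up before the noises start. The hard part isn't the check; it's that nobody in the current Mac software economy is paid to care.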

In general, the burden of PCs & Macs became too large. Operating systems with 50 million lines of code did a lot of graphics and system management, but not much in the way of interacting with the customer -- except through cryptic error messages, system logs and mysterious failures. The dream of our PCs having an intelligent conversation with us about anything vanished, replaced by the headaches of corrupted disk directories and malware.

And so, modern customers said to themselves, "We're tired of viruses. We're tired of the care and feeding of rotating magnetic platters. We're tired of corrupted directories. We're tired of lost files, noisy fans, lost software licenses, misbehaving printer drivers, antivirus updates and endless housekeeping chores."

And what we've received in return is a tablet with a fraction of the power our desktops have. iPads are fabulous, but it's like turning back the speed clock 10 years.

What Might Have Been?

If, say, Apple had followed the most authentic dream of personal computing and if our copyright system hadn't gone berserk, things might have been different in an alternate timeline. Here are some examples of what I can imagine on a 500-teraflop computer with the corresponding software -- software, of course, that major PC makers could have developed, but never did.

1. Today, we have to wait for someone else to turn our favorite books into movies. Much of the time, the work is unsatisfactory.

"Good morning. I'd like you to work on a movie. Take 'Song of Scarabaeus' by Sara Creasy, the Kindle edition that you already have, and convert it to a movie for me. Three hours max. Use the standard starship interior templates. Use the standard planetary terrain templates. Use the author's description of the lead characters, Edie and Finn. If you have any questions about anything else, just ask. Can you have that ready by tomorrow?"

2. Today, we do manual Google searches and hope that what we're looking for comes up on the first page. We can do better.

"Good morning. My sister-in-law is having a health issue with X. I'd like you to scan the Internet and look at all the major medical research journals posted online -- say, the top 5,000. Download the genome I have for her on file, analyze the most definitive findings, and tell me what's being said about the best treatments for X."

3. Today, we depend on primitive firewalls and malware detection software to work blindly, without context, to protect our computers. What if the AI agent in our Mac could carry on a conversation with us? What if that AI agent had a visible human form? Why is it that only video games -- violent, death-filled Xbox games, for example -- get the best renderings and visual representations of humans? By that I mean that these games have immersive action, realistic scenes, and character dialog. But my Mac? It just stares at me stupidly with a static desktop. What if my Mac had a name and an active personality?

"Good morning, Cynthia. What's happening these days?" -- Cynthia: "Last night, there were five denial of service attacks. I blocked them. You got 112 spam emails, as defined by my current blacklist, but one was from a developer you wrote to on Monday, so I figured you'd want to see that. Your flash drive wasn't performing very well, so I remapped some bad blocks. Oh, and you have a dentist appointment at 2 pm today. Don't forget." (A rough sketch of that spam-handling rule follows the examples below.)

4. Education is an area where we could really use some computer assistance. AI agents that can teach specific things in an interactive way are always being worked on in university settings, but economics and special interests seem to have kept a major AI effort away from home schooling.

"Good morning, Cynthia. My son Paul is home from school today. It's a snow day. He's been having problems with factoring polynomials. Can you work with him on that? And when you're done, as an incentive, maybe you can work out some fun games to play. But no first-person shooting today, okay? And make sure that patent for his genome gets filed."

Cynthia, in an attractive but modest human teacher form, goes on to interact with Paul, as a holographic representation, writing equations in the air, demonstrating first and then watching Paul work on simple examples. Later, she'll report his progress.
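To be fair, small pieces of these scenarios are trivially expressible even today. The spam-handling rule Cynthia applies in example 3 -- block senders on the blacklist, but always pass mail from anyone you've recently written to -- fits in a dozen lines. Here is a hypothetical Python sketch; the message format, addresses, and set names are all made up for illustration.

    # Hypothetical sketch of Cynthia's spam rule: drop blacklisted senders,
    # but always keep mail from anyone the user has recently written to.
    def triage(inbox, blacklist, recent_correspondents):
        """Split messages into (keep, spam) lists by sender address."""
        keep, spam = [], []
        for msg in inbox:
            sender = msg["from"].lower()
            if sender in recent_correspondents:
                keep.append(msg)        # e.g. the developer you wrote to on Monday
            elif sender in blacklist:
                spam.append(msg)        # otherwise the current blacklist wins
            else:
                keep.append(msg)
        return keep, spam

    # Made-up example data.
    inbox = [
        {"from": "dev@example.com",   "subject": "Re: your bug report"},
        {"from": "lotto@example.com", "subject": "You won!"},
    ]
    keep, spam = triage(inbox,
                        blacklist={"dev@example.com", "lotto@example.com"},
                        recent_correspondents={"dev@example.com"})
    print(len(keep), "kept,", len(spam), "filtered")   # -> 1 kept, 1 filtered

The hard part isn't the rule; it's the judgment, the conversation, and the trust around it, which is exactly where a corporate-scale effort comes in.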

All of these examples require special effort by a corporation, not individuals. To carry the argument to humorous extremes, no developer living in his mom's basement, charging $5 for a Mac app, is going to achieve this level of software development. We're stuck. But is this the kind of effort and vision that Apple should be about?

Steve Jobs thought so. Check out his own vision of how we should be using the power of a computer to interact with the human, not just sit there and do Visicalc number crunching. The video is worth watching from the start, but if you wish, jump ahead to the 11m45s mark where Mr. Jobs notes that "Visicalc runs fast enough." What's next? His vision for using the hardware horsepower of the computer to serve the user is spot on.

All this may never come to pass, however. We're too focused on our tablets now. The Post-PC era is in full swing. There's no more money, no more growth, no more opportunity in the PC market unless a company like Apple renews the original vision. Our tablets are devices of limited CPU and graphics power. It may take another 10 to 20 years before our tablets are able to do the things I've fantasized about above. If ever.

That's why the Mac Pro must continue as Apple's commitment and inroad to this enduring vision.

Right now, the fad in mobility is to exploit the user for economic gain, like Facebook Home, instead of exploiting home hardware and software for the user's gain. Perhaps, someday soon, the vision of Mr. Jobs can prevail again at Apple.

Oh, what we could all do with a Mac that's a thousand times faster.

______________________

Song of Scarabaeus, © 2010 Sara Creasy.

Futuristic images 1, 2, 3 via Shutterstock.


20 Comments

Lancashire-Witch

Almost forty years ago, a large IBM/370 seemed perfectly adequate for running the payroll and sales statistics systems - we even managed to do some quite intensive calculations for production planning, sales forecasting and inventory management. (Anyone remember COPICS?)

For most of us, most of the time, most of what we wanted to do was possible with the computers we had.
Visicalc did indeed run fast enough.

Today, if all you want to do is email, browse the web, and socially interact with friends and relatives through Skype, Twitter and Facebook, how much (more) computing horsepower is needed? Even adding in occasional “Office” tasks doesn’t require a faster computer. Music and photo management work well.

We are in the age of computing for the masses - and it isn’t real “Computing”.

ipaqrat

I have similar philosophical debates with my coworkers all the time: “Why can’t the software developer do X?”  “Why is this so difficult?”  Here in our own IT paddock, in the company only of other technosociological cyberfetishists (geeks), we are like hogs in our favorite slop. We happen to know which corn cobs complement fetid Samsungs and worm-eaten Apples. 

It’s easy for us to forget that the utter mass of people, with their wildly divergent tastes, tolerance and skill sets, all average out to steadfast mediocrity. It’s not the cost of nice software that limits sales (and therefore development) potential. Thing is, most folks don’t care about nice software, at any price, even if they are able to distinguish it.  Folks that care about MS Office’s arguable superiority will acquire it, whatever the cost.

You may as well lament that Toyota outsells Porsche. Though in an odd turnabout, Apple now feeds bland, bleached bread to the proletariat in the amphitheater where some chick once threw a hammer at the movie screen (how rude!).  And you can’t take a poll about it here on Mac Observer because the proles ain’t here.

Dave

I have had a program on my old G4 tower for years which monitored the SMART status of the HD and alerted me to problems.

John Martellaro

Dave: Sure. I’m running one on my Mac Pro.  SMARTReporter.  That functionality should be part of OS X.

http://www.macobserver.com/tmo/review/failing-drive-smartreporter-is-a-great-diagnostic-utility

Paul Goodwin

Sad but true article. Computers have been fast enough for years. No wonder the market is down; not many people need anything faster, because there’s nothing startling enough in new technology to make the need come back. Increasing Web page complexity drove computer speeds for a while, but its complexity and functional growth have slowed. I think part of the problem with the high-dollar software, too, is that it almost always comes with an extremely steep learning curve and too much un-intuitiveness. The I-want-it-now mentality doesn’t help either. And the money investment plus the time investment for effective and productive work on the high-dollar software is just too much. Maybe it’s time to put AI to work figuring out what’s intuitive and what isn’t. Pick the brains of some human lab rats for some months, and use AI to create easily learned software. The application would present the user with some basic questions, determine if they were left or right brained, and put each user into a bucket. And in that bucket are menus and actions, and yes, maybe verbal communication tailored to his or her way of thinking and using a computer. UI adapter software, layered over the core function of the software.

skipaq

For what I need, my five-year-old iMac is plenty fast. I could still make do with my G5 tower, but my forty-year-old son uses it. The way things are going, I may never need to buy another Mac. The type of computer John speaks of is something I may not live long enough to see, let alone drool over.

Tony

Great article! Now that we can stream and download endless video, use FaceTime, and have our phones make workflow nearly seamless, a threshold of sorts has been reached.

We need to become restless again.

John Martellaro

Tony:  Oooh, I like that word restless.  It didn’t appear in the article. Shame on me.  Thanks for summing up a big idea in one word.

Hagen

I find two things have interfered with that vision of the future:

1) The fuzzy “human” logic has turned out to be far harder to simulate/build than we figured. We base most of our higher-end thought processes—including natural speech and context—on that. Until that can be done with a computer smaller than Watson, there’s too big a roadblock on the Truly Great Things.

2) We don’t trust the computers. Everything you mention could be done as long as we’re willing to rely on the decisions and directives that programs could come up with. Instead, so many programs and the companies behind them have made us wary of anything offered to solve such problems, wondering what other ulterior motives those programs/companies might have. (Example: After 3-4 spectacular mis-directions in car navigation, I will not trust any company to tell me how to get from here to there. I use their maps, and figure it out myself.)

Much like Tony, I’d say the Truly Great Things will require people with the optimism to dream of them, the intelligence and wisdom to make them happen, and the morals not to chain them to some underhanded selfish desire for power/profit.

iJack

Hang on a minute, John.  Aren’t you one of the TMO fellas that hollered, “Yay!  The iOSification of the Mac is coming?”

I’ve been waiting for the Mac you described for nearly 30 years, only to watch it turn into a diminished appliance, maybe even a toy. 

It pisses me off.

JonGl

I don’t like your version of the future. Let’s put it in these words: what purpose do people serve in your future? Computer AI does everything, and we do nothing? Even the creative processes of creating a movie and teaching your child are taken over by artificial intelligence. Boring. What makes human human is our quirks—mediocrity a commentator called it—but it’s not really just mediocrity. It’s what makes us human—flaws and all. AI will always have one problem—it won’t be truly human. Sure, movies and computer games like to make AI seem better, smarter, cuter, funnier, more intelligent than humans, but the reality is, no matter what, it won’t be human. It will be machine—and it doesn’t matter how good it is—it won’t be human. You may like that, but I don’t, and I think that if you really got what you wanted, you wouldn’t like it either. Let the machines remain the servants, doing nothing more than what I actively tell them to do, and not the masters, telling me what to do. The more computers do with less input from us, the less human we are, but not the more human they are.

I’m sure what I’m saying sounds all garbled and messed up, but maybe somebody here will understand…

Andhaka

@JonGl: yeah, I get it. I wanted to say the same exact thing. A future where movies are commissioned to my computer to be “created” without even some input from a human mind is not a future I’d like to live in.

I understand the ease of use, but there’s a threshold that should not be crossed, where too easy means we don’t know anymore what we are doing. What happened to a future with people with skills and imagination? Now we tweet our lunch and bowel movements and accept desktops with no upgradable RAM or HDD because “it’s difficult!! Mommy!!!”.

Come on!! Computers are tools!! People should be the ones using them in proactive ways!! Not only ordering them around like some sort of do-it-all genie.

Cheers

iJack

“Boring. What makes human human is our quirks—mediocrity a commentator called it—but it’s not really just mediocrity. It’s what makes us human—flaws and all.”

Bah!  Simpering rubbish.  Look where our quirks have got us in the last 10 millennia.  We’re still slaughtering each other in the tens of thousands, while millions of others are dying of disease and starvation.  Our political systems groan under the weight of their own stupidity.  We’re consuming the life out of our planet, and doing nothing to fix it.

In any event, who said a computer is supposed to be “truly human?”  It’s a machine; a tool, and I want my tools to be as efficient and as easy to use as possible.  I don’t want a machine with “quirks.”

The thinking in your post is reactionary, Luddite nonsense.  Get a grip.

nich

My future computer will include portions from most of these comments. I like the intelligence and alternate interface of John’s article, but it should not be human. It should be smart and responsive and adaptable, but not human! Humans make way too many mistakes! I’m hoping that the interfaces of the future will be smart and adaptable to the individual. 
  For instance, my computer use is probably extremely different from most people visiting here. I am currently combining the VoiceOver screen reader with verbal commands through Speakable Items, and using the dictation service, with a healthy dose of voice-activated AppleScripts to aid me in writing this comment. I have every English voice installed, and I use three or four of them actively all the time. Each one describes different things to me on the screen. Future computers will need the extra power, because people like me will continue to bring all of the pieces together into a new interface. I have always liked Apple products because they supply so many alternative pieces as part of the unit. 
  So, a computer that is human? No thanks! A computer that can relate to humans, definitely! grin

iJack

@nich Curious. How do you “use three or four of them actively all the time?”  My Mac with the latest OS only allows me one at a time.

Tony

iJack said: Bah!  Simpering rubbish.  Look where our quirks have got us in the last 10 millennia.  We’re still slaughtering each other in the tens of thousands, while millions of others are dying of disease and starvation. 

I would beg to differ, unless you have an unusually broad definition of “quirks.” What you are describing is by design, aided by increasingly adept technology. The lack of a moral and ethical framework for technological development and deployment can be factored in as being just as much at fault as the failed ethics that drive policies and said destruction.

Here’s a juicy quote: “every tool can be a weapon, if you hold it right” - Ani DiFranco

Tony

An out-of-control AI and loss of humanity have been fodder for sci-fi forever. Interesting chronology here: http://en.m.wikipedia.org/wiki/Cybernetic_revolt#section_4

My field is appropriate technology, largely around construction systems, and some evidence of what worries JonGl has already come to pass. Turning so many different tools into “guns” changed construction, and not all for the better. Pulling nails out of old studs was a nearly therapeutic - if time-consuming - pastime. It demonstrated respect for materials and skilled labor. With nail guns, studs are destroyed as they are attached and impossible to reuse. The entire construction site has become more violent, and the impact upon the earth, the workers, and the residents has deteriorated.

What parallel in computing? Not sure - on one hand it’s as if I’m some kind of techno savant to my non-computing friends; on the other, my runaway needs for space for increasingly large digital files and the 500+ apps I have is more than a little disconcerting. I long for two things more than computing power: 1) stable, fast, infinitely expandable storage, and 2) quantum leaps forward in power storage and generation capacity. Those two will directly serve my productivity, with some advances in computing power helpful. But better UIs will do more for me than say doubling current processor power. I think that is the leap tablets and phones have shown us is desired: integration of our computing needs.

Lastly, I am keenly aware of the concerns raised around technology. It is dehumanizing in the most basic sense. We are more disconnected from nature - our true life support system - than ever before, and the curve is steepening. I like the movie by Wim Wenders - Until The End of the World.
http://www.imdb.com/title/tt0101458/?ref_=sr_1
In a truly cautionary tale, with tech perhaps on par with John’s musings, we see not only what I feel was the mobile device that inspired the iPhone, but also what tech addiction can do unhindered… and the hope on the other side.

My motto: just because you can, doesn’t mean you should.

wab95

John:

Another thoughtful article, in which you’ve touched on disparate issues under the one theme of a faster and more capable computing platform (I think you’re describing more than just the computer hardware - see below). It is these disparate issues, in my opinion, to which readers are responding with mixed reaction.

I see three distinct discussion points in your piece: 1) speed (or simply raw horsepower, storage and functional capacity); 2) OS functionality (with an AI at its core); and 3) the nature of the human/computer interface (and how human or not it should be). While all of these can be housed under the umbrella of the computer’s long march from calculator to companion, they are sufficiently independent that each can be a standalone dissertation (don’t worry, that’s not where I’m headed).

Speed/horsepower: a huge factor in limiting how far we have come with this is our own behaviour, or the human factor, less so than the available technology. To an extent, these have been shaped by both the form factor of the original PC box and our initial interaction with it. The limits of our imaginations, and what we now culturally accept that a computer should or should not do, have confined those interactions to the limited tasks that make up most computing experience today: word processing, email, web browsing. For others it might expand to video and music, production and consumption alike. Then there are a few fringe uses (e.g. science) that are too few to comprise a driving force. By and large, we have limited our computers to communication and consumption devices. Those functions, we have now discovered, are easier to do on a less expensive device - tablets (and smartphones). Little wonder, then, that we see a substantial drop off in PC sales. Consumers are simply following basic microeconomic theory. Does it diminish the benefit/cost ratio of developing that more powerful PC (macroeconomics)? Yes, substantially more so as sales fall off. The evolutionary driver here is, as it is throughout the animal kingdom, behaviour. If we are using computers to do X, then we should not decry our creativity in finding a more efficient, cost effective way of doing X. Behaviour dictates the form of the computer; the form of the computer does not dictate human behaviour.

OS/functionality: The drivers behind this, to my modest understanding, are complex and multifactorial. You yourself have discussed many of these in past blogs. A substantial amount of research in universities, government-sponsored entities (e.g. the military) and, to a more limited extent, the private sector has gone, and continues to go, into this. One standard that has consistently been applied to the designation ‘intelligence’ has been the Turing test, that is, the capacity of a machine, in this case a computer algorithm, to exhibit behaviour comparable to, or indistinguishable from, human behaviour. The objective here is neither to create an artificial human nor to supplant humans, but to adapt computers to us so that we can interact with them based on millions of years of our own, hard-wired, behavioural algorithms. In this way, learning curves are flattened and productivity is increased. Substantially. These efforts have been repeatedly tripped up by our rapidly developing understanding of how our brains work, taking us, time and again, back to the proverbial drawing board. Much of this discussion, involving such esoterica as neural nets vs linear circuits, lies well beyond the scope of this comment. Suffice it to say, there are numerous highly intelligent human brains working to solve the problem of replicating Nature’s solution to learning, problem solving, social interaction, pattern recognition and the like.

Nature of the human/computer interface: As a species, we are consistent in our discomfort with profound change. The awkwardness and even frank revulsion in dealing with something that is trying to simulate a human but is not quite human has been described in the primary literature, and can be researched by any who care to read it. In short, this is likely to be a gradual and evolutionary development of an interface that responds very human-like, for the reason discussed above, namely adapting the computer interface to us rather than us to it, as we currently do, in order to maximise our productivity and ease of use, with likely the option to attenuate the degree of human simulation in that interface to individual comfort level and taste. Authors such as Jack McDevitt have already described a future in which the AI assumes human form and interacts with us as would another person, and can assume different forms according to the situation. A generation beyond ours will have evolved to exploit this, in much the same way that it took a generation later than the one that grew up with the horse and carriage to ride comfortably in jet-powered aircraft. It will happen because we need it and wish it so. If not us, then posterity.

Okay, maybe it was a dissertation, but a short one.

nich

@ iJack: a bit off-topic,

  The VoiceOver screen reader allows use of up to five voices simultaneously. I use 3: Alex, Serena, and Tessa. Alex is used as the main voice. The additional voices read things to me other than content. For example, Alex will read the label “VoiceOver on”, then Serena will say “radio button”, then Tessa will say “selected.”
  I also have my system voice set to Alex and sometimes I have alert dialogs spoken by Tessa. Her custom message is phonetically warped to sound like a female troll in WoW. “Yah mon, look here at dis!”
  I also use multiple voices in my own AppleScripts, which I make “speakable” as well.
  Back to topic, sometimes some of the future UI pieces are in the computers that we already have. Likewise, sometimes a future interface is already in front of us. grin
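  If anyone wants to play with the layered voices outside of VoiceOver, here is a rough Python sketch that shells out to the stock say command. It is only an illustration, not how VoiceOver itself works, and the voices named (Alex, Serena, Tessa) have to be installed in System Preferences first.

    #!/usr/bin/env python3
    # Rough illustration only -- not how VoiceOver itself works. Uses the
    # stock macOS `say` command with its -v (voice) flag; the voices named
    # below must already be installed on the system.
    import subprocess

    def speak(text, voice="Alex"):
        """Speak a string aloud with the given macOS voice."""
        subprocess.run(["say", "-v", voice, text], check=True)

    # Different voices for different kinds of information, as described above.
    speak("VoiceOver on", voice="Alex")     # the content or label
    speak("radio button", voice="Serena")   # the control type
    speak("selected", voice="Tessa")        # the state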

Bazz

My favorite bit of information theory trivia is Wolfram’s last (?) two books—one a small 100-page book on the mathematical algorithm to find “the meaning of life and everything” using the automata theory that his Mathematica is based on. The program will work, but running the universe program takes as long as the universe is old, or thereabouts - the reason is independent variables.* He was upset, but his next book was also on “the meaning of life and everything” - 600 pages of crap.

JM, become a fiction writer, or stay in the real world where it’s impossible to be perfect or have perfect knowledge!!

* And your Mac needs to be 1,000,000,000,000 times faster; otherwise the answer comes too late—somewhere in 2023!
