I Have Seen the Future of UI, and It Is Gaze

The Back Page

Last week’s CES was a hoot and a half, though it included a realization about the host city, Las Vegas. Whilst walking to the monorail with Dave Hamilton and Jeff Gamet, it occurred to me that Vegas is a really weird combination of shouldn’t be and isn’t really.

Be that as it may, CES was a great time and we saw some really cool stuff, and one of the coolest things isn’t available for any of Apple’s products or platforms—at least not yet. Tobii, a Swedish company, was demonstrating Gaze, eye tracking technology the company is working to position as a new paradigm for user interfaces on computers.

EyeAsteroids—Destroying Asteroids with Your Eyes!

According to the company, we’re still two years out from Gaze appearing on any shipping products, but they were demonstrating several different ways to use the technology, and all of them were quite interesting.

The idea behind Gaze is to identify your eyes and then figure out what you are looking at on a computer display. Gaze then infers what you want to do based on where you’re looking and acts accordingly.

For instance, we saw a cover flow-style list of windows. When we looked to the left or right of that list, Gaze scrolled through it to show us other items. If we focused on a particular document or window in that list for more than a moment, Gaze opened it for us, allowing us to use a keyboard and mouse to manipulate it. If we then focused for a moment at the bottom of the screen, the file was dropped back into the cover flow stream, allowing us to find and open another document.

Another example was a game called EyeAsteroids, which you can see above. In this game, you are the defender of planet Earth, with the job of stopping a bunch of asteroids hellbent on destroying our home. Look at a specific asteroid, and an energy beam launches from Earth to deflect or destroy it. It was super cool, and if ever a game could make you feel like a superhero of some sort, EyeAsteroids was it.

Yet another example included using Gaze in combination with the spacebar. Holding the spacebar down activated Gaze. We could then use our eyes to find a new document, and when we let go of the spacebar, it opened and we were free to use the traditional mouse and keyboard to manipulate it.
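To give a rough sense of how this kind of dwell-based selection might work under the hood, here’s a small illustrative sketch in Python. To be clear, this is my own guess at the general idea, not Tobii’s actual software or API: watch which on-screen target the gaze point is resting on, and if it stays put longer than some dwell threshold, treat that as a selection; a held-down spacebar simply gates whether the gaze input is acted on at all.

```python
import time

DWELL_SECONDS = 0.6  # assumed threshold: how long a gaze must rest on a target to "click" it


class DwellSelector:
    """Toy model of dwell-based gaze selection: a gaze that rests on one
    target for DWELL_SECONDS triggers an action, such as opening a document."""

    def __init__(self, dwell_seconds=DWELL_SECONDS):
        self.dwell_seconds = dwell_seconds
        self.current_target = None
        self.dwell_start = None

    def update(self, target, gaze_enabled, now=None):
        """Feed in the target currently under the gaze point (or None).
        Returns the target to activate, or None if nothing happens yet."""
        now = time.monotonic() if now is None else now
        if not gaze_enabled or target is None:
            # Spacebar released, or looking at empty space: reset the timer.
            self.current_target = None
            self.dwell_start = None
            return None
        if target != self.current_target:
            # Gaze moved to a new target: restart the dwell timer.
            self.current_target = target
            self.dwell_start = now
            return None
        if now - self.dwell_start >= self.dwell_seconds:
            activated = self.current_target
            self.current_target = None  # avoid re-triggering on the same dwell
            self.dwell_start = None
            return activated
        return None


# Example: the gaze rests on "report.doc" long enough while the spacebar is held.
selector = DwellSelector()
selector.update("report.doc", gaze_enabled=True, now=0.0)
print(selector.update("report.doc", gaze_enabled=True, now=0.7))  # -> "report.doc"
```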

Here’s a demo video of Gaze working with an early version of Windows 8:

All of the demos we were shown worked really well, but more importantly, they were very intuitive. The eye tracking was astonishingly accurate, to my mind, and I think it could usher in some new concepts in the world of user interface.

Mind you, I’m not ready to toss out my mouse (or trackpad) and keyboard just yet, nor do I think concepts like Gaze could supplant touch interfaces on tablets. As an additional way to interact with computing devices, however, it has enormous potential, and that’s before even considering its uses in vertical markets or for people with physical disabilities, incapacitating diseases, or even temporary mobility issues such as having to wear a full body cast. Those uses are obvious, and it’s terrific that Tobii (and the industry as a whole) is making advances in this field.

It’s what the legions of developers and hardware makers out there might come up with using this technology that has me the most excited. Just as Apple had no idea how many cool things developers would come up with for the iPhone and iPad when it opened those devices to outsiders, there’s no way Tobii can anticipate all the ways in which future licensees might use Gaze.

Between voice controls (Siri), touch interface, and a technology like Gaze, we are truly entering into science fiction land when it comes to our devices. Where we’ll be in five years with all of these technologies is difficult to imagine today.

Today, Gaze is only up and running on Windows, and only on full PCs. It currently utilizes dedicated processing, but the company told me that in the future it could also use on-board processing from a PC’s motherboard. As the company perfects the technology and chip design makes further strides, it will eventually be able to operate without placing a heavy CPU tax on our computers and other devices.

I did, of course, ask if the company was working with Apple. A company spokesperson told us only that, while she couldn’t name names, Tobii has talked with every major company in the industry. The last time I checked, “every major company in the industry” would include Apple, so the possibility that Apple could license the technology is just that, a possibility.

I certainly hope this is the case, too. We are moving at breakneck speed into a land of new paradigms and ideas about computing. Voice is important, touch is important, and I believe that even keyboards and trackpads/mice will continue to remain important for some uses for the foreseeable future—but our eyes could also be an important element in the way we use, control, and manipulate those devices, and I find it very exciting.

Comments

geoduck

we are truly entering into science fiction land when it comes to our devices

Yes, this is exceedingly cool. They’ve had things like this for handicapped people for some time (Stephen Hawking uses one), but they are nowhere near as slick as this. EyeAsteroids is just a start. Imagine an FPS with this technology, or a Myst-style RPG where you go where you look. Real game developers could come up with a thousand other ideas, each better than the ones I’ve thought of in two minutes. Looking at something to select it is intuitive. Intuitive is what Apple does best. It should be a natural fit.

riscx

This was actually attempted by a company (can’t remember the name; it was in a Macworld issue) back in 1985 for the Mac, to replace the mouse. What they found was that it basically didn’t work for users. Response times were way too slow, and it became increasingly frustrating to “guide” cursors, etc.

Some other observations were that your eyes are not really “reliable” due to many factors, such as constantly scanning off target, focus fatigue, and other eye strain issues.

Would it be cool? Definitely. Would it be reliable in real-world use? Probably not.

geoduck

Some other observations were that your eyes are not really “reliable” due to many factors, such as constantly scanning off target, focus fatigue, and other eye strain issues.

It would be interesting to study the error rate factored against time of day. Would there be more staring off into space (and opening documents) late in the afternoon? How about the effect of glazed “not had my morning coffee yet” eyes?

It could be a fun study.

CudaBoy

Vindication!!! This forum posted the question about a year ago as to what people thought a future UI would look like or entail. I stated that eye tracking would be a no-brainer, a la a jet pilot’s monocle or the like. I got lambasted by the non-imaginative on this site. Nice to know I was correct and all the naysayers were dead wrong. He who doesn’t think outside the box will stay inside the box, bankrupt.
Strikes me there should be a number of ways to track a pupil and mate the movement to a fixed matrix. Not rocket science; actually, it’s jet science, because they use ocular tracking all day long in fighters.

iJack

Geeze, CudaBoy!  Don’t break your arm patting yourself on the back.

CudaBoy

Geeze, CudaBoy!  Don’t break your arm patting yourself on the back.

Yeah, I’m thin-skinned when insulted. That’s why I kill forum drones ’n’ clones for free; oddly, my arm doesn’t even hurt. Now back off, iJ, you’re one of my heroes. d:D

D

Small correction: Swedish company, not German. See www.tobii.com

Bryan Chaffin

Oops! Thanks, D. I corrected it. :)
