Asteroids barreled towards Earth from every direction, and my only defense was my eyes. I looked at one asteroid. I looked at another. Boom. Boom. Lasers from Earth had blown them to smithereens. And I didn’t lift a finger.
At CES 2014, a dozen or more companies are vying to track your arms, legs, and even your eyeballs. This demo was from Tobii, an eye-tracking technology company from Stockholm that wants to change the way we read, drive, and game. The company makes Kinect-like sensor bars, each containing two or three cameras, that sit below your laptop’s screen. After two minutes of calibration (which involved staring at several dots as they darted around), I could pinpoint any spot on the screen with my eyes.
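That calibration step isn’t magic, and the basic idea is simple enough to sketch: while the cameras watch your eyes, you stare at dots whose on-screen positions are known, and the software fits a mapping from raw gaze readings to screen coordinates. Here’s a rough illustration in Python; the quadratic model, the function names, and the fake sensor data are my own assumptions, not Tobii’s actual software:

```python
# Illustrative sketch of dot-based gaze calibration: fit a mapping from
# raw gaze readings to known on-screen dot positions. This is NOT Tobii's
# algorithm; the quadratic model and all names here are assumptions.
import numpy as np

def design_matrix(gaze_xy):
    """Quadratic polynomial features of raw gaze estimates (x, y)."""
    x, y = gaze_xy[:, 0], gaze_xy[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

def fit_calibration(raw_gaze, dot_positions):
    """Least-squares map from raw gaze samples to the dots' known positions."""
    coeffs, *_ = np.linalg.lstsq(design_matrix(raw_gaze), dot_positions, rcond=None)
    return coeffs  # shape (6, 2): one column of weights per screen axis

def to_screen(raw_gaze, coeffs):
    """Apply the fitted map to fresh gaze samples."""
    return design_matrix(raw_gaze) @ coeffs

# Calibration: nine dots dart around the screen while the tracker records
# raw gaze estimates (simulated here as the true positions plus noise).
dots = np.array([[x, y] for x in (0.1, 0.5, 0.9) for y in (0.1, 0.5, 0.9)])
raw = dots + np.random.normal(0.0, 0.02, dots.shape)
coeffs = fit_calibration(raw, dots)
print(to_screen(raw, coeffs))  # calibrated points, close to the dot positions
```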
Tobii’s “Eye Asteroids” demo was a blast, and it was easier to control with my eyes than with an analog stick. Navigating Windows 8 with my eyes, on the other hand, wasn’t as much fun. To select an app, I had to not only look at it, but also press a keyboard button with my hand. I quickly learned that eye tracking doesn’t solve everything. In fact, the dozens of eye- and gesture-tracking companies at CES seem to be creating more problems than they’re solving.
But a few get it right. Now please, keep your hands and feet inside the car.
For better or worse, we’re still under the spell of Tom Cruise in Minority Report, who effortlessly moved through time and space solving crime using only his hands. Nearly every interface company I spoke with at CES mentioned the film — its iconic interface inspiring everything from car dashboards to operating systems you control with your fingers sprawled out longingly in midair. But hoping for Minority Report interfaces is a bit of a fantasy, says Dale Herigstad, whom Steven Spielberg hired to help design the film’s futuristic interfaces.
Naturally, Herigstad is showing off his own product at CES this year. It’s called InAir, a TV interface that layers information like sports scores and IMDB over what you’re watching. You’d expect Herigstad’s interface to be operated with your hands, but he chose phones instead. His team started with Kinect, but realized that people are most often parked on the couch while watching TV and don’t want to wave their hands every time they change the channel. "If you look at what Tom Cruise is doing, it’s rather technical," he says. "A TV audience sometimes wants to have no interface at all."

Coming from the guy who helped design the PreCrime division’s iconic computer interface, that was a little disappointing to hear. But he has a point: sometimes Minority Report holds us back more than it helps. I tried out HAL.TV, a new gestural interface for TVs that lets you wave your hand to open apps, switch channels, and turn up the volume. It worked poorly, and after waving my hand over and over to adjust the volume and flip through channels, I was exhausted. Not every device needs its own glorified gestural interface.
But some do. One of the Oculus Rift’s most annoying limitations is that it simulates your vision but leaves your hands out of the virtual world. When you strap SoftKinetic’s DepthSense camera kit to your Oculus Rift, however, manipulating 3D objects in space becomes a reality. I tested out SoftKinetic with a very rudimentary building-block program that let me grab blocks (with both hands at once, even!) and stack them in a 3D grid world. I felt like Johnny Mnemonic moving effortlessly through cyberspace. I couldn’t feel the blocks I was touching, of course, but I think that’s an acceptable limitation, at least for now. Within a couple years, this is how we’ll all play Minecraft.

Without a multicamera sensor on your head or on your computer, gesture and movement recognition is much more difficult. But always-on cameras suck power and aren’t feasible for mobile devices. While other companies fooled around with cameras and parallax, San Francisco’s Elliptic Labs has been busy tweaking its ultrasonic wave technology. Like the comic-book hero Daredevil, Elliptic Labs’ technology relies on echolocation: tiny speakers constantly fire off ultrasonic waves that bounce off your hand and back to your phone, tablet, or computer’s microphone. The company’s software then determines the location of your hand (or face) in space. You can scroll, switch apps, change songs, and answer calls with a wave of your hand. Most incredibly, the technology works even if your hand isn’t hovering over your screen, the way you might hover to answer a call on a Galaxy S4. You can hold your hand a few inches to the side of (or below) your phone and gesture just as well. I was shocked the first time I tried it and it actually worked.
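The underlying math is bat-style echolocation: time how long the pulse takes to come back, and you know how far away the hand is. A back-of-the-envelope sketch in Python, with numbers and names that are my own illustration rather than Elliptic Labs’ implementation:

```python
# Ultrasonic ranging in one line of physics: a pulse travels to the hand
# and back, so distance is speed of sound times delay, divided by two.
# Purely illustrative; not Elliptic Labs' code.
SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 °C

def hand_distance_m(echo_delay_s):
    """Distance to the hand given the round-trip time of an ultrasonic pulse."""
    return SPEED_OF_SOUND_M_S * echo_delay_s / 2

# A hand 10 cm away returns an echo in about 0.58 milliseconds:
print(hand_distance_m(0.00058))  # ~0.0995 meters
```

One speaker and one microphone only get you distance; comparing the echo’s arrival times across several microphones is presumably what lets the software place the hand in space, and why gestures work off to the side of the phone as well as above it.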
CEO Laila Danielsen says that the company already has deals with several OEMs in the works, which means phones and tablets could have more accurate versions of Samsung’s limited-range Air Gesture as soon as next year. The toughest part will be explaining to consumers why they want to wave at their phone. Using your hands to quickly answer calls and change songs in the car seems like the most valid use case. Tobii also targets the car for its eye tracking, an opportunity it sees as perhaps its most lucrative. With the company’s sensors mounted above the steering wheel in every car, you could signal that you want to make a call without taking your eyes off the road. The company’s demo also lets you change radio stations with your eyes and warns you if you’re drifting off. A few car companies have already implemented such technology, but Tobii plans to take it mainstream.
Once the iPhone launched, Herigstad says, people began to understand literal, direct manipulation of digital objects. It may have taken a few years, but today even toddlers seem to understand the mechanics of pinching and zooming on an iPad. To push gesture-based 3D interfaces further into the mainstream, Intel plans to integrate its RealSense 3D technology into a growing number of its partners’ laptops, tablets, and desktops within the coming months. RealSense works very much like Kinect, giving your computer not one but two eyes, which means it can perceive depth and detect 3D objects like your hands, or an item you want to scan. I demoed a whimsical music-making app that let me play virtual guitar, piano, and drums by moving my hands in the air. I could grab different instruments at will, combining them or tossing them off-screen. Intel’s demos weren’t very impressive, but they herald a future where every computer has a 3D camera inside it.
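The reason two eyes matter is the same reason yours do: an object seen from two slightly offset cameras shifts between the images, and that shift (the disparity) encodes distance. Here’s a minimal sketch of the textbook pinhole-stereo formula, with made-up numbers; Intel’s actual RealSense pipeline is surely more sophisticated:

```python
# Classic stereo depth: depth = focal_length * baseline / disparity.
# Illustrative only; parameter values are invented, not RealSense specs.
def depth_from_disparity(disparity_px, focal_length_px=700.0, baseline_m=0.05):
    """Depth of a point seen by both cameras of a stereo pair.

    disparity_px is the horizontal shift of the point between the left
    and right images; nearer objects shift more, so depth falls as
    disparity grows.
    """
    if disparity_px <= 0:
        raise ValueError("point must be visible in both cameras")
    return focal_length_px * baseline_m / disparity_px

# A hand that shifts 70 pixels between the two views is about half a meter away:
print(depth_from_disparity(70))  # 0.5 (meters)
```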

At the end of the day, my hands and eyes were tired. Full-on gestural, eye-tracking interfaces are exhausting because they demand active engagement with your computer and TV, when you’d ordinarily be lazily slouched in a chair with a remote or Xbox 360 controller in hand. In testing many of the hottest interactive technologies, I discovered that these devices are the future: not the future of everything, but the future of driving safety and the future of gaming. And perhaps they’re the future of other things, too, as soon as appropriate gesture-based interfaces are developed. These gestures might let doctors move an x-ray from one display to another without touching anything, or simply let you wave to close your garage door. Tobii already has 15,000 people with disabilities around the world using its products to type messages to family and friends using only their eyes.
Just because we’re obsessed with Minority Report doesn’t mean that’s how our future should look. PreCrime’s sensational computer interface is almost impossibly technical, and it wouldn’t be useful for most games, jobs, and tools. But Spielberg’s effects crew got a few things right: interfaces that didn’t just look good but also felt right. "We have grown up with media that has flattened 3D and we have gotten lazy," Herigstad says. "Our eyes have gotten fat, and our eyes don’t have to refocus things." Minority Report, and more recently Iron Man, introduced the masses to direct three-dimensional manipulation, which will require a bit more work on the part of the user, but it will yield some pretty amazing and useful technology. And you won’t even have to be Tony Stark or John Anderton to use it.