Leap Motion's revamped hand tracking is getting built straight into VR headsets

Welcome to the uncanny valley of motion control

When Leap Motion was announced in 2012, it seemed like a solution looking for a problem. A little black box that plugged into a computer, it could recognize hand gestures and translate them into interface commands — to let you "do things on your computer just like you do them in real life." But the company was trying to sell people on something they’d never asked for, for use with a machine they’d already been interacting with for years.

Then came virtual reality. When people first started using headsets like the Oculus Rift, the mouse and keyboard suddenly became inadequate: they were tough to find while effectively blindfolded, and they didn’t take advantage of VR’s unique feeling of 3D space. Suddenly, tools like Leap Motion started to make a lot more sense.

The company has shifted more and more toward VR in recent years. But today, it’s making the move official with a project called Orion: a revamped tracking system tailored for VR motion, and a small sensor that can be built straight into virtual reality headsets. If things go according to plan, the sensor could start showing up in products as early as this fall. And while this hardware is meant for manufacturers only, the software is available today in beta for its current Leap Motion controllers.

Leap Motion’s creators say that as they spent more time in VR, they started feeling hemmed in by their original focus on desktop computers. "There’s a lot of things you want to do in virtual reality that you don’t do on a PC," says co-founder David Holz. On a laptop, for example, users generally wouldn’t move their hands more than a few inches. In a virtual landscape, they might stretch them all the way out to reach an object, something the Leap Motion wasn’t prepared to handle. The new software attempts to broaden its scope while adding overall improvements, like identifying fingers more accurately and tracking them more quickly.

Holz and his co-founder Michael Buckwald won’t reveal exactly how many of the $79.99 Leap Motion controllers they’ve shipped since the device launched in 2013; their site currently lists 207,000 developers and hosts 227 free and paid apps. The controller is also supposed to appear in a special faceplate for the OSVR headset, but that has yet to see release. "We’ve sold a lot," says Buckwald.

"Input is an almost existential threat to these companies."

Whatever the number, Leap Motion remains a relatively niche product even within the world of VR. So far, that’s true of practically all input devices. The high-end HTC Vive and Oculus Rift will eventually use sophisticated motion controllers, but those are still months away from release. Smartphones will usually pair with a Bluetooth gaming controller, but few users want to carry one around. So mobile headset makers often end up using the single-button model of Google Cardboard, or no control system at all. "Input is an almost existential threat to these companies," says Buckwald, and hand tracking is the solution.

In the second half of 2016, particularly during the holiday season, Leap Motion expects "several" VR companies to release headsets that incorporate its Orion sensor. It’s unclear which ones, but apparently "not all of them are US companies," and some moved into VR after seeing Facebook acquire Oculus for $2 billion.

The biggest names in VR right now have already put significant work into motion tracking. Google, which is expected to release at least one non-Cardboard headset this year, has already devoted years to its Project Tango augmented reality platform. Oculus has bought multiple tracking tech companies, including the Leap Motion-like Nimble VR. But Leap Motion could be talking to indie headset makers like Fove, or to Chinese internet juggernaut Tencent, which is supposed to launch a VR headset later this year.

The bigger question is whether Orion can deliver a reliable and natural input system. My past experience with Leap Motion has been a combination of excitement and frustration: it’s a great system when it works, but it fails just often enough to undercut its value as an interface, like a mouse button that misses every 15th click or a touchscreen that can’t always find my finger. To its credit, the system has gotten consistently better over time, and I’ve never used it long enough to really settle in. But the promise of a natural user interface is that it’s supposed to adapt to your motion, not the other way around.

The demo Leap Motion is showing off now is called Blocks, a featureless landscape where you can conjure cubes (as well as rectangular boxes and dodecahedrons) out of thin air by pinching and pulling. The motion makes total sense, like stretching invisible putty. The tracker mimics your hand almost perfectly across a wide field of view; I accidentally reached outside it only rarely. Once the shapes are down, you can grab, throw, and push them, or turn off gravity altogether and bat them around. Objects don’t collide with your hands perfectly, but I could recognize and adapt to the quirks more quickly than in my earlier demos — and the problem is potentially more in the Blocks app itself than the underlying tracking system.

It's still disorienting to grab at something you can't feel

As we found out nearly three years ago, though, Leap Motion has a huge inherent weakness: the lack of tactile feedback. No matter how good the tracker is, it’s a little disorienting to grab something and get only a visual confirmation that you’re actually holding it. The same is true of the company’s first steps toward a Minority Report-style floating interface. Right now, turning your hand palm-up in Blocks brings up a three-button shape selection menu. The idea is clever and well executed, but it’s hard to get used to feeling nothing, whether a computer key or a touchscreen, to confirm that I’ve interacted with a machine.

Controllers like Oculus Touch also let you grab, draw, and push things in mid-air, but they add a layer of both abstraction and feedback. You don’t expect a remote to literally act like your hand, and you get the reassuring feel of hitting real buttons, even if they represent a totally different action in VR. Leap Motion is in the uncanny valley of motion control: it offers interaction that is sometimes strikingly realistic, but distinctly not real.

Despite its flaws, if Leap Motion makes it into mobile headsets, it will still be the best mobile VR interface I’ve seen, better than anything from a simple Bluetooth controller to Microsoft’s HoloLens "air tap." Whether that translates into real success depends largely on how well the headsets using it sell, and on what secret interface projects VR’s biggest players, like Oculus and Sony, might have in the works.