If you’ve ever used the Leap Motion hand tracker, you’ll know that it’s a magical experience punctuated by moments so frustrating it makes you want to throw things — except that you can’t pick them up, which is why you’re frustrated in the first place. Leap Motion is well aware of this problem, and the company has just announced a beta of what it calls the Interaction Engine, which is meant to fix it.
The Interaction Engine is an add-on for the larger Unity game engine, which supports just about any VR headset. It modifies Unity’s default physics rules to create more natural interactions with players’ in-game hands, including touching, picking up, and throwing objects. Broadly speaking, without the Interaction Engine, colliding objects push away from each other — even when that collision is between your virtual fingers and the item you’re trying to grab. With the Interaction Engine, objects should theoretically stick to your hand better, and you’ll be able to do things like stack them more easily. There are also custom physics settings for thrown items, and if you try to push objects in a direction they can’t move — like against the floor — they’ll phase through your hands instead of trying to (as Leap Motion puts it) "violently escape."
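To get a feel for the difference, here’s a toy one-dimensional sketch of the two behaviors: a stiff penalty-force contact response flings an object away the moment your finger overlaps it, while a clamped "soft contact" response keeps the push gentle. This is purely illustrative — the function, constants, and model are hypothetical, not Leap Motion’s actual physics code.

```python
def contact_response(penetration_m, stiffness, dt=0.005, mass=0.1, max_speed=None):
    """One step of a toy penalty-force contact model.

    A finger overlapping an object by `penetration_m` meters produces a
    repulsive force F = stiffness * penetration, which accelerates the
    object for one timestep `dt`. If `max_speed` is set, the resulting
    separation velocity is clamped (a crude "soft contact")."""
    force = stiffness * penetration_m          # spring-like penalty force (N)
    velocity = (force / mass) * dt             # velocity gained this step (m/s)
    if max_speed is not None:
        velocity = min(velocity, max_speed)    # soften the response
    return velocity

# Rigid response: a 2 cm overlap with a stiff spring launches the object.
naive = contact_response(penetration_m=0.02, stiffness=5000.0)
# Soft contact: the same overlap yields only a gentle, capped push.
soft = contact_response(penetration_m=0.02, stiffness=5000.0, max_speed=0.5)
print(f"rigid: {naive:.1f} m/s, soft: {soft:.1f} m/s")
```

The rigid case sends the object flying at several meters per second from a fingertip-sized overlap, which is roughly the "violently escape" behavior described above; clamping the response is one simple way to keep the object in reach of the hand.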
Developers with a Leap Motion controller can test the Interaction Engine with a sample project that’s compatible with Oculus Rift and HTC Vive. For the rest of us, the announcement blog post features a few example GIFs. Here’s one, without the engine turned on:
Now here’s the engine in action:
This is an extreme example — I’ve used earlier versions of Leap Motion, and it was absolutely possible to pick things up, even if you had to be careful. But even slight problems can make using it feel unnatural and inconvenient. Even now, the demo is pretty much limited to one- or two-inch cubes and spheres; Leap Motion’s documentation warns that the engine has trouble with "smaller and elongated objects in particular."
The Leap Motion is still a fairly specialized kind of controller, compared to things like the Vive’s motion remotes, which only have to track a rigid handheld object and the click of a trigger instead of the fine movement of an entire hand. But you can still find VR experiences that use it, including the impressive Void "hyper-reality" headset. Hand tracking may be the ultimate form of natural interactivity, and building virtual worlds that can accommodate it better would be a real step forward.