Self-tracking headsets are 2017’s big VR trend — but they might leave your head spinning

Important doesn’t mean comfortable

I’ve gotten used to VR headsets becoming steadily lighter, more powerful, and easier to use as companies try to turn a piece of sci-fi tech into a viable product. Several of these shifts have happened at Las Vegas’ annual CES trade show, offering hints of what we’ll see in the coming year. But at CES 2017, some of the most important VR projects were downright unpleasant — and that may not be a bad thing.

In the fall of 2016, Microsoft VP Terry Myerson took a pointed swipe at the year’s freshly released batch of virtual reality headsets, casually dubbing them “less immersive accessories” compared with a new Microsoft project. Why? Because, Myerson claimed, Microsoft had perfected inside-out tracking: a nascent technology that could make VR vastly more convenient and accessible, by getting rid of external tracking systems. Now, inside-out tracking seems set to become the big VR trend of the year, though it might still make you a little dizzy.

Inside-out tracking isn’t just a geeky spec bump; in fact, it could be one of virtual reality’s great equalizers. High-end headsets currently use external cameras or sensor setups to let people move around in VR. More affordable mobile systems, meanwhile, have no positional tracking at all. Inside-out tracking inverts the traditional setup: you put sensors on the headset itself, and they read cues like depth and acceleration, translating them into virtual motion. So high-end users wouldn’t need a camera-studded VR room, and mobile users could do far more with their headsets; they might even get less motion-sick in the process.
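
To make that concrete, here is a minimal, purely illustrative sketch of the sensor fusion such a system depends on: fast but drift-prone dead reckoning from an IMU, periodically corrected by slower, absolute pose estimates from the headset’s own cameras. Every function, number, and reading below is invented for illustration; it isn’t any vendor’s actual pipeline.

```python
import numpy as np

# Illustrative sketch only, not any vendor's actual tracking pipeline.
# Inside-out tracking fuses two signals: a fast IMU that drifts over time,
# and slower camera/depth-based pose estimates that are absolute but sparse.

def integrate_imu(position, velocity, acceleration, dt):
    """Dead-reckon a new position from linear acceleration (accumulates drift)."""
    velocity = velocity + acceleration * dt
    position = position + velocity * dt
    return position, velocity

def fuse(imu_position, camera_position, blend=0.05):
    """Complementary filter: pull the drifting IMU estimate toward the
    camera's absolute estimate whenever one is available."""
    return (1.0 - blend) * imu_position + blend * camera_position

position = np.zeros(3)
velocity = np.zeros(3)
dt = 1.0 / 500.0  # IMUs sample far more often than cameras

for step in range(1000):
    accel = np.array([0.0, 0.0, 0.1])                 # placeholder IMU reading
    position, velocity = integrate_imu(position, velocity, accel, dt)
    if step % 25 == 0:                                # a camera pose arrives occasionally
        camera_pose = position + np.random.normal(0.0, 0.01, 3)  # stand-in estimate
        position = fuse(position, camera_pose)
```

Real trackers fuse full six-degree-of-freedom poses with far more sophisticated filters, but the division of labor between fast relative motion and slow absolute correction is the heart of the approach.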

In order to prepare for a big leap forward, VR had to take a step back

Unsurprisingly, Microsoft wasn’t the only company interested; Oculus, for example, had already introduced an inside-out prototype called Santa Cruz. And at CES last week, the idea seemed to be taking off in earnest. Chip maker Qualcomm featured inside-out tracking on mobile headsets with a Power Rangers movie tie-in. Intel showcased it in a new version of its “merged reality” headset Project Alloy. Smaller companies like Pico and uSens appeared with everything from algorithms to near-finished hardware. A handful of major manufacturers, including Lenovo and Asus, revealed VR headsets they’d been working on with Microsoft.

Many of these products are set for release later this year, making them more than an early experiment. But in order to prepare for a big leap forward, virtual reality had to take a few steps back — to the days of clunky prototypes and rough tech demos.

Qualcomm’s tracked mobile headset

For all their flaws, existing high-end VR systems are great at creating small worlds you can walk around naturally. But self-tracking headsets often haven’t reached that point. Unfortunately, making a VR headset that works on its own in an ordinary room is tough. It’s one thing for an external camera to find an LED embedded in the Oculus Rift, or for sensors on a headset to pick up on clear external markers. (The Vive actually uses inside-out tracking, but it only works with two carefully placed laser towers in the area, which doesn’t offer much freedom.) It’s another to have a computer look at the world and calculate position based on nebulous signals like depth and acceleration, without a fixed point of reference.
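
That asymmetry is easy to see in code. When markers are known in advance, recovering a pose is a textbook perspective-n-point problem; the sketch below, which uses OpenCV’s standard solver with made-up LED positions and camera numbers (not Oculus’ actual constellation system), does it in a single call. Markerless inside-out tracking has no such shortcut, because the reference points themselves have to be discovered and mapped on the fly.

```python
import numpy as np
import cv2  # OpenCV's generic pose solver; illustrative use only

# Hypothetical marker layout: four LEDs on the front of a headset, in the
# headset's own coordinate frame (meters). Outside-in tracking knows these
# positions ahead of time, which is what makes the problem well-posed.
led_layout = np.array([
    [-0.05,  0.03, 0.0],
    [ 0.05,  0.03, 0.0],
    [ 0.05, -0.03, 0.0],
    [-0.05, -0.03, 0.0],
], dtype=np.float64)

# Where an external camera saw those LEDs in its image (pixels, invented).
pixel_hits = np.array([
    [310.0, 220.0],
    [330.0, 221.0],
    [331.0, 260.0],
    [309.0, 259.0],
], dtype=np.float64)

# Assumed pinhole intrinsics for the tracking camera.
camera_matrix = np.array([
    [600.0,   0.0, 320.0],
    [  0.0, 600.0, 240.0],
    [  0.0,   0.0,   1.0],
])

# One call recovers the headset's rotation and translation relative to the
# camera. Without known markers, there is no equivalent shortcut.
ok, rvec, tvec = cv2.solvePnP(led_layout, pixel_hits, camera_matrix, None)
print(ok, tvec.ravel())
```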

“It’s like the invention of the mouse.”

Still, for Youssri Helmy, CEO of machine vision company Eonite, inside-out tracking is a non-negotiable part of VR’s future. “It’s got to happen,” he says. “We think it’s crucial. It’s like the invention of the mouse with the GUI.” At CES, Eonite demonstrated 3D mapping software that headset makers can use to add inside-out tracking, as long as they include a depth sensor and motion-detecting inertial measurement unit (IMU). Helmy thinks that this kind of tracking is almost always preferable to external systems — “We as humans are inside-out only, right?” he quips when I ask about it.

Eonite’s CES demo was innovative, but limited. Using sensors taped to an HTC Vive — without the laser tracking boxes — I could walk around a living room with the same rough dimensions and furniture placement as the real hotel suite. If I looked straight ahead at the walls, it felt almost as good as normal Vive tracking. If I looked down at the floor, though, the headset would lose its place and jolt me several meters away. Helmy says the demo used a camera with a narrower-than-usual field of view, and a wider one would have solved the problem. But it was part of a running theme at the show: every inside-out, work-anywhere tracking system I tried involved some kind of awkward and revealing compromise.

Eonite on an HTC Vive

The three companies that seem best poised to crack inside-out tracking are the ones that didn’t actually show it off at CES: Oculus, Microsoft, and Google. Oculus’ self-contained Santa Cruz prototype is the best inside-out experience I’ve tried; whether because of technical prowess or just well-controlled demo conditions, I never felt like I was sacrificing performance for freedom. Microsoft put excellent room-mapping capabilities into HoloLens, so it’s in a strong position to do the same for VR. And at CES, Google announced the first phone that includes both Tango depth-sensing cameras and Daydream virtual reality, two systems that seem fated to merge.

But Microsoft’s partners didn’t let anyone turn on the headsets they showcased, and Oculus sat CES out almost entirely. Google emphasized that its augmented and virtual reality elements “do not overlap or integrate” in current phones. The fact that these companies didn’t demo inside-out tracking last week could suggest that it’s still not ready for the public. And doing VR tracking badly can be worse than not doing it at all.

Perhaps the most ambitious and best-performing new headset at CES was Intel’s third-generation Project Alloy, which is supposed to mix real-world elements with virtual ones. We’ve seen some creative proofs of concept on Alloy, going far beyond ordinary VR. At CES, its RealSense cameras also came close to matching the Rift or Vive. The whole thing is wireless, and setup just requires scanning the room with Alloy’s camera beforehand. But in exchange for Alloy’s free movement and lack of unwieldy wires, I had to give up my hands. Instead of fully tracked motion controllers like the Rift’s or Vive’s, I got a remote that rotated in place, imitating a gun. I could shoot fine, but it destroyed the illusion of having virtual arms. The headset could theoretically track my hands as well, but only while they were in view of its cameras.

uSens inside-out tracking at CES

Where Intel showed off a wireless high-end VR alternative, Qualcomm was taking VR to the lower end, with mixed results. My first experience with its phone-based prototype was exciting, but fairly bad. I could move around a simple virtual rendering of Zordon’s cave from Power Rangers, but when I leaned forward, the world just spun around me, as if the headset wasn’t sure what I was trying to do. I’d take a few steps, and the cave would slide to match my position. Nothing was jittery or blurry in the demo, but the world didn’t feel real either, especially compared to the sophisticated Vive experiments I’d been playing at HTC’s booth a few hours earlier.

I lost my hand to a wedding ring

Qualcomm later said that the problem was low lighting, and the headset did work better in a show floor demo than at the cocktail party where I initially tried it. But it still had trouble picking up on small movements, which are important for preventing nausea and for playing games. The same thing happened with a headset from Pico — the hardware looked great, but its demo made me dodge lasers and spike traps in a virtual body that always felt out of sync with my real one, until I finally gave up, slightly disoriented.

And occasionally, a company aims to solve far more tracking problems than it can handle. uSens, creator of the Impression Pi headset, came to CES with a demo that combined augmented reality, inside-out tracking, and hand controls: you could pick up objects in a totally virtual environment, then use a hand gesture to open a portal to the real world, all while walking around with no external trackers. But movement was ponderous, gestures were awkward, and my left hand kept disappearing, apparently because the reflection off my wedding ring confounded uSens’ sensors.

Like Eonite, uSens is primarily selling software. “We want something that is a comprehensive solution that works across the different types of devices that people bring home,” says uSens market strategist Raj Rao, hence the combination of features. Even if it’s far from perfect now, Rao says that machine learning will improve the algorithm as more data points come in — including, presumably, my overly glittery hand.

Even if nobody’s created the perfect self-tracking headset, though, it’s one of the most obvious leaps that virtual reality — especially mobile VR — can make this year. The first generation of headsets is already on the shelves, and owners may not want to upgrade just for a higher-resolution screen or slightly better performance. But they might be sold on ditching their external cameras, or getting a mobile device with a major new feature. Meanwhile, people who wrote VR off as either too inconvenient or too limited could give it another look.

If CES is any indication, lots of companies will be making that pitch in the coming months. Right now, they just need the hardware and software to back it up.