Breaking down Apple’s new augmented reality platform

A stepping stone, but a pretty cool one

Apple finally announced an augmented reality platform for developers yesterday, and it’s about time. ARKit, as the tool is called, lets app makers draw on detailed camera and sensor data to map digital objects into 3D space. That means they can move beyond simple 2D camera overlays without the heavy-duty software engineering behind more advanced tools like Snapchat world lenses.

If nothing else, Apple has just made catching pokémon more immersive. But ARKit could also let the company compete against Google, which currently sets the gold standard for phone-based augmented reality. It could even set the stage for augmented reality glasses and virtual reality. Here’s what that means.

Apple’s “world tracking” is a strike at Google and Facebook

ARKit enables what Apple refers to as “world tracking,” which works through a technique called visual-inertial odometry. Using the iPhone or iPad’s camera and motion sensors, ARKit finds a bunch of points in the environment, then tracks them as you move the phone. It doesn’t create a full 3D model of a space, but it can “pin” objects to a point in that space, realistically adjusting their scale and perspective as you move. It can also find flat surfaces, which is great for putting digital props on a floor or table.
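For developers, the basic flow is fairly compact. Here’s a rough sketch in Swift of what starting world tracking and reacting to a detected surface might look like, assuming the iOS 11 APIs Apple showed (ARSCNView, ARWorldTrackingConfiguration, and horizontal plane detection); the floating box is just a stand-in prop:

```swift
import UIKit
import SceneKit
import ARKit

// Minimal sketch: start world tracking and drop a placeholder box on any
// horizontal surface ARKit finds. Assumes iOS 11 and a SceneKit-backed AR view.
class ARDemoViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking fuses camera images with motion-sensor data
        // (visual-inertial odometry); plane detection looks for flat surfaces.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // Called when ARKit anchors a detected plane; attach a 10 cm box to it.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        let box = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0))
        box.position = SCNVector3(0, 0.05, 0) // rest on top of the plane
        node.addChildNode(box)
    }
}
```

The point is that the tracking, anchoring, and surface detection come from the framework; the developer only supplies the content.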

Apple imagines people using this tech in ways that make it sound a lot like Google’s Tango platform. It referenced an upcoming app from Ikea, for example, which seems very likely to involve putting virtual furniture in your house — something Wayfair has been doing on Tango for a long time. But ARKit has a huge advantage over Tango: it’s going to be available on a giant swathe of existing devices, while Google needs each Android manufacturer to build Tango hardware into their phones.

Facebook is also pushing augmented reality hard, touting some advanced machine learning. But developers are limited to working with Facebook’s own Camera app, while Apple will let them add augmented reality to independent iOS apps. With ARKit, Apple says it’s got “the largest AR platform in the world,” and that may well be true.

ARKit is still a limited platform

On the other hand, ARKit doesn’t work exactly like Tango. Tango has extra cameras to pick up wide-angle images and depth data, and over the past couple of years, it’s developed very precise tracking capabilities. You can do things like scan an entire room and instantly build a 3D model of it, something that requires a separate peripheral on iOS. You can also use Tango with a minimum of frustration, because it’s good at keeping objects in place.

ARKit could improve individual elements of existing AR apps; Apple promises excellent object scaling, for example. Apple says ARKit also uses a fraction of the phone’s CPU, so it could make AR less of a resource drain. But it doesn’t transform the iPhone or iPad’s basic tracking capabilities. It just makes them available to more developers, who will no longer have to build their own tracking and imaging systems.

Apple also doesn’t seem as interested as Facebook and Google in hooking this all up to internet search by default. Developers can use Apple’s new machine learning framework to identify objects in a scene, but Apple’s not talking about identifying wine vintages the way Facebook did at F8, or letting Siri analyze concert posters and auto-translate signs the way Google did with Assistant at I/O.
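For what it’s worth, the on-device plumbing for that kind of recognition is there. Here’s a hedged sketch of feeding an ARKit camera frame to an arbitrary Core ML image classifier through the Vision framework; the classifier model is whatever a developer supplies, not something Apple ships with ARKit:

```swift
import ARKit
import Vision

// Sketch: classify whatever the AR camera currently sees. The VNCoreMLModel
// wraps a developer-supplied Core ML image classifier (a placeholder here).
func classify(frame: ARFrame, with model: VNCoreMLModel) {
    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let best = (request.results as? [VNClassificationObservation])?.first else { return }
        print("Best guess: \(best.identifier) (confidence \(best.confidence))")
    }
    // ARFrame exposes the raw camera image as a CVPixelBuffer.
    let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage, options: [:])
    try? handler.perform([request])
}
```

Whether anyone wires that up to search or translation the way Google and Facebook have is up to third-party developers, not the platform.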

ARKit is a stepping stone

As my colleague Lauren Goode mentioned, Apple is supposedly developing augmented reality glasses, and it needs apps to make that work. ARKit gets iOS developers thinking about AR, so hopefully somebody will come up with a great use case to help Apple’s hardware avoid the ignominious fate of Google Glass.

But ARKit is also a stepping stone in a smaller sense. As a general iOS 11 feature, it’s straddling the gap between Apple’s single-camera devices and the dual-camera iPhone 7 Plus. Dual cameras offer better depth sensing, and if they become standard across Apple’s hardware, ARKit could start feeling a lot closer to Tango.

It’s also a gateway to VR

If a phone can track someone walking around a virtual object, you could theoretically pop that phone in a VR headset and let them walk around a virtual environment. That’s why Google used Tango technology for its all-in-one VR headset. Lots of people have speculated that Apple is planning an iPhone-based headset, particularly since it may be switching to VR-friendly OLED screens. And now that Apple is officially in the augmented reality game, it’s less of a leap to full VR.

That said, there’s a much higher bar for virtual reality tracking: an AR object slipping out of place won’t literally make you sick, but shaky tracking in VR can.

So what will people do with ARKit right now?

For all the excitement over augmented reality, it’s worth remembering that no one has launched a major app with AR as the main selling point. Pokémon Go, touted as one of AR’s big successes, featured the technology as a minor aesthetic perk; we even recommended people turn it off to play the game better. Snapchat augmented reality took off because people were already using Snapchat, and if AR filters succeed on Facebook, it’ll likely be for the same reason.

Making augmented reality more accessible could change this, freeing developers to build AR-first apps for a huge audience with low overhead. Even if those don’t break through, augmented reality’s “killer app” could also be the slow creep of AR into apps people are already using: think ubiquitous filters in video services, or an optional AR mode in games. Neither of these futures would be as exciting as augmented reality glasses, but either would help advance a technology that’s been on the cusp of arriving for years, using devices people already own.