
Apple shouldn’t wait any longer to show the world that it’s serious about AR

Give developers the tools they need this WWDC

Pokémon Go. Photo by James Bareham / The Verge

By now, most smartphone users have experienced some form of augmented reality on their phones, whether they play Pokémon Go, use Snapchat, or actually remember the days of Yelp Monocle (from way back in 2009). But for the people who make the apps, AR isn’t quite as simple, particularly when the apps they’re building involve 3D or “volumetric” objects — stuff you can walk around and interact with through the screen of the phone.

Which is why Apple should start making that process a whole lot easier — especially if it is planning to release AR glasses in the future, ones that will need some sort of “killer app” or group of apps to justify their existence. Other companies, like Facebook, have now taken steps to let developers build AR experiences into their in-app cameras. You might not normally think of Facebook and Apple as direct competitors, but by doing this, Facebook has essentially set the stage for what is going to become an AR platform war on the phone. And Google already has its own AR platform, Tango. Amid the iOS, macOS, and potential Siri speaker reveals at WWDC next week, Apple needs to give due attention to AR.

Augmented reality refers to any kind of computer-generated image that appears on top of a view of the real world. For the sake of this argument, however, let’s focus on smartphones. There are 2D AR apps: ones that do a pretty basic job of slapping a flat, non-dynamic digital image over the stuff you’re seeing through your camera lens. Developers say this is the easiest kind of AR mobile app to build.
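To make that concrete, here’s a minimal sketch of what that flat-overlay approach can look like on iOS, using AVFoundation for the live camera feed and UIKit for the image on top. The class and asset names are mine for illustration, not code from any shipping app.

```swift
import UIKit
import AVFoundation

// A bare-bones "2D AR" view: a live camera preview with a flat,
// non-dynamic image layered on top. Nothing here understands the
// world; the overlay simply sits at a fixed spot on the screen.
// (A real app also needs an NSCameraUsageDescription entry in
// its Info.plist to get camera permission.)
class FlatOverlayViewController: UIViewController {
    private let session = AVCaptureSession()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Feed the device's camera into the capture session.
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // Display the live feed behind everything else.
        let preview = AVCaptureVideoPreviewLayer(session: session)
        preview.frame = view.bounds
        preview.videoGravity = .resizeAspectFill
        view.layer.addSublayer(preview)

        // The "augmentation": a static sticker pinned over the feed.
        // "sticker" is a placeholder asset name for this sketch.
        let overlay = UIImageView(image: UIImage(named: "sticker"))
        overlay.frame = CGRect(x: 100, y: 200, width: 120, height: 120)
        view.addSubview(overlay)

        session.startRunning()
    }
}
```

Everything beyond this, like making the overlay respond to what the camera actually sees, is where the engineering burden kicks in.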

True 3D AR apps on mobile require sophisticated engineering right now

Then there are 3D AR apps, which usually require more advanced software development and image processing. Think: Snapchat filters like the digital flower crown hovering over your head as you move around in real time, or the virtual furniture you can examine and walk around in the home-renovation app Houzz. These may seem like lightweight (or lighthearted) applications, but Snapchat’s lenses require a serious amount of engineering, and the Houzz 3D AR feature took months to build. Another home-renovation platform that offers 3D imaging, Modsy, does most of its image processing after you’ve taken smartphone photos, a step that can take anywhere from five minutes to several hours before the rendered 3D images come back to you.

There’s another way to experience “true 3D” AR apps on a smartphone, and it’s a combined hardware-and-software solution. Google’s augmented reality computing platform, Tango, runs on smartphones that have a specific set of camera, infrared, and depth sensors. But right now there’s exactly one Tango phone model on the market, and some of the Tango apps are slow and low-resolution.

Since we’re not expecting any sort of hardware update to the iPhone lineup for at least the next four months — no new cameras or depth sensors that would enable 3D AR — any Apple advancements around AR in the near term will have to come from the software side. Some have speculated that Apple could announce “ARKit,” a framework that would pair the existing camera with new software in a way that lets developers gather more information from the camera.

Given that Apple chief executive Tim Cook has spoken so enthusiastically about AR in recent months, calling it a “big idea, like the smartphone,” this kind of framework makes a lot of sense. Currently, Apple doesn’t offer a specific API for virtual reality or augmented reality. There’s SceneKit, a 3D-rendering engine aimed at games, but no API for real-time object tracking. One developer, who spoke on background because of a pending AR app launch, told me that the one thing he has been hoping for is the ability to “hook into” the iPhone’s camera sensors and directly access the hardware in a way that developers currently can’t.
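To illustrate the gap, here’s a minimal SceneKit sketch of my own, not Apple sample code: rendering and animating a 3D object is easy, but notice that nothing in it ties the object to the camera’s view of the real world, because the article’s point is that no such tracking API exists today.

```swift
import SceneKit

// SceneKit will happily render and animate a 3D box, but it has no
// notion of the real world: nothing here tracks the phone's camera
// or anchors the box to a real surface. That tracking layer is the
// piece developers are hoping Apple adds.
let scene = SCNScene()

let box = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                   length: 0.1, chamferRadius: 0.01))
box.position = SCNVector3(0, 0, -0.5)
scene.rootNode.addChildNode(box)

// Spin the box forever, purely in virtual space.
box.runAction(.repeatForever(.rotateBy(x: 0, y: .pi, z: 0, duration: 2)))

// An SCNView displays the scene inside an app's view hierarchy.
let sceneView = SCNView(frame: CGRect(x: 0, y: 0, width: 300, height: 300))
sceneView.scene = scene
sceneView.allowsCameraControl = true
```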

Plus, if Apple really does plan to introduce AR glasses at some point in the future (something that has been rumored, but not confirmed), it will need a healthy ecosystem of developers who know how to build augmented reality apps or have already embraced them. Releasing a suite of developer tools for AR on the phone would presumably make it easier to move those apps over to whatever AR hardware Apple makes down the line.

If Apple does eventually launch AR glasses, it’s going to need killer apps to prove that they’re useful, not awkward

The last major pair of AR glasses to hit the market was the ill-fated Google Glass, which was awkward, made people uncomfortable in public, and even spawned the term “Glassholes.” But all of that might have been forgiven if the actual experience of having contextual information put right in front of your eyes had been awesome. In other words: AR glasses are going to need apps so good and so useful that they justify whatever social awkwardness wearing a computer on your face creates.

Will Apple be able to get major app developers to build AR apps for glasses, if and when it gets into glasses? Of course. But there’s also the very real possibility that the breakout app for AR glasses will come from a less-established developer, one who wouldn’t have the resources to invest in true 3D AR if it stays as technically complicated as it is now.

Apple has a long-standing tradition of waiting to jump into a market on the premise that it will do it — whatever it may be — better. But the company shouldn’t wait any longer to show the world that it’s serious about AR as a platform.