Google is bringing its augmented reality system ARCore out of beta with new features, and it’s making its Lens visual search tool part of Google Photos on all phones. Today, ARCore 1.0 launches on all Google Pixel phones, all recent Samsung flagship phones, the Android O versions of LG’s V30 and V30 Plus, the Asus ZenFone AR, and the OnePlus 5. In addition to the preview version’s capabilities, ARCore 1.0 includes support for anchoring virtual objects to any textured surface, not just flat, horizontal ones. Google boasts that 100 million Android phones currently support the platform, and it’s working with several companies (Samsung, Huawei, LGE, Motorola, Asus, Xiaomi, HMD / Nokia, ZTE, Sony Mobile, and Vivo) to certify new ARCore phones in the future.
Lens was previously Pixel-only, but now it’s available through Google Photos on Android and on iOS 9 or later, or through Google Assistant on several Android flagship phones. It’s also supposed to be better at recognizing common animal breeds and plant types. Lens can examine photos you’ve already taken through Photos, and if you’re using a Samsung, Huawei, LG, Motorola, Sony, or HMD / Nokia flagship with Assistant, you can simply pull out your phone and point it at something.
Lens makes sense of the world through your phone, and ARCore modifies it
ARCore launched as a limited preview in August 2017, and Lens arrived shortly after in October. Google Lens senior director of product Aparna Chennapragada describes them as two sides of the same coin: Lens is a “camera-in” system that helps you make sense of the visual world, and ARCore is a “camera-out” system that makes the world appear differently through your phone. Lens is an extension of Google’s text and voice search capabilities, and ARCore lets developers easily build augmented reality apps, similar to Apple’s ARKit for iOS. ARKit recently added support for vertical planes and image recognition, and Google’s own update could help raise the bar for both mobile AR platforms.
We’ve seen things like ARCore stickers from Google already, but as of today, developers can upload their own ARCore-based apps to the Play Store. Apps that already feature augmented reality capabilities can also integrate ARCore tech for better performance: Snap, for instance, is supplementing its “world lens” feature with ARCore, introducing a new experience that simulates entering Barcelona’s Camp Nou stadium through a high-tech portal. Snap has put a lot of work into augmented reality, but “they don’t go out and certify and calibrate millions and millions of cameras,” says Amit Singh, VP of business and operations for Google VR. (Snap says that it does, however, certify that every Snapchat experience will perform optimally on every compatible device.) Google is also updating Android Studio to let developers preview AR apps on the desktop.
In addition to Snap’s experience, Google is partnering with a few other developers to celebrate ARCore 1.0’s launch. Sotheby’s International Realty, furniture company Otto, and e-commerce company JD.com will let you preview rooms, furniture, appliances, and other goods. Porsche will let Android users check out a version of its Mission E concept vehicle, and an upcoming mobile game called Ghostbusters World will (naturally) let players trap ghosts that appear in the real world. Google is also taking its platform outside the Play Store ecosystem in China, partnering with Xiaomi and Huawei to distribute ARCore-powered apps through independent app stores.
The long-term vision for ARCore and Lens is pretty exciting. As Chennapragada points out, Google could easily connect Android’s “camera-in” and “camera-out” functions. She offered the example of seeing a nice piece of furniture at a friend’s house, taking a picture for Google to identify, and automatically calling up a 3D model of it to preview back at home. “There’s a reason why we’re talking about these two things together,” she says. Lens-style visual search could also expand beyond phones to something like Google’s VR180 point-and-shoot video camera line, where it could seamlessly identify or annotate objects. “We haven’t figured out what the user experience [is like], and you don’t want to add more cognitive load to the experience,” she says. “But certainly behind the scenes, I think that’s one of the things we’re looking at.”
For now, though, both Lens and ARCore are relatively simple and limited. Even so, the next few weeks should see them become more sophisticated and more widely available.
Update 12:55PM ET: Added comment from Snap about camera certification.