On Sunday, the annual music issue of The New York Times Magazine will feature additional content available via Google Lens, giving print readers access to multimedia features that are usually reserved for online readers. Lens, the on-demand object recognition tool Google first introduced in 2017, lets users point their phones at an object to receive additional information about it.
By using Lens, readers of the magazine can access immersive video and animation and listen to a playlist of the music in the issue. The print edition has three cover variants, each featuring a different artist: Billie Eilish, Lil Nas X, and Megan Thee Stallion.
Readers will also be able to preview the Times’ “25 Songs That Matter Right Now,” with a short essay for each song on the list. Other features available from the print magazine via Lens include links to podcasts, the ability to save articles to the Times’ app, and a look at the design process for each cover through the “Behind the Cover” video series. The Lens version of the music issue will also feature interactive print ads.
“The Magazine’s annual Music Issue is always a reader favorite, and each year we try and stretch the bounds of what we do with it,” Jake Silverstein, editor-in-chief of The New York Times Magazine, said in a statement. “We know that many print readers already read with their phones in hand or nearby, so this is a logical next step.”
Google unveiled Lens at I/O 2017 as a standalone app and later integrated it into Android’s standard camera app. It’s also available on iOS through the Google app and Google Photos. Beyond identifying objects and images and providing additional context about them, Lens can scan QR codes, copy and save text, and translate languages. Google Maps now uses Lens to scan restaurant menus and identify popular dishes.