Apple has announced a new feature called Live Text, which will digitize the text in all your photos. This unlocks a slew of handy functions, from turning handwritten notes into emails and messages to searching your camera roll for receipts or recipes you’ve photographed.
This is certainly not a new feature for smartphones, and we’ve seen companies like Samsung and Google offer similar tools in the past. But Apple’s implementation does look typically smooth. With Live Text, for example, you can tap on the text in any photo in your camera roll or viewfinder and immediately take action from it. You can copy and paste that text, search for it on the web, or — if it’s a phone number — call that number.
Apple says the feature is enabled using “deep neural networks” and “on-device intelligence,” with the latter being the company’s preferred phrasing for machine learning. (It stresses Apple’s privacy-heavy approach to AI, which focuses on processing data on-device rather than sending it to the cloud.)
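Apple hasn't published Live Text's internals, but its existing Vision framework already gives developers on-device text recognition, and it's a reasonable guess at the kind of pipeline involved. Here's a minimal sketch using the real `VNRecognizeTextRequest` API (available since iOS 13 / macOS 10.15); the function name `recognizeText` is our own, and nothing here is confirmed to be what Live Text actually runs:

```swift
import Foundation
import Vision
import CoreGraphics

// Hedged sketch: on-device OCR with Apple's Vision framework.
// Live Text's exact pipeline is not public; this only illustrates
// the general "on-device intelligence" approach Apple describes.
func recognizeText(in image: CGImage, completion: @escaping ([String]) -> Void) {
    let request = VNRecognizeTextRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNRecognizedTextObservation] else {
            completion([])
            return
        }
        // Keep the highest-confidence candidate for each detected line of text.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate      // favor accuracy over speed
    request.usesLanguageCorrection = true     // apply language-model cleanup

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])       // all processing happens on-device
    }
}
```

Notably, everything in this sketch runs locally, with no network call, which is consistent with the privacy framing Apple used on stage.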
Live Text works across iPhones, iPads, and Mac computers and supports seven languages: English, Chinese (both simplified and traditional), French, Italian, German, Spanish, and Portuguese. It also integrates with Apple’s Spotlight search feature on iOS, allowing you to search your camera roll based on the text in images.
In addition to extracting text from photos, iOS 15 will also let users search their images visually — a feature Apple calls Visual Look Up, and one that sounds a lot like Google Lens.
Apple didn’t go into much detail about this feature during its presentation at WWDC, but it said the new tool would recognize “art, books, nature, pets, and landmarks” in photos. We’ll have to test it out in person to see exactly how well it performs, but it sounds like Apple is doing much more to apply AI to users’ photos and make that information useful.