iOS 15's Live Text Lets You Look Up the World Around You

Search and edit anything

By Charlie Sorrel, Senior Tech Reporter
Updated on June 11, 2021, 11:34 AM EDT
Fact checked by Rich Scherr

Key Takeaways

- Live Text recognizes words in photos and images, and turns them into regular text.
- iOS 15 and macOS Monterey bake Live Text in at the deepest level.
- There's now no difference between text and images of text.

Live Text turns text in photos, screenshots, or even your live camera feed into searchable, copyable, editable text. In iOS 15, iPadOS 15, and macOS Monterey, Live Text is everywhere: in screenshots, in your Photos library, and even in text-input areas. It recognizes words in any image, then turns them into regular text. From there, you can select it, copy it, share it, look it up, and even translate it. Live Text also brings Apple's "data detectors" to the party, so you can tap a telephone number in a photo of a store sign, for example, then call it with the Phone app.

If you've ever found yourself tapping a word in a paperback to look up its meaning, long-pressing a link printed in a magazine to see a web preview, or tapping a place's name to see it on a map, then you're going to love Live Text. It makes the real world searchable, editable, and more usable.
It Just Works

In the new Mac, iPad, and iPhone operating systems, Live Text is just there. There's no special mode; Apple has added Live Text anywhere it makes sense. There's a new button every time you take a screenshot that lets you highlight all the text in the image, and that's the most complex it ever gets.

In most cases, the text in a photo is simply text. Say you took a photo of a product label in a store to remember to check on it later. If you're looking at that picture in the Photos app, you just swipe your finger across any text to select it. All images of text are now also text, automatically. From there, you can share it, copy it, use the new built-in translation feature, call the number, open a link, see an address on a map, and more.

Search Anything

The first time you install iOS 15, it will scan and process your photo library, recognizing any text it finds. This has one very powerful implication: if you search for something in Spotlight (the system-wide search tool), the results will include text found in your photos. For instance, you can find receipts you photographed years ago just by searching for any text that may be on the receipt. Trying to remember where you ate that delicious rice dish on holiday in the Costa Brava? If you photographed the menu, you'll be able to find it easily.

Or how about building a recipe book without even trying? Every time you see a recipe in a magazine or a cookbook, just take a picture, and you can find it any time.

Live Text fundamentally changes how you interact with the world. Suddenly, every word on any piece of paper, any storefront, screenshot, or road sign becomes just as usable as text in a notes app. There are already bits and pieces of this in computing.
Google's Translate app has long been able to translate text through the camera, and iOS has been able to scan and OCR documents in the Notes app for a while. But now that Apple has baked Live Text into its devices, there's no longer any distinction between kinds of text. It's all the same. Even those screenshots of text people post on Twitter are now as usable as if they'd posted the text itself.

AR Lite

Live Text is another example of Apple diving into augmented reality. We've seen how Apple is all in on AR, from the too-long demos in its various keynotes over the years to the neat AR models of new products that let you see how a new iMac would look on your desk. Apple has also added plenty of audio AR features, reading out messages and alerts, or giving you directions via AirPods.

It's an open secret that this is all practice for Apple's eventual AR glasses, and Live Text will likely be a big part of that. Not only will your glasses be able to read the signs around you for better awareness, but they'll also be able to look up information as you read it. For now, though, we're all benefitting from Apple's AR experiments.

Live Text is just fantastic. I've only been using it for a couple of days, and it already feels natural. I can't wait to see what app developers do with it.
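For developers curious how this kind of recognition is exposed, Apple's Vision framework offers on-device text recognition of the same flavor that powers Live Text. A minimal sketch (assuming you already have a `CGImage` in hand, for example from a photo) might look like this; the function name `recognizeText` is illustrative, not an Apple API:

```swift
import Vision

// A minimal sketch of on-device text recognition using Apple's Vision
// framework. Assumes `cgImage` is a CGImage you already have.
func recognizeText(in cgImage: CGImage) throws -> [String] {
    var lines: [String] = []

    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else {
            return
        }
        // Each observation is one detected text region; keep its best candidate.
        lines = observations.compactMap { $0.topCandidates(1).first?.string }
    }
    request.recognitionLevel = .accurate  // favor accuracy over speed

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])
    return lines
}
```

Feed it a photo of a menu or a receipt and you get back the recognized lines of text, ready to search, copy, or translate, much as Live Text does system-wide.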