Apple Announces Plans to Add AI Capabilities to 3 Cognitive Accessibility Features

Hear texts in your own voice, read off what you're pointing at, and more

By Rob Rich, News Reporter
Published on May 16, 2023, 11:27 AM EDT
Fact checked by Jerri Ledford

Apple is working on several new AI-enhanced accessibility features to assist users with vision, mobility, speech, and cognitive impairments.

Several new accessibility options and updates are on the way to the iPhone and iPad, with Apple drawing on hardware, software, and machine learning to make it happen.

"These groundbreaking features were designed with feedback from members of disability communities every step of the way," said Sarah Herrlinger, Apple's senior director of Global Accessibility Policy and Initiatives, in the announcement, "to support a diverse set of users and help people connect in new ways."

Assistive Access, developed in collaboration with people experiencing cognitive disabilities and their supporters, streamlines the interfaces of some apps to "lighten the cognitive load." It also combines similar apps (like Phone and FaceTime) into one, while allowing a number of settings to be tailored to a specific user's needs.
These include an emoji-only keyboard for those who prefer more visual communication, the option to organize the Home Screen into rows (for those who prefer text), and so on.

Live Speech is designed to support non-verbal users (or users who have lost the ability to speak over time) by offering pre-recorded response options and text-to-speech during calls. And a Personal Voice tool can record 15 minutes of audio (suggested text read aloud by the user), then use that data to generate speech in the user's own voice.

Users with impaired vision will also be able to use the new Point and Speak feature in the Magnifier app to make discerning text labels on various items easier. While aiming the camera at the text (a microwave keypad, cereal box nutrition information, etc.), users can physically point a finger at what they want to read, and the iPhone (or iPad) will audibly read the text in the indicated area.

All of Apple's new AI-enhanced cognitive accessibility features are expected to roll out to iPhones and iPads "later this year," though there's no specific launch window at this time. iOS and iPadOS version requirements also remain a mystery for now.