Let iPhone speak for you, in your own voice
As we go about our daily lives, taking the features on those constant companions, our smartphones, for granted, we rarely stop to think what it would be like to live without them. Yet for people living with physical or intellectual disabilities, a single feature on a smartphone can be a life changer.
On Global Accessibility Awareness Day, it is a good time to pause and take a look at some of the new inclusivity features that Apple has announced. For a person with disabilities, these can make possible something that was seemingly out of reach all this time.
Point and Speak
Hardly anyone thinks of the magnifier on a smartphone as an enabling feature. For those with low vision, however, the magnifier can make the difference between doing something independently and waiting for help. The humble magnifier, at one time, wasn't even available on a phone, and one had to look for an app to do the job; on Android, this often came with the risk of letting a malicious piece of software get on board. Today the magnifier has evolved into a tool that does far more than enlarge what's on the screen. Coming soon to Detection Mode in the Magnifier on iPhone and iPad is 'Point and Speak', which identifies the text users point toward and reads it out loud to help them interact with physical objects such as household appliances. While using a microwave, for example, 'Point and Speak' combines input from the camera, the LiDAR Scanner, and on-device machine learning to announce the text on each button as users move their finger across the keypad.
'Point and Speak' will be built into the Magnifier app on iPhone and iPad, will work well with VoiceOver, and can be used with other Magnifier features such as People Detection, Door Detection, and Image Descriptions to help users navigate their physical environment. For those who haven't encountered these features, they can be found in Magnifier through the accessibility settings.
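Apple hasn't published how 'Point and Speak' is built, but the on-device pieces the description names, text recognition plus speech output, map onto frameworks already open to developers. The following is a rough Swift sketch of that pipeline using Vision's VNRecognizeTextRequest and AVSpeechSynthesizer; the TextSpeaker class and its method are illustrative names, not Apple's implementation.

```swift
import Vision
import AVFoundation
import CoreGraphics

// Hypothetical sketch: recognise text in a camera frame and speak it aloud.
// This is NOT Apple's Point and Speak implementation, only an illustration
// of the on-device building blocks (text recognition + speech synthesis).
final class TextSpeaker {
    private let synthesizer = AVSpeechSynthesizer()

    func speakText(in image: CGImage) {
        // Vision performs the OCR entirely on device.
        let request = VNRecognizeTextRequest { [weak self] request, error in
            guard error == nil,
                  let observations = request.results as? [VNRecognizedTextObservation]
            else { return }
            // Keep the highest-confidence reading of each detected text region.
            let lines = observations.compactMap { $0.topCandidates(1).first?.string }
            guard !lines.isEmpty else { return }
            let utterance = AVSpeechUtterance(string: lines.joined(separator: ". "))
            self?.synthesizer.speak(utterance)
        }
        request.recognitionLevel = .accurate

        let handler = VNImageRequestHandler(cgImage: image, options: [:])
        try? handler.perform([request])
    }
}
```

The shipping feature goes further, using the LiDAR Scanner to tie recognition to where the user's finger is pointing, but the sketch shows why this kind of reading aloud can run entirely on the device.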
Live Speech
One of the startling marvels of machine learning in recent times is the ability of a system to take in a sample of voice data and mimic that voice's characteristics when fed text. Apple is putting this tech to good use in upcoming features called Live Speech and Personal Voice, which will work on recent iPhone, iPad, and Mac models. With Live Speech, users can type what they want to say and have it spoken out loud in a synthesised voice, such as Siri's, during phone and FaceTime calls as well as in-person conversations. Personal Voice gives them the option of training the software to sound like themselves, which takes a mere 15 minutes of reading out sentences presented on screen. Users can also save commonly used phrases to chime in quickly during conversations with family, friends, and colleagues.
Live Speech and Personal Voice have been designed to support people who are unable to speak or who have lost their speech over time.
This feature update will draw on advances in hardware and software, including on-device machine learning to keep user information private.
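The "type it, hear it" half of Live Speech resembles what AVFoundation's long-standing speech API already exposes to developers. Here is a minimal Swift sketch under that assumption; the speak helper and the saved phrase are illustrative, and Personal Voice would swap the generic system voice for one trained on the user's own recordings.

```swift
import AVFoundation

// Minimal sketch of the "type it, hear it" idea behind Live Speech.
// Not Apple's implementation; AVSpeechSynthesizer has long been the
// public route to system text-to-speech on iPhone, iPad, and Mac.
let synthesizer = AVSpeechSynthesizer()

func speak(_ phrase: String) {
    let utterance = AVSpeechUtterance(string: phrase)
    // A generic system voice; Personal Voice would replace this with
    // one trained on roughly 15 minutes of the user's own speech.
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    utterance.rate = AVSpeechUtteranceDefaultSpeechRate
    synthesizer.speak(utterance)
}

// A saved, commonly used phrase can be spoken with a single tap.
speak("I'll be there in five minutes.")
```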
Apple has a centralised team working out of Cupertino to develop accessibility solutions. "Accessibility is part of everything we do at Apple," said Sarah Herrlinger, Apple's senior director of Global Accessibility Policy and Initiatives. "These groundbreaking features were designed with feedback from members of disability communities every step of the way, to support a diverse set of users and help people connect in new ways."
Another feature, Assistive Access, distills apps like Photos, Music, Messages, FaceTime, phone calls, and other on-device experiences to their essential features only, lightening the cognitive load for users with intellectual and developmental disabilities. The feature offers a distinct interface with high contrast buttons and large text labels, as well as tools to help trusted supporters tailor the experience for the individual they support. For example, for users who prefer communicating visually, 'Messages' includes an emoji-only keyboard and the option to record a video message to share with loved ones.
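Assistive Access itself is a system-level mode configured by a trusted supporter, not something apps build for themselves, but the design it describes, a handful of choices with high contrast and large labels, can be made concrete with a short SwiftUI sketch. Everything here, including the EssentialsView name and the list of actions, is illustrative only.

```swift
import SwiftUI

// Illustrative only: a pared-down, high-contrast screen in the spirit of
// Assistive Access. The real feature is a system mode, not an app API.
struct EssentialsView: View {
    let actions = ["Calls", "Messages", "Photos", "Music"]

    var body: some View {
        VStack(spacing: 16) {
            ForEach(actions, id: \.self) { name in
                Button(name) {
                    // A trusted supporter decides which experiences appear here.
                }
                .font(.system(size: 34, weight: .bold))   // large text labels
                .frame(maxWidth: .infinity, minHeight: 96) // big tap targets
                .background(Color.black)                   // high contrast
                .foregroundColor(.white)
                .cornerRadius(20)
            }
        }
        .padding()
    }
}
```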
"The intellectual and developmental disability community is bursting with creativity, but technology often poses physical, visual, or knowledge barriers for these individuals," says Katy Schmid, senior director of National Program Initiatives at The Arc of the United States. "To have a feature that provides a cognitively accessible experience on iPhone or iPad — that means more open doors to education, employment, safety, and autonomy. It means broadening worlds and expanding potential."
"At the end of the day, the most important thing is being able to communicate with friends and family," said Philip Green, board member and ALS advocate at the Team Gleason nonprofit, who has experienced significant changes to his voice since receiving his ALS diagnosis in 2018. "If you can tell them you love them, in a voice that sounds like you, it makes all the difference in the world — and being able to create your synthetic voice on your iPhone in just 15 minutes is extraordinary."
There is, of course, the tricky question of whether users will discover and actually use the accessibility features; it is often said that users ignore some 90% of the software features on a device. "One of the things we’ve done over the years is to up-level accessibility features within the devices themselves and not only make sure that accessibility is at the top level of settings but also bring it into the setup process," said Herrlinger. Apple Stores also hold workshops and events to help users learn to use these special features.