Soon, you'll be able to control the iPhone with your eyes
Global Accessibility Awareness Day (GAAD) falls on the third Thursday of May each year. It is also the day Apple announces a slew of inclusive features for users with physical disabilities and impairments. This year is no exception, with the tech giant introducing capabilities that will arrive on the iPhone and iPad with iOS 18 later this year. For many people, such features can be life-changing, giving them access to technology they could not use before.
Accessibility features are not only for those who need them; they can make a device more usable for everyone. “We believe deeply in the transformative power of innovation to enrich lives,” said Tim Cook, Apple’s CEO. “That’s why for nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software. We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users.”
“Each year, we break new ground when it comes to accessibility,” said Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives. “These new features will make an impact in the lives of a wide range of users, providing new ways to communicate, control their devices, and move through the world.”
Eye control
Navigating a device may be second nature to most users, but for those with a disability, seemingly minor tasks can be a challenge. Eye Tracking, powered by artificial intelligence, will give users a built-in option for navigating iPad and iPhone with just their eyes. It uses the front-facing camera to set up and calibrate in seconds, and relies on on-device machine learning. All the data used to set up and control the feature is kept securely on device and isn’t shared with Apple.
Eye Tracking works across iPadOS and iOS apps and doesn’t require additional hardware or accessories. With it, users can navigate through the elements of an app and activate each one, accessing functions such as physical buttons, swipes, and other gestures with their eyes alone.
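Apple hasn't said how Eye Tracking is implemented under the hood, but conceptually similar gaze estimation has been available to developers for some time through ARKit face tracking, where ARFaceAnchor reports an estimated gaze point from the front-facing TrueDepth camera. The sketch below is an illustration of that existing public API, not the new Eye Tracking feature itself.

import ARKit

// A minimal sketch of gaze estimation with ARKit face tracking.
// This is NOT the iOS 18 Eye Tracking feature; it only shows that
// on-device gaze data is already exposed on TrueDepth-capable devices.
final class GazeReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // lookAtPoint: the estimated point the user is looking at,
            // expressed in the face's own coordinate space.
            print("Estimated gaze point:", face.lookAtPoint)
        }
    }
}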
Vocal Shortcuts
Vocal Shortcuts lets iPhone and iPad users assign custom vocalisations that Siri can understand to launch shortcuts and complete complex tasks. Listen for Atypical Speech, another new feature, enhances speech recognition for a wider range of speech patterns. Designed for users with acquired or progressive conditions that affect speech, such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke, these features provide a new level of customisation and control, building on features introduced in iOS 17 for users who are nonspeaking or at risk of losing their ability to speak.
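Vocal Shortcuts trigger existing shortcuts, and the standard way for an app to expose an action to the Shortcuts ecosystem is Apple's App Intents framework. The sketch below shows a minimal intent; the intent name and behaviour are hypothetical, and the assumption is that any action surfaced this way could be bound to a custom utterance like other shortcuts.

import AppIntents

// A minimal App Intent. Actions exposed this way appear in the
// Shortcuts app, where a user could plausibly attach a custom
// utterance via Vocal Shortcuts. "LogWaterIntake" and its
// behaviour are hypothetical examples.
struct LogWaterIntake: AppIntent {
    static var title: LocalizedStringResource = "Log Water Intake"
    static var description = IntentDescription("Records glasses of water.")

    @Parameter(title: "Glasses", default: 1)
    var glasses: Int

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would persist the value here.
        .result(dialog: "Logged \(glasses) glass(es) of water.")
    }
}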
Feel the music
Music Haptics is a new way for users who are deaf or hard of hearing to experience music on iPhone. When it is turned on, the Taptic Engine plays taps, textures, and refined vibrations in time with the audio of a song. Music Haptics works across millions of songs in the Apple Music catalogue and will be available as an API for developers to make music more accessible in their apps.
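Apple hadn't published the developer API at announcement time, so any specific Music Haptics calls here would be guesswork. As a rough illustration of the underlying mechanism, haptic events played on the Taptic Engine in time with audio, the following sketch uses the existing Core Haptics framework; it is not the Music Haptics API.

import CoreHaptics

// Illustration only: a simple "beat" of four sharp taps using
// Core Haptics, the framework that already drives the Taptic Engine.
func playBeatHaptics() throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // Four transient taps, one every half second.
    let events = (0..<4).map { i in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
            ],
            relativeTime: Double(i) * 0.5
        )
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}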
Vehicle Motion Cues will be a new experience for iPhone and iPad that can help reduce motion sickness for passengers in moving vehicles. Research shows that motion sickness is commonly caused by a sensory conflict between what a person sees and what they feel, which can prevent some users from comfortably using an iPhone or iPad while riding in a moving vehicle. With Vehicle Motion Cues, animated dots on the edges of the screen represent changes in vehicle motion to help reduce sensory conflict without interfering with the main content. Using sensors built into iPhone and iPad, Vehicle Motion Cues recognises when a user is in a moving vehicle and responds accordingly. The feature can be set to show automatically on the iPhone or can be turned on and off in the Control Centre.
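Apple hasn't described how Vehicle Motion Cues decides that a user is in a moving vehicle. As a purely conceptual sketch, the following uses Core Motion, the public framework for reading those same built-in sensors, to flag acceleration beyond gravity; the 0.05 g threshold and the overall approach are assumptions, not Apple's implementation.

import CoreMotion

// Conceptual sketch: read user acceleration (gravity already removed)
// and flag samples above a small threshold. A real detector would
// filter over time before concluding the device is in a vehicle.
let motionManager = CMMotionManager()

func startMotionCheck() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 0.1  // 10 Hz

    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let a = motion?.userAcceleration else { return }
        let magnitude = (a.x * a.x + a.y * a.y + a.z * a.z).squareRoot()
        if magnitude > 0.05 {  // threshold in g; arbitrary assumption
            print("Motion above threshold: \(magnitude) g")
        }
    }
}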
Accessibility features are also coming to CarPlay. These include Voice Control, which lets users control apps and navigate CarPlay with their voice. With Sound Recognition, drivers or passengers who are deaf or hard of hearing can turn on alerts to be notified of car horns and sirens, although the eventual use of such alerts would depend on the laws of a particular country. For users who are colourblind, Colour Filters make the CarPlay interface visually easier to use, with additional visual accessibility features including Bold Text and Large Text.
Apple’s Vision Pro headset will get accessibility features in visionOS as well. These include systemwide Live Captions to help everyone, including users who are deaf or hard of hearing, follow along with spoken dialogue in live conversations and in audio from apps.