The iOS 18 accessibility features previewed by Apple yesterday have been hailed as “life-changing” by a disability campaigner.
Accessibility advocate Colin Hughes praised Apple for responding to a key request he made via 9to5Mac back in March …
Call for personalised speech recognition
Hughes told us earlier this year that many disabilities affect clarity of speech, meaning it can be hard for those with conditions like cerebral palsy, amyotrophic lateral sclerosis (ALS), and muscular dystrophy to use voice commands via Siri and Voice Control. The same is true for those recovering from a stroke.
This creates the unfortunate situation in which those who have the greatest need for voice commands face the greatest difficulty in having them understood by Apple devices.
He proposed that Apple offer a personalised speech recognition feature.
One of the key ways that AI can enhance Voice Control is through personalised speech recognition. This feature can help people who have non-standard speech, which means they may face challenges in making themselves clear. AI can learn to recognise and transcribe their speech more accurately and naturally, regardless of the factors that affect their speech, such as weak voices, speech impediments, breathing difficulties, or muscle disorders. Google estimates that 250 million people have non-standard speech.
Apple is now offering exactly this
Yesterday’s iOS 18 accessibility announcements by Apple included this exact feature.
With Vocal Shortcuts, iPhone and iPad users can assign custom utterances that Siri can understand to launch shortcuts and complete complex tasks.
Listen for Atypical Speech, another new feature, gives users an option for enhancing speech recognition for a wider range of speech. Listen for Atypical Speech uses on-device machine learning to recognize user speech patterns.
Designed for users with acquired or progressive conditions that affect speech, such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke, these features provide a new level of customization and control, building on features introduced in iOS 17 for users who are nonspeaking or at risk of losing their ability to speak.
Hughes said that the changes were likely to prove life-changing for people with these types of conditions.
I’m one of millions of people who most need the benefits of voice control, but my atypical speech can sometimes make it hard to make myself understood by Siri and Voice Control.
The set of voice accessibility enhancements announced yesterday are the biggest breakthroughs I have seen to date, and I’m really grateful to Apple for listening.
These changes will literally change my life, and the lives of others who have faced the same challenges.
Enhanced Voice Control also key
Hughes also warmly welcomed Enhanced Voice Control, something he had long advocated for. It can be especially challenging to get Apple devices to understand unusual terms – such as the names of equipment that may be needed by people with disabilities – so support for complex words and custom vocabulary is huge.
This enhancement is a game-changer for those whose speech patterns include unique or complex terminology.
I’m cracking open the champagne, thrilled that Apple has listened and taken significant steps to make voice recognition technology accessible to all.
- See how Apple Accessibility transforms lives, working hand-in-hand with HomeKit
- Apple tech accepted as social care expenses for disabled man after 9to5Mac video
Image: 9to5Mac