Yesterday saw the announcement of a flurry of Apple accessibility improvements, including a new “Assistive Access” interface option, Live Speech, and Personal Voice. These have been welcomed by people who rely on accessibility features, though some would like to see greater ambition for one of them.
Personal Voice was the feature to get the most attention: after just 15 minutes of training, users can have their iPhone play speech in their own voice …
Steven Scott, who is blind, is the creator and host of Double Tap, a daily radio show about how blind people can use technology. The show airs every day on AMI-audio across Canada at noon Eastern, and is also available as a podcast.
He said that it can sometimes be hard for sighted people to understand just how revolutionary simple-sounding tech can be.
Having the ability to identify all the different buttons on my microwave seems pretty unremarkable to most people, but often when you’re blind you are shown what you need to know and nothing more, so you might not even know that your microwave has a defrost function. With something like the new Point and Speak feature, I can find out for myself what functions my microwave and other kitchen appliances have.
But he says that some accessibility features can also benefit everyone.
The beauty of accessibility features is that they often help many more people than intended. Assistive Access will be so helpful to many who just want a more simplified experience of using their device.
This year I’m looking forward to the much-rumoured Reality Pro and beyond. Apple is known for its commitment to accessibility across its product lines, so I will be fascinated to see how they implement it in a whole new category.
Colin Hughes, a former BBC producer and advocate for accessible technology, said that as a quadriplegic he relies on hands-free control of technology.
As someone living with a severe physical disability, who relies on voice to get things done, the Apple accessibility features that make the biggest difference to my life are Voice Control and all the hands-free features Siri offers.
I am pleased to see Apple enhance Voice Control by adding phonetic suggestions for text editing so users can choose the right word out of several that might sound alike, like “do,” “due,” and “dew.”
Additionally, Voice Control Guide, a feature similar to Voice Access on Windows that lets users learn tips and tricks for using voice commands, will be helpful, especially for newcomers to Voice Control.
He has previously shared with us how Apple tech helps him, with a day-in-the-life video showing how HomeKit in particular provides a huge amount of independence.
Hughes also takes comfort from the announcement of Personal Voice, even though he doesn’t yet need it himself.
For people with progressive disabilities like mine, it’s encouraging that new features like Live Speech and Personal Voice will be there one day when we might need them. It’s reassuring to see Apple acting so inclusively.
He does, though, think Apple needs to set more ambitious goals for Voice Control.
The new text editing feature sounds great, but it does feel a little like locking the stable door after the horse has bolted. The company should be doubling down on accuracy by using AI so that less editing is needed.
Apple should also be using AI and voice-isolating microphone technology to block out background noise when dictating with a Mac or an iPhone in noisier environments. Voice Control dictation is nowhere near the 98 to 99 per cent accuracy users achieve with an app like Dragon Professional, which unfortunately is now only available on Windows computers.
Voice Control still struggles with proper nouns and foreign names. Even if you add a proper noun to Voice Control’s vocabulary, the app ignores the capitalisation of the name.
Again, more reliable dictation features benefit everyone, not just those with disabilities.
I would love to hear more from the company on how it is trying to improve dictation accuracy for everyone.
Hughes also has a suggested next step for Voice Control.
I would like to see Apple do more for the estimated 250 million people who have non-standard speech and experience difficulty making their words understood. At present, you can’t train the app to recognise words the way you pronounce them, so I’d love to see an element of personalised speech recognition in Voice Control in the future.
With greater accuracy and personalised speech recognition, Apple will be able to help even more people make themselves heard.
Apple has often said that it wants to make its products useful to as many people as possible, and that it doesn’t seek a financial return on its investment in accessibility tech.
If you have a disability, please share your own thoughts on yesterday’s announcement of Apple accessibility improvements – and if you don’t, do you see any of them as having broader appeal?