Apple Glasses Accessibility features could be a huge benefit, suggests disability campaigner

Apple Glasses Accessibility features could see the long-rumored devices make a huge difference to the lives of those with disabilities.

The point was made by Colin Hughes, the disability campaigner who recently shared the ways in which Apple tech allows him to live a more independent life despite almost no use of his limbs …

I recently got to visit his London apartment to see for myself his combined Accessibility and HomeKit setup. He said he really values the independence given to him by Apple tech – and by the fact that Apple listens.

It’s been a stellar year for accessibility and Apple technology, particularly for those of us who need voice enhancements. Apple has clearly been listening to what disabled users have been saying, and that’s great to see.

You can watch the day-in-the-life video here.

Apple Glasses Accessibility potential

Hughes believes that augmented reality Apple Glasses – thought to be the Cupertino company’s longer-term goal, following mixed-reality headsets – could be transformational.

One big question he has is whether Apple will include a camera.

Smart AR glasses that are easy and comfortable to wear all day, and have smart assistant and voice control baked in, have great potential for extending access to technology for disabled people.

I have tried Amazon’s somewhat limited Echo Frames, and they have left me wanting a whole lot more from this type of wearable technology.

For a start, AR glasses could make it possible for me to take photographs and video for the first time in my life – if Apple is brave enough to see beyond what I recognize are legitimate privacy concerns and include cameras. I have reached the age of 57, and I have never been able to snap a photo or video of the world around me: a special occasion, a birthday, a holiday or trip out, Christmas, a visit from a friend. Smart AR glasses with cameras and voice control built in could make this possible for the first time, for me and for other severely disabled and paralyzed people.

He also pointed to some of the accessibility applications we’ve already seen in other devices.

For blind and visually impaired accessibility, there is Envision AI, which uses Google Glass to do object, text, and face recognition, and so on. The ARX Vision headset is also a player in this area.

Clearly a more normal-looking pair of glasses would be welcome, enabling these kinds of features to be offered without attracting so much attention.
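For what it’s worth, the recognition side of this is already within reach of Apple’s existing frameworks. Here’s a minimal sketch of on-device text recognition using the Vision framework – the kind of capability camera-equipped glasses could read aloud to a wearer. The camera frame source and what happens with the results are assumed, not specified by anything Apple has announced:

```swift
import Vision
import CoreGraphics

/// Recognize printed text in a camera frame and hand the strings
/// to a completion handler (which could feed a screen reader or Siri).
func recognizeText(in image: CGImage, completion: @escaping ([String]) -> Void) {
    let request = VNRecognizeTextRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNRecognizedTextObservation] else {
            completion([])
            return
        }
        // Take the top candidate string from each detected text region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate  // favor accuracy over speed

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

The hard part, in other words, isn’t the recognition – it’s the comfortable, all-day, socially acceptable hardware to put it in.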

Future Accessibility improvements

Looking at the nearer future, Hughes talked about the Accessibility improvements he’d like to see from future Apple updates.

Smarter replies to messages

Although a Siri instruction lets you reply to messages, the feature could use more smarts.

My friend Jane sent me a message on WhatsApp recently. I listened to it as it came in via Announce Notifications, but forgot to reply for about two minutes. At that point I said “Hey Siri, reply to Jane” – and even though I used the word “reply,” it defaulted, without telling me, to sending my dictated message by iMessage/SMS rather than by WhatsApp, where Jane’s message had originated two minutes earlier. This confused Jane.

Hughes notes that you can instruct Siri to “Reply with WhatsApp,” but says that it should automatically use the same messaging platform – and ideally note which chat apps you use with which contacts.
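The bookkeeping needed for this is trivial. Here’s an illustrative sketch of remembering which app each contact last used – the `Platform` enum and `ReplyRouter` type are invented names for the sake of the example, not any Apple API:

```swift
import Foundation

/// Messaging platforms a contact might use (illustrative, not an Apple API).
enum Platform: String {
    case iMessage, whatsApp, signal
}

/// Remembers the platform each contact last messaged you on,
/// so "reply to Jane" can default to the right app.
struct ReplyRouter {
    private var lastPlatform: [String: Platform] = [:]

    mutating func recordIncoming(from contact: String, via platform: Platform) {
        lastPlatform[contact] = platform
    }

    /// Fall back to iMessage only when there is no recent history.
    func platformForReply(to contact: String) -> Platform {
        lastPlatform[contact] ?? .iMessage
    }
}

var router = ReplyRouter()
router.recordIncoming(from: "Jane", via: .whatsApp)
print(router.platformForReply(to: "Jane"))  // whatsApp, not iMessage
```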

A readback toggle when sending messages

iOS offers two toggles that should ensure Siri always reads back messages – but they don’t work reliably in iOS 16. It would be better, he says, to have a single toggle to instruct Siri to never send a message without a readback.

I have Automatically Send Messages set to OFF and Reply Without Confirmation set to OFF, and yet when dictating and sending new messages, or replying to messages, what you dictate is rarely read back by Siri before the message is sent. In effect, your user preference settings are being ignored.

I have recently discovered there is a buried way of getting your dictated messages read out before sending: all you need to say is “read it.” But I preferred the iOS 15 behavior, where you were given a choice of changing the message or sending it.

Access older messages

The iPhone lets Siri read out incoming messages immediately after they arrive – but not once some time has passed. Instead, Siri asks you to unlock the phone first, which isn’t possible for someone entirely dependent on voice control. This, and other issues, could be solved with a new unlock method.

iPhone unlock via voice authentication

Voice Control does let users speak their passcode, but this obviously isn’t secure when around other people. Given that Siri on HomePod can now recognize individual voices, Hughes says that being able to unlock your iPhone by simply saying, “Hey Siri, unlock my phone” and having it check for a voice match before doing so would be perfect.
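Apple hasn’t published any speaker-recognition API, so the implementation details here are pure guesswork – but the flow Hughes describes is simple to express. In this sketch, `SpeakerVerifier` and the unlock closure are hypothetical placeholders for capabilities only Apple could provide:

```swift
import Foundation

/// Hypothetical speaker-verification service -- Apple exposes no such API today.
protocol SpeakerVerifier {
    /// Compare a short utterance against the owner's enrolled voiceprint.
    func matchesEnrolledVoice(_ audio: Data) -> Bool
}

/// The flow Hughes proposes: "Hey Siri, unlock my phone" only succeeds
/// when the speaker's voice matches the enrolled owner.
func handleUnlockRequest(utterance: Data,
                         verifier: SpeakerVerifier,
                         requestDeviceUnlock: () -> Bool) -> Bool {
    guard verifier.matchesEnrolledVoice(utterance) else {
        return false  // unknown voice: stay locked
    }
    return requestDeviceUnlock()  // hypothetical system call
}
```

Whether voice alone could ever be as spoof-resistant as Face ID is a fair question, but for users with no alternative, even a lower-security opt-in would be transformative.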

Let keyboard dictation send a message

The new keyboard dictation enhancement in iOS 16 means that, for the first time, iPhone owners can seamlessly switch between typing and dictation. But in this mode, there is no way to instruct Siri to actually send the message!

Control auto-answer by voice

One of the ironies about auto-answer – a feature ideal for people who can’t touch an on-screen button to answer a call – is that until iOS 16 and watchOS 9, there was no way to enable the feature itself by voice. Hughes successfully campaigned for Apple to introduce voice control of auto-answer, but he would now like to see the company go further and intelligently automate it.

He’d love to be able to create a Shortcut to automatically enable auto-answer when his carer puts in his AirPods, or puts his Apple Watch on for him.
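Half the plumbing for this already exists: the AppIntents framework introduced in iOS 16 lets apps expose actions that Shortcuts automations can trigger. What’s missing is a system toggle for auto-answer. In the sketch below, `CallSettings.setAutoAnswer` stands in for that hypothetical toggle, which Apple does not currently expose:

```swift
import AppIntents

/// Hypothetical wrapper around the system auto-answer toggle --
/// Apple does not currently expose this setting to third parties.
enum CallSettings {
    static func setAutoAnswer(_ enabled: Bool) { /* system-private */ }
}

/// A Shortcuts action that could run automatically when AirPods connect
/// or the Apple Watch is put on.
struct EnableAutoAnswer: AppIntent {
    static var title: LocalizedStringResource = "Enable Auto-Answer"

    func perform() async throws -> some IntentResult {
        CallSettings.setAutoAnswer(true)
        return .result()
    }
}
```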

Improvements to Voice Dictation on the Mac

Hughes says that while Siri dictation is incredibly helpful, there are still some glitches in Voice Control dictation that can frustrate those who rely on it.

To offer up real-world illustrations:

I have a friend called Wojtek, whose name isn’t in Voice Control’s vocabulary. I have added it to custom vocabulary with a capital “W”: Wojtek. However, when I dictate his name, Voice Control transcribes it with a small “w”: wojtek.

I frequently communicate with a company called SpeechWare. I have added the company name to custom vocabulary with a capital “S” and “W,” but when I dictate it, it is transcribed as “speechware.”

There are some strange glitches. Both Voice Control dictation and Siri dictation always transcribe sun (the fiery orb) as Sun with a capital S (like the UK tabloid newspaper), even when you are dictating a sentence about the weather! The verb “will” is often transcribed as “Will” – as in the man’s name – even when context makes it obvious you mean the verb, for example in “It will be hot later.” Hughes says these sorts of errors are annoyingly common.
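One could imagine patching glitches like these by re-applying the user’s custom vocabulary after transcription. A minimal sketch of that idea, assuming a naive word-by-word pass (real dictation would need punctuation and context handling on top):

```swift
import Foundation

/// Re-apply user-preferred casing to a raw transcript.
/// Matching is case-insensitive; replacement uses the stored form.
func applyCustomCasing(to transcript: String,
                       vocabulary: [String]) -> String {
    // Index preferred spellings by their lowercase form.
    let preferred = Dictionary(uniqueKeysWithValues:
        vocabulary.map { ($0.lowercased(), $0) })

    return transcript
        .split(separator: " ")
        .map { word -> String in
            preferred[word.lowercased()] ?? String(word)
        }
        .joined(separator: " ")
}

let vocab = ["Wojtek", "SpeechWare"]
print(applyCustomCasing(to: "email wojtek about speechware", vocabulary: vocab))
// "email Wojtek about SpeechWare"
```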

Spelling mode should add entries to the dictionary

The new Voice Control “spelling mode” in macOS Ventura allows you to spell out words that dictation doesn’t understand – but it doesn’t remember your preferred spelling of a word, so the same mistakes keep happening.

Microsoft-style Voice Focus mode for dictation

While I personally find that Mac dictation works well with the built-in mics, I have the luxury of dictating only in quiet environments. Hughes relies on dictation for everything he writes, and so needs to dictate in noisier environments too, where he says the built-in microphones work less well – which is why he uses an external SpeechWare mic.

In the Surface Pro 9, Microsoft appears to have come up with a solution to this in the form of Voice Focus mode, which aims to isolate a voice and discard background sounds. This is something Hughes would love to see Apple do.
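Microsoft hasn’t detailed how Voice Focus works – it is almost certainly a trained speech model – but the underlying idea of discarding non-speech audio can be illustrated with a crude energy gate. This is only a toy to show the concept; the threshold and frame size are arbitrary assumptions:

```swift
import Accelerate

/// Crude illustration of voice isolation: zero out frames whose
/// RMS energy falls below a threshold. A real system (like Voice Focus)
/// would use a trained speech model, not a simple gate.
func gateQuietFrames(_ samples: [Float],
                     frameLength: Int = 512,
                     threshold: Float = 0.01) -> [Float] {
    var output = samples
    var start = 0
    while start < samples.count {
        let end = min(start + frameLength, samples.count)
        let frame = Array(samples[start..<end])
        var rms: Float = 0
        vDSP_rmsqv(frame, 1, &rms, vDSP_Length(frame.count))
        if rms < threshold {
            for i in start..<end { output[i] = 0 }  // treat as background noise
        }
        start = end
    }
    return output
}
```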

Always-on Hey Siri on the Apple Watch

You can set the iPhone to always be listening for Hey Siri, but not the Apple Watch. The setting exists there, but it requires you to twist your wrist to wake the watch first – something many disabled people can’t do.

I’m sure that limited battery life has been the reason for Siri’s behavior on the Apple Watch, but with the release of the Apple Watch Ultra and its three-day battery life, I would happily accept a reduced 1.5-day battery life if Siri could truly be always listening, without any wrist movement required, just as it is on the iPhone.

What potential do you see for Apple Glasses Accessibility features? Are there other Accessibility improvements you’d like to see? Please let us know in the comments.

Apple Glasses concept image: Strahil Hadzhiev
