iOS 17 has solved one of the biggest accessibility hurdles for iPhone users with limited movement: the dreaded ‘You need to unlock your iPhone first’ message.
Siri is of course a massive accessibility win for people like 9to5Mac friend and quadriplegic Colin Hughes, allowing voice control of a wide range of tasks. But until now, there’s been a major hurdle …
Hughes talked last year about a big accessibility issue: unlocking the iPhone.
One frustrating user experience is the “You need to unlock your iPhone first” response to commands like “read my messages,” “what’s next on my calendar,” etc. Until now, if my iPhone was locked, and in my wheelchair side pocket for example, I couldn’t take it out and unlock it to access my messages – and Siri would always unhelpfully say “You need to unlock your iPhone first.”
Voice Control does let users speak their passcode, but this obviously isn’t secure when other people are around.
One idea he had then was for the iPhone to recognize his voice, in the same way HomePod does.
Given that Siri on HomePod can now recognize individual voices, being able to unlock your iPhone by simply saying, “Hey Siri, unlock my phone” and having it check for a voice match before doing so would be perfect.
But in iOS 17, Apple has solved the problem another way.
There’s a new Siri authentication system for AirPods: if your device is unlocked while you’re wearing your AirPods, Siri requests remain authenticated for as long as the AirPods stay in range (or until you change the system’s audio output device).
So basically, when my carer puts my AirPods in my ears, as long as my iPhone is unlocked when she does so, even if it locks afterwards, I can still access my messages, calendar events, and more.
Hughes says this is a great solution.
I imagine, Apple being Apple, they would’ve worried about privacy and security. I’ve been using it, and this really does Just Work.
Chained Siri requests – without the need to do the “Hey Siri” thing repeatedly – also make a big difference to those who make extensive use of voice commands.
Siri knows when you are speaking to it and when you might be talking to someone else. This makes my life easier as well.
You can now do some more advanced things, like speaking over Siri at any time to issue a new request. This carries over to Announce Notifications too, which I rely on a lot, so now you can chain requests like “repeat” and then “reply.”
Because Siri now understands when you’re talking to it versus someone else, you can issue any request Siri understands within Announce Notifications in iOS 17, not just the limited set of commands available before.
Another very useful change is that Siri can read out the contents of a webpage, and it continues doing so even if the iPhone locks after the request is made.
Are these things you’d find helpful? Or have you noticed any other accessibility improvements in iOS 17? Please let us know in the comments.