Apple yesterday shared details of how face recognition works on the iPhone X, but one thing that hasn’t gotten the attention it deserves is how Apple future-proofed its iOS authentication from the start.
This future-proofing is why all Touch ID-enabled apps will automatically authenticate with Face ID without requiring developers to update their apps.
When a developer wants an app user to authenticate, they don’t get involved in the nitty-gritty of how that authentication is performed. They just use code that asks iOS to do it for them – what Apple calls the Local Authentication framework.
That code is agnostic about the method used for authentication. The app developer neither knows nor cares whether Touch ID or Face ID was used for biometric verification, which is why any app that supports Touch ID will automatically support Face ID on the iPhone X.
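To make that concrete, here's a minimal sketch of what the Local Authentication call looks like in Swift (the reason string is my own placeholder): the app asks iOS for biometric authentication and gets back only a success-or-failure result, with nothing in the reply indicating whether Touch ID or Face ID did the work.

```swift
import LocalAuthentication

let context = LAContext()
var error: NSError?

// Ask iOS whether some form of biometric authentication is available at all.
if context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) {
    // The reason string is a placeholder; each app supplies its own.
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your account") { success, evaluationError in
        // The reply is just success or failure. Nothing here says whether
        // Touch ID or Face ID performed the match, which is why existing
        // Touch ID apps pick up Face ID support without any changes.
        print(success ? "Authenticated" : "Failed: \(String(describing: evaluationError))")
    }
}
```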
Apple does give developers the freedom to disallow biometric authentication altogether and require a passcode at all times. An enterprise app might do that, for example, if a company has a policy against biometric authentication. But if an app does allow biometrics, it doesn't matter which method is used.
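As a rough illustration of that choice (the policy flag and the passcode prompt below are hypothetical stand-ins, not anything Apple provides), an enterprise app could simply skip the biometric path entirely and fall back to its own passcode screen:

```swift
import LocalAuthentication

// Hypothetical company policy flag, not part of any Apple API.
let companyPolicyAllowsBiometrics = false

// Hypothetical stand-in for an app-managed passcode screen.
func presentAppPasscodePrompt() {
    print("Show the app's own passcode UI")
}

let context = LAContext()
if companyPolicyAllowsBiometrics,
   context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: nil) {
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Sign in") { success, _ in
        print(success ? "Authenticated" : "Authentication failed")
    }
} else {
    // Biometrics disallowed by policy (or unavailable), so require a passcode instead.
    presentAppPasscodePrompt()
}
```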
As an aside, the way Apple Pay works with Face ID sounds, on the, ah, face of it, rather clunky.
To authorize an in-store payment with Face ID, you must first confirm intent to pay by double-clicking the side button. You then authenticate using Face ID before placing your iPhone X near the contactless payment reader.
That sounds on paper like three separate steps: double-click the button to initiate, then authenticate, then hold the phone out to the payment terminal.
In practice, though, I suspect it will be a far smoother experience, thanks to the speed of Face ID. You double-click, Face ID authenticates before you even know it, and you then hold out the phone. The real-life experience will, then, be exactly the same as for the Apple Watch today: double-click, hold device to terminal.
One other interesting thing about the 3D camera and apps: developers do get access to face-scanning data. They don't get direct access to Face ID (only the standard yes/no response), and they can't write their own 3D face-recognition code because they can't access the high-fidelity depth data.
Apple does give developers access to 3D positional information of facial features, which enables the creation of Animoji-style applications. It will be interesting to see what they do with it.
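A short sketch of what that could look like, assuming ARKit's face tracking is the channel involved (the piece doesn't name the framework, so treat that as an assumption): the app receives per-frame facial geometry and expression values it can map onto a character, without ever seeing Face ID's own data.

```swift
import UIKit
import ARKit

// Sketch of an Animoji-style consumer of facial data, assuming ARKit face tracking.
class FaceTrackingViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self
        // Face tracking needs the TrueDepth camera, i.e. iPhone X-class hardware.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // Blend shapes describe how far features such as the jaw or eyebrows
            // have moved: the kind of data a character-animation app would use.
            if let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue {
                print("Jaw open: \(jawOpen)")
            }
        }
    }
}
```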
Benjamin Mayo contributed to this piece