As 9to5Mac first reported back in April, Apple is working on a Tag device – similar to the Tile tracker – that can be attached to any object such as keys or a backpack and tracked in the new Find My app. We now have more to share about both Apple’s item tracker as well as the company’s so-called Apple Glasses project.
According to people familiar with its development and confirmed in screenshots shared by MacRumors, there will be a new “Items” tab in the Find My app which shows a user’s items, just like there are tabs for people and devices.
Users can put their tags into lost mode. When a lost-mode tag is detected traveling with another person who owns an Apple product, their device will alert them about the item, optionally showing a message from the owner and directing them to get in touch. This relies on the new offline tracking feature Apple introduced at this year’s WWDC, which uses nearby Apple devices to privately relay location data about a user’s devices to the cloud.
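The lost-mode flow described above can be sketched roughly as follows. This is a hypothetical model for illustration only: the type names, fields, and default alert text are invented, not Apple’s actual API.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class TagBeacon:
    """Invented model of a tag's Bluetooth advertisement."""
    tag_id: str
    lost_mode: bool
    owner_message: Optional[str] = None

@dataclass
class LocationReport:
    """Invented model of the report a nearby device relays to the cloud."""
    tag_id: str
    latitude: float
    longitude: float

def handle_detected_beacon(beacon: TagBeacon, lat: float, lon: float):
    """A nearby device that detects a lost-mode tag relays a location
    report and alerts its own user about the item it's carrying."""
    if not beacon.lost_mode:
        return None, None  # ignore tags that aren't marked as lost
    report = LocationReport(beacon.tag_id, lat, lon)
    alert = beacon.owner_message or "A lost item is traveling with you."
    return report, alert
```

In this sketch the owner’s optional message replaces a generic alert, matching the behavior described above; everything else (encryption, rate limiting, the actual relay protocol) is omitted.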
These tags will pack a lot of technology into a very small package: a white, circular tag with an Apple logo. They include Bluetooth LE, NFC, and a speaker to help users locate them, and they run a trimmed-down version of iOS. Pairing with a user’s iCloud account will be done by proximity, just like AirPods. When a lost tag is found, the finder can tap it with an NFC-capable phone to get information about the item and help contact the owner.
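One plausible way the NFC tap could work is for the tag to present a URL carrying its identifier, which the phone then resolves to a “found item” page. The hostname and query parameter below are invented for illustration; Apple has not documented any such format.

```python
from urllib.parse import urlparse, parse_qs
from typing import Optional

def tag_id_from_scanned_url(url: str) -> Optional[str]:
    """Extract a tag identifier from a hypothetical 'found item' URL
    read off a tapped tag's NFC record."""
    parts = urlparse(url)
    if parts.hostname != "found.example.com":
        return None  # not a recognized tag URL
    values = parse_qs(parts.query).get("tagID")
    return values[0] if values else None
```

A real implementation would presumably authenticate the tag rather than trust a plain URL; this sketch only shows the identify-then-contact-owner handoff described above.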
UPDATE 9/9: The Apple Tag device will also integrate the UWB technology mentioned in the latest Kuo report, which is internally called “Rose”.
Apple is also working on an AR mode that will let users search for one of their tags in 3D, with an on-screen balloon directing the user to the tag’s location.
Engineers have also been developing support for stereo AR in iOS – as 9to5Mac reported back in April – but the project may not be the “Apple Glasses” everyone has been talking about. The work adds support for a face-mounted AR experience, comparable to Google’s Daydream, and has been in internal testing with two Apple devices (codenamed Luck and Franc) and a third-party device, HoloKit.
Stereo AR apps on iPhone work similarly to CarPlay, with support for stereo AR declared in the app’s manifest. These apps can run in either “held mode”, which is essentially normal AR mode, or “worn mode”, which is used with one of these external devices. A new system shell – called StarBoard – hosts extensions that support the new AR mode, similar to how WatchKit apps worked on the original Apple Watch.
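The held/worn distinction can be sketched as a simple mode decision. StarBoard’s real interfaces are private, so the names below are invented; the sketch only encodes the rule implied above, that an app declaring stereo AR support is hosted in worn mode when a compatible headset is attached.

```python
from enum import Enum

class ARMode(Enum):
    HELD = "held"  # normal single-viewport AR on the phone screen
    WORN = "worn"  # stereo rendering for a head-mounted accessory

def presentation_mode(headset_connected: bool, app_declares_stereo_ar: bool) -> ARMode:
    """Pick how the system shell should host an AR app session."""
    if headset_connected and app_declares_stereo_ar:
        return ARMode.WORN
    return ARMode.HELD  # fall back to ordinary handheld AR
```

An app that doesn’t declare stereo support in its manifest would stay in held mode even with a headset attached, mirroring how CarPlay only surfaces apps that opt in.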
On the new iPhones set to be announced next week, the rear super-wide-angle camera is used to enhance the quality of AR tracking. There’s also stereo AR support implemented for some system components, such as Maps, the Find My app, and AR QuickLook, which is available for web content.
With recent reports that the AR headset project was canceled, it’s unclear when (or whether) Apple will announce anything about this, but the company has definitely been working on stereo AR support for gaming and other applications, and it’s in quite an advanced state as of iOS 13.
An icon found in one of the software packages used to test Apple’s stereo AR experience. It likely depicts a HoloKit device and its filename mentions “mock”.