Apple’s Worldwide Developers Conference kicks off on June 3rd. Last week, we shared exclusive details about iOS 13 and macOS 10.15. Today, we are sharing details about new features and APIs for developers that should be announced at the event, according to sources familiar with the development of Apple’s new operating systems.
New Siri intents
There will be new Siri intents developers can adopt, including media playback, search, voice calling, event ticketing, message attachments, train trips, flights, and airport gate and seat information.
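To illustrate, here is a minimal sketch of what handling a media playback intent could look like in an Intents extension. The INPlayMediaIntent types used below are our assumption of how such an API might be shaped, not something Apple has confirmed.

```swift
import Intents

// Hypothetical handler for a media playback Siri intent; the INPlayMediaIntent
// names are an assumption, not a confirmed API.
class PlayMediaIntentHandler: NSObject, INPlayMediaIntentHandling {

    func handle(intent: INPlayMediaIntent,
                completion: @escaping (INPlayMediaIntentResponse) -> Void) {
        // Hand playback off to the main app so audio can start right away.
        completion(INPlayMediaIntentResponse(code: .handleInApp, userActivity: nil))
    }

    func resolveMediaItems(for intent: INPlayMediaIntent,
                           with completion: @escaping ([INPlayMediaMediaItemResolutionResult]) -> Void) {
        // Match the spoken search term against the app's own catalog.
        let item = INMediaItem(identifier: "track-1",
                               title: intent.mediaSearch?.mediaName ?? "Unknown",
                               type: .song,
                               artwork: nil)
        completion(INPlayMediaMediaItemResolutionResult.successes(with: [item]))
    }
}
```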
Marzipan improvements
Developers porting their iOS apps to the Mac will have access to new APIs that allow their UIKit apps to integrate with Mac-specific features such as the Touch Bar and menu bar (including keyboard shortcuts). UIKit apps on the Mac will also be able to open multiple windows.
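As a rough sketch, assuming a scene-based window API alongside the existing UIKeyCommand mechanism, adopting these features from a UIKit app could look something like this (the activity type and selector are illustrative):

```swift
import UIKit

// A sketch of UIKit features that would map onto Mac conventions, assuming a
// scene-based multi-window API. The activity type and selector are illustrative.
class NotesViewController: UIViewController {

    // Keyboard shortcuts defined with UIKeyCommand could back menu bar items on the Mac.
    override var keyCommands: [UIKeyCommand]? {
        return [UIKeyCommand(input: "n",
                             modifierFlags: .command,
                             action: #selector(newNote),
                             discoverabilityTitle: "New Note")]
    }

    @objc func newNote() {
        // Ask the system to open a second window (scene) for the new note.
        let activity = NSUserActivity(activityType: "com.example.newNote")
        UIApplication.shared.requestSceneSessionActivation(nil,
                                                           userActivity: activity,
                                                           options: nil,
                                                           errorHandler: nil)
    }
}
```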
Split View apps ported from iOS will be resizable by dragging the divider, and double-clicking the divider will reset its position, just like in native Mac apps.
Enabling Mac support for an existing iOS app is as easy as checking a checkbox in the target settings in Xcode, much like adding iPad support to an iPhone-only app.
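Once that checkbox is ticked, developers would presumably still need a way to special-case behavior on the Mac. A conditional compilation check along these lines seems plausible; the exact condition name ("macCatalyst" below) is our guess:

```swift
import UIKit

// Illustration only: gate Mac-specific tweaks behind a compile-time check.
// The "macCatalyst" condition name is an assumption on our part.
func configureToolbar(for navigationController: UINavigationController?) {
    #if targetEnvironment(macCatalyst)
    // On the Mac, rely on the menu bar instead of an on-screen toolbar.
    navigationController?.setToolbarHidden(true, animated: false)
    #else
    navigationController?.setToolbarHidden(false, animated: false)
    #endif
}
```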
Augmented reality
AR on Apple’s platforms will gain significant improvements this year, including a brand new Swift-only framework for AR and a companion app that lets developers create AR experiences visually. ARKit will also gain the ability to detect human poses. For game developers, the operating systems will support game controllers with touch pads, as well as stereo AR headsets.
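Here is a sketch of what pose detection might look like in ARKit, assuming a dedicated body-tracking configuration and anchor type; the ARBodyTrackingConfiguration and ARBodyAnchor names are our guess:

```swift
import UIKit
import ARKit

// A sketch of human pose detection in ARKit, assuming a dedicated body-tracking
// configuration and anchor type. These names are our guess, not a confirmed API.
class BodyTrackingViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self
        session.run(ARBodyTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let bodyAnchor as ARBodyAnchor in anchors {
            // The skeleton exposes joint transforms relative to the body anchor.
            if let hipTransform = bodyAnchor.skeleton.modelTransform(for: .root) {
                print("Hip transform:", hipTransform)
            }
        }
    }
}
```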
Taptic Engine, links, NFC, more
A new framework will give developers more control over the Taptic Engine, which currently offers a very small set of feedback styles for third-party developers. There’s also new functionality for developers to include link previews in their apps, similar to those that appear in iMessage conversations.
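On the haptics side, a pattern-based framework would be the obvious shape for such an API. The sketch below assumes that kind of design; the CHHaptic* names are illustrative rather than confirmed:

```swift
import CoreHaptics

// A sketch of finer-grained Taptic Engine control, assuming a pattern-based
// haptics framework. The CHHaptic* names are an assumption on our part.
func playSharpTap() throws {
    let engine = try CHHapticEngine()
    try engine.start()

    // One short, full-intensity transient tap.
    let tap = CHHapticEvent(eventType: .hapticTransient,
                            parameters: [
                                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
                            ],
                            relativeTime: 0)

    let pattern = try CHHapticPattern(events: [tap], parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: 0)
}
```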
NFC is getting major improvements, including the ability for third-party developers to read any ISO7816, FeliCa or MiFare tags. Currently, only tags formatted as NDEF can be read by third-party apps.
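In practice, that would likely mean a tag-level reader session in CoreNFC rather than today's NDEF-only API. Here is a sketch under that assumption (the NFCTagReaderSession types are our guess):

```swift
import CoreNFC

// A sketch of reading raw (non-NDEF) tags, assuming a tag-level reader session.
// The NFCTagReaderSession types used here are our assumption.
class TagReader: NSObject, NFCTagReaderSessionDelegate {
    var session: NFCTagReaderSession?

    func beginScanning() {
        session = NFCTagReaderSession(pollingOption: [.iso14443, .iso18092],
                                      delegate: self, queue: nil)
        session?.alertMessage = "Hold your iPhone near the tag."
        session?.begin()
    }

    func tagReaderSessionDidBecomeActive(_ session: NFCTagReaderSession) {}

    func tagReaderSession(_ session: NFCTagReaderSession, didInvalidateWithError error: Error) {
        print("Session ended:", error.localizedDescription)
    }

    func tagReaderSession(_ session: NFCTagReaderSession, didDetect tags: [NFCTag]) {
        guard let tag = tags.first else { return }
        session.connect(to: tag) { _ in
            switch tag {
            case .miFare(let mifare):
                print("MiFare UID:", mifare.identifier.map { String(format: "%02X", $0) }.joined())
            case .feliCa(let felica):
                print("FeliCa system code:", felica.currentSystemCode.map { String(format: "%02X", $0) }.joined())
            case .iso7816(let smartCard):
                print("ISO7816 AID:", smartCard.initialSelectedAID)
            default:
                session.invalidate(errorMessage: "Unsupported tag.")
            }
        }
    }
}
```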
With a new version of Core ML, developers will be able to update their machine learning models on-device. Currently, models have to be trained ahead of time and remain static after deployment; on-device updating will let apps change their behavior as their models learn from user actions. Apple is also adding a new API for developers to do sound analysis with machine learning. The Vision framework will get a built-in image classifier, so developers won't have to embed their own machine learning model just to classify images into common categories.
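For the Vision piece, classifying an image into common categories without bundling a model could look roughly like this; VNClassifyImageRequest is our assumed name for such a request:

```swift
import Vision

// A sketch of classifying an image into common categories without shipping a
// custom model, assuming a built-in Vision request (names are our assumption).
func classify(imageURL: URL) throws {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(url: imageURL, options: [:])
    try handler.perform([request])

    guard let observations = request.results as? [VNClassificationObservation] else { return }
    // Print the top few reasonably confident labels.
    for observation in observations.prefix(5) where observation.confidence > 0.3 {
        print("\(observation.identifier): \(observation.confidence)")
    }
}
```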
The document scanning functionality that’s available in some parts of iOS, such as the Notes app, will be made available to third-party developers through a new public framework. A new API will also let apps capture photos from external devices such as cameras and SD cards, without having to go through the Photos app.
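A plausible shape for the scanning API is a system-provided camera view controller that hands back scanned pages; the VNDocumentCameraViewController names below are an assumption:

```swift
import UIKit
import VisionKit

// A sketch of presenting a system document scanner from a third-party app,
// assuming a VisionKit-style view controller (names are our assumption).
class ScanViewController: UIViewController, VNDocumentCameraViewControllerDelegate {

    func startScanning() {
        let scanner = VNDocumentCameraViewController()
        scanner.delegate = self
        present(scanner, animated: true)
    }

    func documentCameraViewController(_ controller: VNDocumentCameraViewController,
                                      didFinishWith scan: VNDocumentCameraScan) {
        // One UIImage per scanned page.
        for pageIndex in 0..<scan.pageCount {
            let pageImage = scan.imageOfPage(at: pageIndex)
            print("Scanned page \(pageIndex): \(pageImage.size)")
        }
        controller.dismiss(animated: true)
    }
}
```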
On the Mac, apps will be able to offer file provider extensions, improving the way certain apps such as Dropbox can integrate with Finder. There will also be a new API developers can use to write device drivers.
Apple is expected to unveil iOS 13, tvOS 13, macOS 10.15 and watchOS 6 on June 3rd, during the WWDC keynote address. Developers will get access to the first beta immediately, with public betas coming later for members of Apple’s public beta program. The final versions of the operating systems should be released to consumers in September.
Thanks to Steve Troughton-Smith for his help with this report.