
iOS 7’s new Inter-App Audio introduces universal audio routing between apps


Following WWDC this week, Apple's redesigns of iOS 7 and OS X Mavericks have been getting the majority of the attention, and rightfully so, but there are a few big new features coming in iOS 7 that haven't been discussed as much. This morning we told you about the new MFi Game Controller framework that will make using hardware game controllers a lot smoother in iOS 7, but another important addition in the update is Inter-App Audio.

The basic idea is simple: Inter-App Audio lets developers make their apps act as audio outputs and/or inputs, sending audio to and receiving audio from other apps. We already have roughly that functionality through the third-party iOS app Audiobus, but with Apple's new Inter-App Audio feature available to devs, apps will no longer have to rely on a third-party service like Audiobus to send audio to one another. At first glance it seems to make Audiobus obsolete, an interesting move given that Apple only recently added support for the third-party service to its own GarageBand app. Either way, it means a ton of new possibilities for creating music and sharing audio on your iPhone and iPad are on the way when iOS 7 ships this fall.

We dug into Apple’s documentation on Inter-App Audio to find out how it works and also spoke with Audiobus about what this means for them:

As an example, in Apple’s description of inter-app audio (below), GarageBand would act as a “host” application, allowing it to receive audio from any audio app that is enabled as an output “node”:

Inter-app-audio allows iOS audio applications that are remote instruments, effects or generators to publish an output which can be used by other audio applications. These applications which publish an output are known as nodes. Any application which connects and utilizes these node applications is known as a host.
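
To give a rough sense of what the node side looks like in code, here is a minimal sketch against the iOS 7 AudioToolbox API: a node app creates its usual Remote I/O output unit, then publishes it with AudioOutputUnitPublish() so that hosts can discover and connect to it. The four-character component codes and the display name below are hypothetical placeholders, and this is our own illustration rather than Apple's sample code:

```c
// Node side of Inter-App Audio: publish this app's Remote I/O unit so hosts can see it.
// The subtype/manufacturer codes and the name are made-up placeholders.
#include <AudioToolbox/AudioToolbox.h>

static AudioUnit PublishGeneratorNode(void) {
    // Create the app's ordinary Remote I/O output unit.
    AudioComponentDescription ioDesc = {
        .componentType         = kAudioUnitType_Output,
        .componentSubType      = kAudioUnitSubType_RemoteIO,
        .componentManufacturer = kAudioUnitManufacturer_Apple,
    };
    AudioComponent ioComp = AudioComponentFindNext(NULL, &ioDesc);
    AudioUnit outputUnit = NULL;
    AudioComponentInstanceNew(ioComp, &outputUnit);

    // Describe how this app should appear to hosts, here as a remote generator.
    // An effect would use kAudioUnitType_RemoteEffect, an instrument
    // kAudioUnitType_RemoteInstrument.
    AudioComponentDescription publishDesc = {
        .componentType         = kAudioUnitType_RemoteGenerator,
        .componentSubType      = 'gen1',   // hypothetical four-char code
        .componentManufacturer = 'demo',   // hypothetical manufacturer code
    };

    // Publish the output unit as an audio component visible to other processes.
    OSStatus err = AudioOutputUnitPublish(&publishDesc,
                                          CFSTR("Example Generator"), // name shown to hosts
                                          1,                          // version
                                          outputUnit);
    if (err != noErr) {
        // Publishing can fail, e.g. if the app's Info.plist is missing the
        // matching audio component registration.
    }

    AudioUnitInitialize(outputUnit);
    return outputUnit;
}
```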

Apple's pre-release developer documentation for iOS 7 also outlines two sample implementations of Inter-App Audio: "InterAppAudioDelay" and "InterAppAudioSampler".

One example, InterAppAudioDelay, is a node app for effects such as delay:

InterAppAudioDelay allows apps to publish their audio capabilities so that they can be used in conjunction with other audio apps in iOS. This example illustrates how to publish and control a delay effect which can be used by another app. On its own this application will not produce any sound; it must be used in conjunction with the InterAppAudioHost and InterAppAudioSampler apps.
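
For a host app, a published effect like this ultimately shows up as just another audio unit. The hedged sketch below assumes the host has already discovered and instantiated the remote effect (effectUnit, a hypothetical name) and has its own Remote I/O unit (hostIOUnit); it simply wires the node's output into the host's output with a standard audio unit connection. A real host would also feed the effect's input with the audio it wants processed:

```c
// Host side: route a connected remote effect node's output into the host's own
// Remote I/O unit using a standard audio unit connection.
#include <AudioToolbox/AudioToolbox.h>

static OSStatus ConnectRemoteEffectToOutput(AudioUnit effectUnit, AudioUnit hostIOUnit) {
    // Feed the remote effect's output bus 0 into input bus 0 of the host's I/O unit,
    // so whatever the node app renders ends up in the host's output stream.
    AudioUnitConnection connection = {
        .sourceAudioUnit    = effectUnit,
        .sourceOutputNumber = 0,
        .destInputNumber    = 0,
    };
    return AudioUnitSetProperty(hostIOUnit,
                                kAudioUnitProperty_MakeConnection,
                                kAudioUnitScope_Input,
                                0,              // destination input bus
                                &connection,
                                sizeof(connection));
}
```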

Another example is for a node application used as a sampler:

InterAppAudioSampler is an example of a node application that is a sampler. This demo shows a sampler that publishes itself as a remote instrument and a generator. If InterAppAudioSampler is connected to as a remote instrument, it can play audio upon receiving MIDI events from a host application.
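
From the host's point of view, driving a remote instrument like this sampler should work much like driving a local instrument unit: once the node is instantiated and connected, the host can push MIDI events to it with MusicDeviceMIDIEvent(). A minimal sketch, assuming samplerUnit is an already-connected remote-instrument node (the function and variable names are ours, not Apple's):

```c
// Host side: send a MIDI note-on to a connected remote-instrument node.
#include <AudioToolbox/AudioToolbox.h>

static void PlayTestNote(AudioUnit samplerUnit) {
    // 0x90 = note-on on channel 0; note 60 (middle C) at velocity 100,
    // scheduled at sample offset 0 in the current render cycle.
    MusicDeviceMIDIEvent(samplerUnit, 0x90, 60, 100, 0);

    // Later, the matching note-off (0x80) stops the note:
    // MusicDeviceMIDIEvent(samplerUnit, 0x80, 60, 0, 0);
}
```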

Audiobus doesn’t seem to be too worried about Inter-App Audio from a post on its blog congratulating Apple’s work. It does, however, admit that inter-app audio brings “new functionality, which has some terrific features that only a team at Apple with system-level access could achieve.” Perhaps that’s referring to Inter-App Audio’s “MIDI control of audio rendering” with the ability to remotely launch “other registered Inter-App Audio apps and more.”

There are a few other things Audiobus provides that don't seem to be included in Inter-App Audio: a standardized interface and a control panel for triggers (Inter-App Audio instead offers basic transport controls that have to be mapped to buttons inside each app). Audiobus is even hinting at device-to-device connections in its blog post, another feature not included with Inter-App Audio.

Audiobus-iconWe’ve reached out to Audiobus about what this means for them and they told us Audiobus will continue offering great, new features for devs and musicians that won’t be available with Inter-App Audio alone:

Apple’s CoreAudio team has done some amazing work. However, Audiobus has some features that Inter-App Audio does not have and vice versa, but I’m not sure if I can disclose those since I’m under NDA until iOS 7 is released. It’s a completely different architecture even from a user’s perspective… Some users might rely on a certain Audiobus workflow that cannot be replicated with Inter-App Audio.

We’re actually pretty excited for the new features of iOS 7 and how we’re going to be able to use them to build a great tool for musicians. We might even be able to help developers integrate Inter-App Audio while having access to Audiobus specific features but it’s too early to be specific or certain.

Developers interested in implementing the feature will have to "publish an AURemoteIO instance as an audio component that is visible to other processes" in order to act as a node, while hosts are told: "to use audio features from another app, use the audio component discovery interfaces in iOS 7."
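
That discovery side boils down to walking the system's audio component registry for the new "remote" component types that published node apps appear under. Here is a hedged sketch of what that enumeration might look like, for illustration only:

```c
// Host side: enumerate published Inter-App Audio node apps by component type.
#include <AudioToolbox/AudioToolbox.h>
#include <stdio.h>

static void ListInterAppAudioNodes(void) {
    // Leaving subtype and manufacturer at zero makes the description act as a
    // wildcard, so every published component of the given type is matched.
    OSType nodeTypes[] = {
        kAudioUnitType_RemoteEffect,
        kAudioUnitType_RemoteGenerator,
        kAudioUnitType_RemoteInstrument,
        kAudioUnitType_RemoteMusicEffect,
    };

    for (size_t i = 0; i < sizeof(nodeTypes) / sizeof(nodeTypes[0]); i++) {
        AudioComponentDescription searchDesc = { .componentType = nodeTypes[i] };
        AudioComponent comp = NULL;
        while ((comp = AudioComponentFindNext(comp, &searchDesc)) != NULL) {
            CFStringRef name = NULL;
            if (AudioComponentCopyName(comp, &name) == noErr && name != NULL) {
                char buf[256];
                if (CFStringGetCString(name, buf, sizeof(buf), kCFStringEncodingUTF8)) {
                    printf("Found node app: %s\n", buf);
                }
                CFRelease(name);
            }
            // AudioComponentInstanceNew(comp, ...) would then give the host an
            // audio unit it can connect into its graph, as in the earlier sketches.
        }
    }
}
```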

Apple has published examples in its iOS Developer Library here.

