A Tour of the iOS Media Frameworks
You're rocking out to your favorite music with your iPod touch, watching a favorite movie in the Netflix app, using iMovie to edit and upload some video you just shot, or amusing your friends by auto-tuning your voice. All of these are common functions of iOS apps, from Apple and third parties alike, and can be found in abundance on millions of iPhones, iPod touches, and iPads.
But how do they get there? Or to put it another way, how do you get there? How do you go from an idea for an app that uses audio and/or video to a five-star App Store masterpiece? You could start by combing the documentation bundles in Xcode, but when you find that a search for video gets more than 200 hits and audio nearly 1,000, you quickly discover that this style of programming does not offer quick and easy wins.
Just for dynamic media like audio and video—setting aside other kinds of media like still images—iOS offers four major frameworks:
- Media Player
- AV Foundation
- Core Audio
- OpenAL
In this article, I’ll look at the purposes, promise, and perils of each. Keep in mind that none of these frameworks are included by default with Xcode's project starter templates, so you will need to link each framework to your binary under the target's Build Phases tab and #import its header before using it in code.
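As a rough sketch, the imports at the top of a source file that touched all four might look something like the following (Core Audio is really a family of frameworks, so AudioToolbox stands in for it here):
#import <MediaPlayer/MediaPlayer.h>      // Media Player: music library access and playback
#import <AVFoundation/AVFoundation.h>    // AV Foundation: capture, playback, editing
#import <AudioToolbox/AudioToolbox.h>    // Core Audio (one of its frameworks): lower-level audio
#import <OpenAL/al.h>                    // OpenAL: positional audio
#import <OpenAL/alc.h>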
Media Player
To start with a specific use case, imagine that you are writing a game that allows the user to play his or her own music as the background audio. Since iOS 3.0, third-party apps have had access to the music library via the Media Player framework. The idea of this framework is that all audio items in the music library—songs, podcasts, audiobooks—can be represented as MPMediaItems. You can find these items by searching the library via an MPMediaQuery. You typically build a query by filtering down to one kind of item (podcasts, audiobooks, etc.), and then apply filter predicates to further winnow down your search. To search for all songs by one artist—let's say The Kinks, because they're a classic—you might do something like this:
MPMediaQuery *mySongsQuery = [MPMediaQuery songsQuery];
MPMediaPropertyPredicate *kinksArtistPredicate =
    [MPMediaPropertyPredicate predicateWithValue: @"The Kinks"
                                     forProperty: MPMediaItemPropertyArtist];
[mySongsQuery addFilterPredicate: kinksArtistPredicate];
NSArray *songs = [mySongsQuery items];
The matching MPMediaItems, if any, will be in the songs array, ordered alphabetically by song title. This ordering is a result of using the songsQuery convenience constructor. If it would be more useful to collect the Kinks' albums, you might write the query like this:
MPMediaQuery *myAlbumsQuery = [MPMediaQuery albumsQuery];
MPMediaPropertyPredicate *kinksArtistPredicate =
    [MPMediaPropertyPredicate predicateWithValue: @"The Kinks"
                                     forProperty: MPMediaItemPropertyArtist];
[myAlbumsQuery addFilterPredicate: kinksArtistPredicate];
NSArray *albums = [myAlbumsQuery collections];
This time, because of the grouping performed by the albumsQuery, the results can be retrieved with the query's collections method, which returns MPMediaItemCollection objects, one for each album. You can then iterate through these to get the songs from each album, in order. You can also get metadata properties from each item, such as artist/song/album strings, cover art images, etc.
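As a rough illustration, here is one way you might walk the albums array from the query above and pull a few of those properties out with valueForProperty: (the NSLog calls are just for demonstration):
for (MPMediaItemCollection *album in albums) {
    // The representative item carries the album-level metadata, including cover art
    MPMediaItem *representativeItem = [album representativeItem];
    MPMediaItemArtwork *artwork =
        [representativeItem valueForProperty: MPMediaItemPropertyArtwork];
    NSLog(@"Album: %@",
          [representativeItem valueForProperty: MPMediaItemPropertyAlbumTitle]);
    // The collection's items are the album's songs, in order
    for (MPMediaItem *song in [album items]) {
        NSLog(@"  %@ - %@",
              [song valueForProperty: MPMediaItemPropertyTitle],
              [song valueForProperty: MPMediaItemPropertyArtist]);
    }
}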
Sure, but what do you do with them? The Media Player framework has an MPMusicPlayerController class that accepts queues of MPMediaItems to play, supplied either as MPMediaItemCollection or MPMediaQuery objects. The class exposes two actual media players: the applicationMusicPlayer, which plays MPMediaItems solely within the context of your own application, and the iPodMusicPlayer, which represents the built-in "Music" app. You can query the iPodMusicPlayer to see what music the user is already playing through that app, or send it a new queue of items to play.
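As a minimal sketch of how the pieces fit together, reusing the mySongsQuery built earlier, you might queue those songs in your app's own player and peek at what the built-in player is up to:
// Play the queried songs inside your own app
MPMusicPlayerController *appPlayer =
    [MPMusicPlayerController applicationMusicPlayer];
[appPlayer setQueueWithQuery: mySongsQuery];
[appPlayer play];

// Or inspect (and control) the built-in Music app's player
MPMusicPlayerController *iPodPlayer =
    [MPMusicPlayerController iPodMusicPlayer];
MPMediaItem *nowPlaying = [iPodPlayer nowPlayingItem];
NSLog(@"Music app is playing: %@",
      [nowPlaying valueForProperty: MPMediaItemPropertyTitle]);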