Video On The iPhone Tutorial Part 3, playing multiple videos simultaneously

I come from a Flash background, and something I’m used to doing a lot in Flash is creating video walls (playing several videos at the same time). There are plenty of situations where you might want this: multiple video thumbnails playing at once, a main video with a “talking head” video composited over the top, a video wall, or all kinds of creative, arty and special effects. For a recent client project I had to do exactly this. Initially I thought it was going to be an easy job: just smash out a few MPMoviePlayerControllers, feed them some video, and job done. But it wasn’t to be that easy. There is no way of playing multiple videos at the same time using the MediaPlayer framework; I really wish there was, but there’s not.
So to solve this problem we have to dig a bit deeper and use the AVFoundation framework. It’s the lower-level framework that Final Cut Pro and iMovie are built on, but don’t let that scare you. In this tutorial I’m going to show you a quick and dirty way of getting multiple videos playing with the least amount of effort. I’ve included the working files at the end. This tut follows on from part 2, which can be found here

(Before you start, make sure you add AVFoundation.framework to your project, otherwise it won’t build. It has already been added to the example project.)

Start by creating a view-based application from the Xcode default templates, as we have in the previous examples, and get your .h file looking like this:

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
@class AVPlayerDemoPlaybackView;
@class AVPlayer;

@interface AVFoundation_Test3ViewController : UIViewController {
    AVPlayer* mPlayer;
    AVPlayer* mPlayer2;
    IBOutlet AVPlayerDemoPlaybackView  *mPlaybackView;
    IBOutlet AVPlayerDemoPlaybackView  *mPlaybackView2;
}

@property (readwrite, retain) AVPlayer* mPlayer;
@property (readwrite, retain) AVPlayer* mPlayer2;
@property (nonatomic, retain) IBOutlet AVPlayerDemoPlaybackView *mPlaybackView;
@property (nonatomic, retain) IBOutlet AVPlayerDemoPlaybackView *mPlaybackView2;

- (void)observeValueForKeyPath:(NSString *)path ofObject:(id)object change:(NSDictionary *)change context:(void *)context;

@end

We’ve created two instances of AVPlayer and two IBOutlets that we’re going to hook up via our .xib file. As you can see, the outlets are a custom class: we’re going to subclass UIView so it acts like a video player, set that class on two UIViews in the .xib file, and then hook those views up to this controller to play the video. First let’s look at the view controller’s .m file, before we get to subclassing our UIViews:

#import "AVFoundation_Test3ViewController.h"
static void *AVPlayerDemoPlaybackViewControllerStatusObservationContext = &AVPlayerDemoPlaybackViewControllerStatusObservationContext;
@implementation AVFoundation_Test3ViewController
@synthesize mPlayer, mPlaybackView, mPlaybackView2, mPlayer2;

- (void)dealloc
{
    [mPlayer removeObserver:self forKeyPath:@"status"];
    [mPlayer release];
    [mPlayer2 release];
    [mPlaybackView release];
    [mPlaybackView2 release];
    [super dealloc];
}

- (void)viewDidLoad
{
    [super viewDidLoad];
    NSURL *url = [NSURL URLWithString:@"http://www.samkeeneinteractivedesign.com/videos/littleVid3.mp4"];
    self.mPlayer = [AVPlayer playerWithURL:url];
    self.mPlayer2 = [AVPlayer playerWithURL:url];
    [mPlayer addObserver:self forKeyPath:@"status" options:0 context:AVPlayerDemoPlaybackViewControllerStatusObservationContext];
}

- (void)observeValueForKeyPath:(NSString *)path ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    if (context == AVPlayerDemoPlaybackViewControllerStatusObservationContext) {
        // Only attach the views and start playback once the player is ready.
        if (mPlayer.status == AVPlayerStatusReadyToPlay) {
            [self.mPlaybackView setPlayer:self.mPlayer];
            [self.mPlaybackView2 setPlayer:self.mPlayer2];
            [self.mPlayer play];
            [self.mPlayer2 play];
        }
    } else {
        [super observeValueForKeyPath:path ofObject:object change:change context:context];
    }
}
@end
@end

In our viewDidLoad method we create an NSURL and then, below that, instantiate two AVPlayers with it.
To one of these players we’ve added a key-value observer on its status property. I stress that this is the dirty part of the code: you generally would not want one observer controlling the state of two separate objects. For this tut it’s fine, though, as we’re trying to keep it simple and easy, and it works.
Then we have the method that is triggered when the observed value changes.
We need to test whether the AVPlayer’s status is ready to play, because there are several other circumstances in which this observer could be triggered.
Then we set our AVPlayers as the players of the UIViews in the xib and start playback.
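If you want to avoid the dirty shortcut, a slightly cleaner variant (just a sketch, not part of the downloadable project) is to observe both players and only wire up whichever one has just become ready:

[mPlayer addObserver:self forKeyPath:@"status" options:0 context:AVPlayerDemoPlaybackViewControllerStatusObservationContext];
[mPlayer2 addObserver:self forKeyPath:@"status" options:0 context:AVPlayerDemoPlaybackViewControllerStatusObservationContext];

// Then, in observeValueForKeyPath:, the object parameter tells you which player changed.
AVPlayer *player = (AVPlayer *)object;
if (player.status == AVPlayerStatusReadyToPlay) {
    if (player == mPlayer)  [self.mPlaybackView setPlayer:mPlayer];
    if (player == mPlayer2) [self.mPlaybackView2 setPlayer:mPlayer2];
    [player play];
}

This way each video starts as soon as its own player is ready, rather than both being keyed off the first player’s status.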

Now let’s take a look at the code we use to subclass our UIViews. In the XIB file associated with the view in your view-based project you’ll want to add two UIViews and connect them up to the IBOutlets in the previous class. Then, in Interface Builder, you need to set their class to a custom class, which I’ve called AVPlayerDemoPlaybackView. But before you can do that, we need to create the class. Here’s the .h file:


#import <UIKit/UIKit.h>
@class AVPlayer;

@interface AVPlayerDemoPlaybackView : UIView

@property (nonatomic, retain) AVPlayer* player;

- (void)setPlayer:(AVPlayer*)player;
- (void)setVideoFillMode:(NSString *)fillMode;

@end

and here’s the .m file:


#import "AVPlayerDemoPlaybackView.h"
#import <AVFoundation/AVFoundation.h>



@implementation AVPlayerDemoPlaybackView

+ (Class)layerClass
{
	return [AVPlayerLayer class];
}

- (AVPlayer*)player
{
	return [(AVPlayerLayer*)[self layer] player];
}

- (void)setPlayer:(AVPlayer*)player
{
	[(AVPlayerLayer*)[self layer] setPlayer:player];
}

/* Specifies how the video is displayed within a player layer’s bounds. 
	(AVLayerVideoGravityResizeAspect is default) */
- (void)setVideoFillMode:(NSString *)fillMode
{
	AVPlayerLayer *playerLayer = (AVPlayerLayer*)[self layer];
	playerLayer.videoGravity = fillMode;
}

@end

So that’s pretty straightforward. By overriding layerClass we’re making the view’s backing layer an AVPlayerLayer, to which the output of an AVPlayer object from our previous class can be directed.
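The setVideoFillMode: method isn’t actually called anywhere in this example, but it’s handy if your videos don’t match the aspect ratio of their views. For instance, to crop rather than letterbox, you could call:

[mPlaybackView setVideoFillMode:AVLayerVideoGravityResizeAspectFill];

AVLayerVideoGravityResizeAspect (the default) letterboxes the video, AVLayerVideoGravityResizeAspectFill crops it to fill the view’s bounds, and AVLayerVideoGravityResize stretches it.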
And that’s pretty much it. Remember, you don’t have to stream the video from an external URL; you can load videos from the application bundle.
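As a rough sketch (the file name here is just an example; use whatever movie you’ve added to your project), loading from the bundle instead of the network looks like this:

// Build a file URL for a movie that ships inside the app bundle.
// "littleVid3" / "mp4" are placeholder names.
NSString *path = [[NSBundle mainBundle] pathForResource:@"littleVid3" ofType:@"mp4"];
NSURL *fileURL = [NSURL fileURLWithPath:path];
self.mPlayer = [AVPlayer playerWithURL:fileURL];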
You can grab the full working project here

25 comments

    1. Hey no worries Elias, thanks for the kind words. Hopefully I can help people solve tricky iOS problems.

  1. Awesome. I ran into major problems with MPMoviePlayerViewController when displaying several movies on screen at the same time.

    Unfortunately the AVPlayer does not have any controls, buttons etc., so these have to be added. But that is a minor inconvenience compared to the hours of work lost on the MPMoviePlayerViewController 🙂

    1. Hey Martin, true there are no standard visual components, but it’s a lot more powerful, although MPMoviePlayerViewController is good if you just want to quickly play a video.

  2. Hi Martin, I am wanting to verify that Apple will approve an ipad app with four videos running at the same time. Do you know of any approved apps in the store you can point me to? Thanks for the code.

    1. Hey Brian,
      Apple won’t explicitly reject an app because it has multiple video players. However, in most situations you should only use one player, and it should be the standard MediaPlayer framework. If your app is using too much CPU or breaking Apple’s usability guidelines because of multiple video players, then it will be rejected. A good example of an app that uses multiple small video players/loops is this one by Dove: http://itunes.apple.com/au/app/dove-body-language-messenger/id457821773?mt=8
      In this app a clock is created from videos of human figures, creative functionality that could only be executed with multiple small video players.

  3. Nice tutorial!

    First of all, sorry for my English, it is not as clean as I want 🙁

    I have a question very similar to the tutorial; maybe there is an answer to it… or any links.

    Can you share any thoughts on this task: I have only one video stream, but the video is divided into 6 parts (3×2 or 2×3 – it doesn’t matter). I want to display the first part in a big container, and the other 5 parts (plus the first part) in small containers, as a kind of video thumbnail. The user can tap any small container and the corresponding video appears in the big container.

    I’ve got everything else working (thumbnail tap actions, setting the size of the video frame shown in the big container, and so on). The only thing missing is showing the video in the small containers.

    I tried to use your code with some modifications. I added 6 “mPlayers” (AVPlayer instances) and set them all to the first mPlayer (self.mPlayer2 = self.mPlayer .. self.mPlayer5 = self.mPlayer, etc.), but iOS shows only one instance of the player, and at http://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/02_Playback.html this limitation is mentioned: “You can create arbitrarily many AVPlayerLayer objects from a single AVPlayer instance, but only the most-recently-created such layer will display any video content on-screen.”

    The video stream bitrate is near 5 Mbps, so there is no way to use 6 different AVPlayer instances (30 Mbps is too much for the end user). Also, in the simulator there are always synchronization problems. The video shows people talking, and very good synchronization is needed.

    At the moment I take server-side screenshots of the video stream and place them in the 6 small containers in the application, but I hope it is clear that this is a very ugly solution, and I am looking for other solutions.

    Maybe there are some methods to display the same stream in many places? Or to take transport-stream (HTTP m3u8) screenshots on the iOS side?

    Thanks!

    1. Hi Artem,
      Interesting problem. So just to clarify: you have one video stream with 6 sections that you want to be able to jump to from 6 video thumbnails. My biggest question is why the thumbnails have to be video. Could you not just use stills, or short animated GIFs? I’m assuming this all has to be created dynamically from multiple video feeds (if it didn’t, you could just create little looping video thumbs and embed them in the app). Although this could probably be done with the AVFoundation framework on the iPhone, I wouldn’t attempt it: it would be too CPU-intensive, and it would take the iPhone some time to analyse the video feed, break it up, and create thumbnails every time the user connects to a new video feed. You’ll need to do some server-side trickery. Check out FFmpeg, a server-side video tool that lets you chop up, manipulate and compress videos and then send them off as a single stream. If you have specific cue points in your video, it could analyse those points, create a short animated GIF of each, and place it on the server for your app to display as thumbnails.
      So I think what you want to do needs to be done server side; you don’t want your app processing that much data.

  4. Hi,

    Thanks for this.

    How many videos can you display at the same time?
    Even with 3 my app gets a “low memory warning” (all I do is play 3 videos, no other assets, so the app itself uses very little memory). They still play, but with 4 or 5 it starts to give up.

    Also, when the app goes into the background I pause the videos. On becoming active again I restart them, but only 1 or 2 actually restart. Any idea why?

    Thanks,
    Andy

    1. Hey Andy,
      I had no problem running four videos at the same time on an iPhone 4, a full screen of video. What iPhone/iOS are you running, and what codec/compression? I think I was using QuickTime H.264.
      I had the same problem on app relaunch with some of the videos not appearing. Unfortunately the client I was building the app for pulled the plug, and I never got to solve that problem. My sense is that it’s some kind of issue with AVFoundation queues not being triggered properly when the app enters the background and becomes inactive.
      I think the first thing I’d do would be to assume the previously running AVFoundation objects are buggy, and basically remove everything and reload it all again on applicationDidBecomeActive.
      S

  5. Hi samkeeneinteractivedesign,

    The 3 videos run OK (iPhone 4), but the app does get the low memory notification.
    I’m running them in table view cells.
    I “pause” the videos myself on going into the background, and “play” them on becoming active again.
    Going to 5 doesn’t work – not all of the videos start.

    I agree – reloading on becoming active is probably the way to go.

    Cheers,
    A

  6. Hi

    Firstly thanks for the tutorial! I’ve managed to get it to work on the simulator, but after a while the video freezes when running on my phone. Instruments shows a memory allocation of about 45MB, even though the clips are very small. Is anyone else having this issue?

    Thanks
    Noel

  7. Nice tutorial ..
    I have some questions…
    Can AVPlayer stop? I didn’t find [player stop] among its methods.
    How can I stop the player in the project?

    The other question…
    When the movie has played to the end, how can I play it again? I used [player play] but it didn’t work.
    Can you help me?
    Thanks a lot, and sorry for my weak English.

  8. Does a similar option exist for Android? Can you play multiple videos simultaneously on Android tablets?

  9. Great tutorial, thanks a lot! However, I’ve got one question. How can one play a video as the background of another view while the user can still use the functions of the overlaying view? Thanks a lot

  10. Hey,

    I’m having an issue using this code in a more “layer management” kind of way.
    I have a class method on a Helper class which gives me back ->
    + (AVPlayerDemoPlaybackView *)videoWithPathName:(NSString *)videoName {
        NSBundle *bundle = [NSBundle mainBundle];
        NSString *moviePath = [bundle pathForResource:videoName ofType:@"mp4"];
        NSURL *movieURL = [NSURL fileURLWithPath:moviePath];

        AVPlayer *player = [AVPlayer playerWithURL:movieURL];

        AVPlayerDemoPlaybackView *movieController = [[AVPlayerDemoPlaybackView alloc] init];
        [movieController setPlayer:player];
        //[movieController.player play];

        return movieController;
    }

    In the main class I add it to the subview, set its frame and am done with it:
    AVPlayerDemoPlaybackView *layerView = [LayerDataHelper videoWithPathName:vjlayer.name];
    [layerView.player addObserver:self forKeyPath:@"status" options:0 context:AVPlayerDemoPlaybackViewControllerStatusObservationContext];
    [layerView setFrame:self.layerContainer.bounds];

    It actually goes into the play statement, but it just freezes at frame 1 (I had this with MPMoviePlayerViewController too, fixed that, but now I have it with your class too).
    Any idea what’s up? (I want to display several on top of each other.)

  11. Hey,

    I want to get frames from an AVPlayer at intervals of 1/30 of a second.

    I have tried many times but could not succeed.

    Please help me.

    A lot of thanks in advance

  12. Very nice post. I just stumbled upon your blog and wished to say that I’ve truly enjoyed surfing around your
    blog posts. After all I will be subscribing to your feed and
    I hope you write again very soon!

  13. Hey friend, I don’t understand how to perform this. Can you explain it to me in easy steps, or with a video?
