When playing a video exported from an AVAssetExportSession, you hear audio long before seeing any video. Audio plays right away, but video only appears after the recording loops (i.e., starts and finishes) several times. In other words, you hear the audio from the video multiple times before seeing any images.

We are using AutoLayout on iOS 8.

Using the following test, we isolated the problem to exportAsynchronouslyWithCompletionHandler. In both code blocks, we play an existing video -- not one related to the export -- so the export process itself has been eliminated as a variable.

Code 1 plays both video & audio at the start whereas Code 2 only plays audio at the start and shows video after a delay of 10-60 seconds (after the video loops several times).

The only difference between the two code blocks is one uses exportAsynchronouslyWithCompletionHandler to play the video while the other one does not.

Help? Is it possible the audio gets exported first and is ready to play before the video? Something to do with the export happening on a different thread?

func initPlayer(videoURL: NSURL) {
    // Create player
    player = AVPlayer(URL: videoURL)
    let playerItem = player.currentItem
    let asset = playerItem.asset
    playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = videoView.frame
    view.layer.addSublayer(playerLayer)
    player.seekToTime(kCMTimeZero)
    player.actionAtItemEnd = .None
    player.play()

    // Get notified when video done for looping purposes
    NSNotificationCenter.defaultCenter().addObserver(self, selector: "playerItemDidReachEnd:", name: AVPlayerItemDidPlayToEndTimeNotification, object: playerItem)

    // Log status
    println("Initialized video player: \(CMTimeGetSeconds(asset.duration)) seconds & \(asset.tracks.count) tracks for \(videoURL)")
}

func playExistingVideo() {
    let filename = "/ChopsticksVideo.mp4"
    let allPaths = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)
    let docsPath = allPaths[0] as! NSString
    let exportPath = docsPath.stringByAppendingFormat(filename)
    let exportURL = NSURL.fileURLWithPath(exportPath as String)!

    initPlayer(exportURL)
}

Code 1:

    // Create exporter
    let exporter = AVAssetExportSession(asset: mainComposition, presetName: AVAssetExportPresetHighestQuality)
    exporter.videoComposition = videoComposition
    exporter.outputFileType = AVFileTypeMPEG4
    exporter.outputURL = exportURL
    exporter.shouldOptimizeForNetworkUse = true

    playExistingVideo()

Code 2:

    // Create exporter
    let exporter = AVAssetExportSession(asset: mainComposition, presetName: AVAssetExportPresetHighestQuality)
    exporter.videoComposition = videoComposition
    exporter.outputFileType = AVFileTypeMPEG4
    exporter.outputURL = exportURL
    exporter.shouldOptimizeForNetworkUse = true

    // -- Export video
    exporter.exportAsynchronouslyWithCompletionHandler({
        self.playExistingVideo()
    })
Crashalot
  • @matt thanks for the comment. Yes asynchronous is well understood, but can you explain why the audio plays well before the video? If you read the question carefully, that is the crux of the issue, and the negativity is not appreciated. If it was merely an issue of an asynchronous delay, one would expect *both* audio and video to get delayed. In this case, only video is delayed while audio plays immediately ... – Crashalot Jul 05 '15 at 00:36
    @matt with all due respect, if you refuse to read the question carefully, it seems unreasonable to challenge someone and leave a snarky remark. Regarding "docsPath.stringByAppendingFormat," that is a method of filename construction found on multiple SO posts. Since I'm new to Swift and iOS in general and had trouble early on with file URLs, I use that approach because several SO answers used that format. Again, if you read the question, "exportURL" is not relevant since all we're doing during the completion handler is calling playExistingVideo. – Crashalot Jul 05 '15 at 01:14
    @matt while I appreciate you taking the time to comment, I do not appreciate the negativity. Even assuming that I didn't understand the meaning of asynchronous, I believe there could have been a more constructive response. You're undeniably a pillar of SO, far more important than me, but it's also because of your stature and frequent contributions that I think you carry an extra burden to promote a respectful, civil culture that less important members like myself will strive to emulate. Regardless, thanks for your other contributions and helping to make SO great. – Crashalot Jul 05 '15 at 01:21
  • @matt These are great questions. Sorry for the confusion. This code is extracted from a larger piece of functionality, and in boiling things down to a single question, I must have oversimplified. The goal: let users record a 10-second video and overlay text on the video. To accomplish this, I use `AVAssetExportSession`, but in playing back the video for the user to review, the audio plays long before images appear. After hours of debugging, I found that the problem persists even when using an existing video and that it seems related to the use of `exportAsynchronouslyWithCompletionHandler.` – Crashalot Jul 05 '15 at 02:44
  • @matt Hopefully that explains the two code blocks: it isolates the problem to `exportAsynchronouslyWithCompletionHandler,` meaning I can reproduce the playback issues (i.e., audio long before video) just by invoking `playExistingVideo` through `exportAsynchronouslyWithCompletionHandler.` If it helps, here's the full function (which provides more context but is code that doesn't seem to directly cause the problem and only makes the question longer to read without adding much): – Crashalot Jul 05 '15 at 02:47
  • Here's the code (SO didn't let me add it in the previous comment): https://gist.github.com/anonymous/cd98506416c97680b8cf. Thanks so much for your help. It's incredibly frustrating to learn Swift, iOS, AutoLayout, and the nuances of AVFoundation all at once, so any assistance would be most welcome. – Crashalot Jul 05 '15 at 02:55
  • Great diplomatic handling of some apparently difficult comments, which somebody seems to have removed from the public record. – user2067021 Oct 09 '17 at 22:40
  • @user2067021 thanks for the flattery! respectful discourse is very difficult, and i wish i could say i'm always so diplomatic. i actually believe it's net-positive to remove comments from public record because it more easily allows people to change and break from mistakes. (of course, it can be used for nefarious purposes, but all policies come with tradeoffs.) encouraging people to change instead of remaining the same is critical in an ever increasingly dynamic world. – Crashalot Oct 10 '17 at 02:55
  • Good point. I hadn't ever thought about that up-side of allowing users to remove their comments to ease the shift into a better frame. An apology never goes astray though, but I guess any incremental improvement is a win. – user2067021 Oct 12 '17 at 00:24

3 Answers

I'm going to suggest that the problem is here:

    // Create player
    player = AVPlayer(URL: videoURL)
    let playerItem = player.currentItem
    let asset = playerItem.asset
    playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = videoView.frame
    view.layer.addSublayer(playerLayer)
    player.seekToTime(kCMTimeZero)
    player.actionAtItemEnd = .None
    player.play()

You see, when you create an AVPlayer from a video URL, it comes into the world not yet ready to play. It can usually start playing audio quite quickly, but video takes longer to prepare. This could explain the delay in seeing anything.

Well, instead of waiting for the video to be ready, you are just going ahead and calling play() immediately. What I suggest you do is what I explain in my book (that's a link to the actual code): create the player and the layer, but then set up KVO so that you are notified when the player is ready to display, and only then add the layer and start playing.
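Concretely, that approach might look something like this in the question's era of Swift (a sketch, not the book's verbatim code; `videoView`, `player`, and `playerLayer` are the question's own properties, and the exact KVO plumbing is an assumption):

```swift
func initPlayer(videoURL: NSURL) {
    player = AVPlayer(URL: videoURL)
    playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = videoView.frame
    // Don't add the layer or call play() yet; instead, observe
    // readyForDisplay so we only start once video frames can be shown.
    playerLayer.addObserver(self, forKeyPath: "readyForDisplay",
        options: .New, context: nil)
}

override func observeValueForKeyPath(keyPath: String, ofObject object: AnyObject,
    change: [NSObject : AnyObject], context: UnsafeMutablePointer<Void>) {
    if keyPath == "readyForDisplay" && playerLayer.readyForDisplay {
        playerLayer.removeObserver(self, forKeyPath: "readyForDisplay")
        // Both audio and video are now ready: show the layer and play.
        view.layer.addSublayer(playerLayer)
        player.actionAtItemEnd = .None
        player.play()
    }
}
```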

Also, I have one more suggestion. It seems to me that there is a danger that you are running that code, setting up your interface (with the layer) and saying play(), on a background thread. That is certain to cause delays of various kinds. You seem to be assuming that the completion handler from exportAsynchronouslyWithCompletionHandler: is being called on the main thread - and you are going straight ahead and calling the next method and so proceeding to set up your interface. That's a very risky assumption. In my experience you should never assume that any AVFoundation completion handler is on the main thread. You should be stepping out to the main thread with dispatch_async in your completion handler and proceeding only from there. If you look at the code I linked you to, you'll see I'm careful to do that.
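A Swift version of that pattern, using the question's own names (`exporter`, `playExistingVideo`) -- again a sketch in the Swift of the era rather than a definitive implementation:

```swift
exporter.exportAsynchronouslyWithCompletionHandler({
    // AVFoundation does not guarantee which queue this handler runs on,
    // so hop to the main thread before touching the UI.
    dispatch_async(dispatch_get_main_queue()) {
        if exporter.status == .Completed {
            self.playExistingVideo()
        } else if let error = exporter.error {
            println("Export failed: \(error)")
        }
    }
})
```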

matt
  • Yup, I already started looking into KVO as a solution (and unfortunately it's not as simple as we might think), but why does the video delay only happen with `exportAsynchronouslyWithCompletionHandler`? That is what's puzzling. Thanks again for your help! I have learned much about Swift already from your SO posts. :) – Crashalot Jul 05 '15 at 03:17
  • Just saw your edit: will try with `dispatch_async`. I didn't assume it was needed because of the asynch completion handler, but I'll look at your code. Thanks again! – Crashalot Jul 05 '15 at 03:18
  • You are reading faster than I can keep editing my answer, but I think I've said everything of any substance that I can think of. :) – matt Jul 05 '15 at 03:19
  • Sorry, very anxious to solve this problem :) Thanks again for your help. – Crashalot Jul 05 '15 at 03:20
  • No problem, I think I'm finished. Sorry I write so slowly. – matt Jul 05 '15 at 03:22
  • `dispatch_async` solved it. You are indeed a pillar of SO and saved my evening. Thanks again! – Crashalot Jul 05 '15 at 03:29
  • May I point out that we could have saved a lot of trouble if you had just showed your real code to start with. I understood your real code immediately. Your question code was just meaningless incoherent stuff. Just saying! – matt Jul 05 '15 at 03:32
  • Hahaha. In the past, people complained if I posted too much code so I was just trying to adhere to SO "best practices" in distilling the question to its essence. It took a lot of work to rewrite it, so believe me, I would have preferred to plop that whole chunk down in the first place. Maybe you can use your influence and start to change the culture about posting "too much" code. :) Thanks again for your help. – Crashalot Jul 05 '15 at 03:36
  • Incidentally, what is the right way to construct the filename if not `let exportPath = docsPath.stringByAppendingFormat(filename)` which I have seen used very commonly elsewhere on SO? – Crashalot Jul 05 '15 at 03:38
  • It's a path! This is what the path-related functions are for, such as [stringByAppendingPathComponent](https://developer.apple.com/library/ios/documentation/Cocoa/Reference/Foundation/Classes/NSString_Class/index.html#//apple_ref/occ/instm/NSString/stringByAppendingPathComponent:). What you're doing, adding the slash manually and using a format string when this is not a format, is so wrong. – matt Jul 05 '15 at 03:51
  • In this code snippet, https://github.com/mattneub/Programming-iOS-Book-Examples/blob/master/bk2ch15p669playerLayer/ch28p939playerLayer/ViewController.swift, why didn't you create the `AVPlayerItem` with the URL? It doesn't appear like you reference the `AVURLAsset` again. Is there an advantage I'm missing? – Crashalot Jul 07 '15 at 21:09
  • You're looking at the code from my book. But you're not looking at the book itself. It's like seeing pictures from a comic book with the dialog removed. – matt Jul 07 '15 at 21:42
  • @matt the link seems to be broken. – ViruMax Mar 07 '17 at 17:54
  • @matt I had this same issue and moved the code that adds the KVO observer: myLayer.layer.insertSublayer(playerLayer, at: 0). My question to you is: since it's a sublayer, isn't this supposed to get added in viewWillLayoutSubviews? That's where I initially had it, though moving it as your answer said did solve the issue. – Lance Samaria Mar 18 '18 at 06:41

For those who stumble upon this question later, the answer was in the comments on the accepted answer. The key dispatch_async part is below:

[exporter exportAsynchronouslyWithCompletionHandler:^(void){
    // The completion handler may run on a background queue,
    // so step out to the main thread before doing anything UI-related.
    dispatch_async(dispatch_get_main_queue(), ^{
        switch (exporter.status) {
            case AVAssetExportSessionStatusCompleted:
                NSLog(@"Video Merge Successful");
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Failed:%@", exporter.error.description);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Canceled:%@", exporter.error);
                break;
            case AVAssetExportSessionStatusExporting:
                NSLog(@"Exporting!");
                break;
            case AVAssetExportSessionStatusWaiting:
                NSLog(@"Waiting");
                break;
            default:
                break;
        }
    });
}];
Andrew Cross

You can play the composition during export. Note that AVPlayerItem(asset:) takes an AVAsset (in the question's code, that is mainComposition); the AVVideoComposition is applied separately through the player item's videoComposition property:

playerItem = AVPlayerItem(asset: mainComposition)
playerItem.videoComposition = videoComposition
player = AVPlayer(playerItem: playerItem)

Apple Docs

AVPlayerItem:

convenience init(asset: AVAsset)

ugur