
I am working on an app that composes multiple video clips taken by the user. The clips are recorded with the camera, overlaid with another video, and then composed together into one long clip. The length of each clip is determined by the overlay video file.

I am using an AVAssetExportSession and exportAsynchronouslyWithCompletionHandler. The odd thing is that this works with some clips and not others. The real problem is that the exporter doesn't report any errors or failures: just zero progress, and the completion handler is never called.
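One way to at least surface the session's state (a minimal diagnostic sketch, not code from my app; `m_activeExport` is the session member shown below) is to poll its properties on a timer:

```objectivec
// Diagnostic sketch: periodically log the export session's state so that
// a silent stall at least leaves a trail in the console.
- (void) logExportState
{
    NSLog(@"Export status: %ld, progress: %.2f, error: %@",
          (long)m_activeExport.status,
          m_activeExport.progress,
          m_activeExport.error);
}

// Scheduled with something like:
// [NSTimer scheduledTimerWithTimeInterval:1.0 target:self
//     selector:@selector(logExportState) userInfo:nil repeats:YES];
```

Even with this in place, the session below sits at zero progress with a nil error.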

I don't even know where to begin looking to find out what the issue is. Here's the function I use to compose the clips together:

- (void) setupAndStitchVideos:(NSMutableArray*)videoData
{
    // Filepath to where the final generated video is stored
    NSURL                       *   exportUrl           = nil;
    // Contains information about a single asset/track
    NSDictionary                *   assetOptions        = nil;
    AVURLAsset                  *   currVideoAsset      = nil;
    AVURLAsset                  *   currAudioAsset      = nil;
    AVAssetTrack                *   currVideoTrack      = nil;
    AVAssetTrack                *   currAudioTrack      = nil;
    // Contains all tracks and time ranges used to build the final composition
    NSMutableArray              *   allVideoTracks      = nil;
    NSMutableArray              *   allVideoRanges      = nil;
    NSMutableArray              *   allAudioTracks      = nil;
    NSMutableArray              *   allAudioRanges      = nil;

    AVMutableCompositionTrack   *   videoTracks         = nil;
    AVMutableCompositionTrack   *   audioTracks         = nil;
    // Misc time values used when calculating a clips start time and total length
    float                           animationLength     = 0.0f;
    float                           clipLength          = 0.0f;
    float                           startTime           = 0.0f;
    CMTime                          clipStart           = kCMTimeZero;
    CMTime                          clipDuration        = kCMTimeZero;
    CMTimeRange                     currRange           = kCMTimeRangeZero;
    // The final composition to be generated and exported
    AVMutableComposition        *   finalComposition    = nil;

    // Cancel any already active exports
    if (m_activeExport)
    {
        [m_activeExport cancelExport];
        m_activeExport = nil;
    }

    // Initialize and setup all composition related member variables
    allVideoTracks      = [[NSMutableArray alloc] init];
    allAudioTracks      = [[NSMutableArray alloc] init];
    allVideoRanges      = [[NSMutableArray alloc] init];
    allAudioRanges      = [[NSMutableArray alloc] init];
    exportUrl           = [NSURL fileURLWithPath:[MobveoAnimation getMergeDestination]];
    finalComposition    = [AVMutableComposition composition];
    videoTracks         = [finalComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    audioTracks         = [finalComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    assetOptions        = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
    animationLength     = m_animation.videoDuration;

    // Define all of the audio and video tracks that will be used in the composition
    for (NSDictionary * currData in videoData)
    {
        currVideoAsset  = [AVURLAsset URLAssetWithURL:[currData objectForKey:KEY_STITCH_VIDEO_URL] options:assetOptions];
        currAudioAsset  = [AVURLAsset URLAssetWithURL:[currData objectForKey:KEY_STITCH_AUDIO_URL] options:assetOptions];
        currVideoTrack  = [[currVideoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

        NSArray *audioTracks = [currAudioAsset tracksWithMediaType:AVMediaTypeAudio];
        if ( audioTracks != nil && audioTracks.count > 0 )
        {
            currAudioTrack  = audioTracks[0];
        }
        else
        {
            currAudioTrack = nil;
        }

        clipLength      = animationLength * [(NSNumber*)[currData objectForKey:KEY_STITCH_LENGTH_PERCENTAGE] floatValue];
        clipStart       = CMTimeMakeWithSeconds(startTime, currVideoAsset.duration.timescale);
        clipDuration    = CMTimeMakeWithSeconds(clipLength, currVideoAsset.duration.timescale);

        NSLog(@"Clip length: %.2f", clipLength);
        NSLog(@"Clip Start: %lld", clipStart.value );
        NSLog(@"Clip duration: %lld", clipDuration.value);

        currRange       = CMTimeRangeMake(clipStart, clipDuration);
        [allVideoTracks addObject:currVideoTrack];

        if ( currAudioTrack != nil )
        {
            [allAudioTracks addObject:currAudioTrack];
            [allAudioRanges addObject:[NSValue valueWithCMTimeRange:currRange]];
        }

        [allVideoRanges addObject:[NSValue valueWithCMTimeRange:currRange]];
        startTime       += clipLength;
    }
    [videoTracks insertTimeRanges:allVideoRanges ofTracks:allVideoTracks atTime:kCMTimeZero error:nil];

    if ( allAudioTracks.count > 0 )
    {
        [audioTracks insertTimeRanges:allAudioRanges ofTracks:allAudioTracks atTime:kCMTimeZero error:nil];
    }
    for ( int i = 0; i < allVideoTracks.count - allAudioTracks.count; ++i )
    {
        CMTimeRange curRange = [allVideoRanges[i] CMTimeRangeValue];
        [audioTracks insertEmptyTimeRange:curRange];
    }

    // Delete any previous exported video files that may already exist
    [[NSFileManager defaultManager] removeItemAtURL:exportUrl error:nil];

    // Begin the composition generation and export process!
    m_activeExport = [[AVAssetExportSession alloc] initWithAsset:finalComposition presetName:AVAssetExportPreset1280x720];
    [m_activeExport setOutputFileType:AVFileTypeQuickTimeMovie];
    [m_activeExport setOutputURL:exportUrl];
    NSLog(@"Exporting async");
    [m_activeExport exportAsynchronouslyWithCompletionHandler:^(void)
     {
         NSLog(@"Export complete");
         // Cancel the update timer
         [m_updateTimer invalidate];
         m_updateTimer = nil;

         // Dismiss the displayed dialog
         [m_displayedDialog hide:TRUE];
         m_displayedDialog = nil;

         // Re-enable touch events
         [[UIApplication sharedApplication] endIgnoringInteractionEvents];

         // Report the success/failure result
         switch (m_activeExport.status)
         {
             case AVAssetExportSessionStatusFailed:
                 [self performSelectorOnMainThread:@selector(videoStitchingFailed:) withObject:m_activeExport.error waitUntilDone:FALSE];
                 break;
             case AVAssetExportSessionStatusCompleted:
                 [self performSelectorOnMainThread:@selector(videoStitchingComplete:) withObject:m_activeExport.outputURL waitUntilDone:FALSE];
                 break;
         }

         // Clear our reference to the completed export
         m_activeExport = nil;
     }];
}

EDIT:

Thanks to Josh in the comments, I noticed there were error parameters I wasn't making use of. In the failing case I now get the ever-so-useful "Operation could not be completed" error when inserting the time ranges of the video tracks:

NSError *videoError = nil;
[videoTracks insertTimeRanges:allVideoRanges ofTracks:allVideoTracks atTime:kCMTimeZero error:&videoError];

if ( videoError != nil )
{
    NSLog(@"Error adding video track: %@", videoError);
}

Output:

Error adding video track: Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo=0x17426dd00 {NSUnderlyingError=0x174040cc0 "The operation couldn’t be completed. (OSStatus error -12780.)", NSLocalizedFailureReason=An unknown error occurred (-12780), NSLocalizedDescription=The operation could not be completed}

It is worth noting, however, that nowhere in this entire codebase is `URLWithString:` used instead of `fileURLWithPath:`, so that isn't the problem.
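For anyone hitting the same -11800/-12780 pair: one sanity check worth adding (an assumption about a possible cause, not a confirmed diagnosis) is whether each requested range actually lies within its source track, since inserting a range that runs past the end of an asset can fail this way. A sketch that would slot into the loop above, using the `currVideoTrack` and `currRange` variables from the question:

```objectivec
// Sketch: verify the requested clip range lies within the source track
// before handing it to insertTimeRanges:ofTracks:atTime:error:.
CMTimeRange trackRange = currVideoTrack.timeRange;
if (!CMTimeRangeContainsTimeRange(trackRange, currRange))
{
    NSLog(@"Clip range %@ falls outside track range %@",
          (NSString *)CFBridgingRelease(CMTimeRangeCopyDescription(NULL, currRange)),
          (NSString *)CFBridgingRelease(CMTimeRangeCopyDescription(NULL, trackRange)));
}
```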

Dan F
  • @JoshCaswell thanks for pointing that out. I'm not the original author of this code; I tend not to ignore the error parameters of these kinds of functions. You can see in my edit the error that I am actually receiving now – Dan F Aug 05 '15 at 19:13
  • Are you attempting to live stream the export session? – ChrisHaze Aug 11 '15 at 10:44
  • @ChrisHaze No, the way the app works is the user records a series of clips, and then once that is done all the clips are composed into a single long video – Dan F Aug 11 '15 at 10:59
  • I didn't think so, but I just wanted to be sure. I have an idea as to what the issue is, and I will try to give some accurate details in my answer – ChrisHaze Aug 11 '15 at 11:26

1 Answer


Judging from your for-in enumeration of the videoData array after you've initialized the composition member variables, it looks as if you're blocking the calling thread. Although accessing each AVAssetTrack instance is permitted, the values for its keys are not always immediately available, and accessing them synchronously can block.

Instead, try loading the values asynchronously via the AVAsynchronousKeyValueLoading protocol. Apple's documentation should help you straighten out the issue and get you on your way!
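A minimal sketch of that approach, loading the "tracks" and "duration" keys up front (identifiers taken from the question's code; error handling abbreviated):

```objectivec
// Sketch: ask the asset to load the keys asynchronously, then build the
// composition once every value is ready, instead of blocking on the first
// synchronous property access.
NSArray *keys = @[@"tracks", @"duration"];
[currVideoAsset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
    NSError *error = nil;
    AVKeyValueStatus status = [currVideoAsset statusOfValueForKey:@"tracks"
                                                            error:&error];
    if (status == AVKeyValueStatusLoaded)
    {
        // Safe to call tracksWithMediaType: and read duration here.
    }
    else
    {
        NSLog(@"Failed to load tracks: %@", error);
    }
}];
```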

Here are a few more Apple recommendations I've aggregated for AVFoundation (originally posted as a screenshot, which is no longer available).

Hopefully this will do the trick! Good luck and let me know if you have any further questions/problems.

ChrisHaze
  • I don't follow, what do you see that isn't multi-thread safe? – Dan F Aug 11 '15 at 12:25
  • I'm not entirely sure what was going wrong before, but it turned out that I was mistaken about some of the clips not working, and some working (shouldn't leave it up to the client to give me a full and accurate QA report). I ended up adding in a call to `loadValuesAsynchronouslyForKeys` and that miraculously fixed it. – Dan F Aug 12 '15 at 18:56
  • Apple's AVFoundation SDK isn't the most straight forward framework and I honestly couldn't tell you exactly why you were running into the problem either. I had come across a similar issue when building a video sharing concept and had to resort back to the WWDC video from 2011 where they referenced a few blocking issues. Glad it worked! - will I be rewarded the bounty? – ChrisHaze Aug 12 '15 at 19:19
  • 1
    I just wanna put the fix through the wringer today then once i'm sure its the right fix I'll award the bounty – Dan F Aug 13 '15 at 12:23
  • I've built mind maps of the entire AVFoundation framework. If you are familiar with Xmind (or theBrain) and want to check them out, in hopes of explaining the hangups, just let me know and I will email them to you. – ChrisHaze Aug 13 '15 at 13:17