
I am trying to create a video from an array of images.

I created a video, but I have a problem with the presentation time, i.e., the CMTime.

I am using the following code to create the video:

   int frameCount = 1;

    // Adding images here to buffer
    for( int i = 0; i<[ fileArray count]; i++ )
    {
        // Create Pool
        NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

        NSString *imageTag = [fileArray objectAtIndex:i];
        // Create file path
        NSString *imgPath = [folderPath stringByAppendingPathComponent:imageTag];
        // Get image
        UIImage *img = [self getImageFromPath:imgPath];
        buffer = [self pixelBufferFromCGImage:[img CGImage] size:size];

        BOOL append_ok = NO;
        int j = 0;

        // Try up to 30 times if appending fails
        while ( !append_ok && j < 30 )
        {
            if (adaptor.assetWriterInput.readyForMoreMediaData)
            {
                printf("appending %d attempt %d\n", frameCount, j);

                NSTimeInterval duration = 7.0;
                if ( [mArrAudioFileNames objectAtIndex:i] != [NSNull null] )
                {
                    // Get Audio file
                    NSString *docsDir = [[self dataFolderPathForAudio]
                                         stringByAppendingPathComponent:
                                         [mArrAudioFileNames objectAtIndex:i]];

                    NSURL *soundFileURL = [NSURL fileURLWithPath:docsDir];

                    // Create AudioPlayer
                    NSError *error;
                    AVAudioPlayer  *audioPlayer = [[AVAudioPlayer alloc]
                                                   initWithContentsOfURL:soundFileURL
                                                   error:&error];

                    // Get Audio duration
                    duration = [audioPlayer duration];
                    [audioPlayer release];
                }

                // frameCount/1 = frameCount seconds, so frames are 1 s apart
                CMTime frameTime = CMTimeMake(frameCount, (int32_t)1);
                append_ok = [adaptor appendPixelBuffer:buffer
                                  withPresentationTime:frameTime];

                [NSThread sleepForTimeInterval:0.05];
            }
            else
            {
                printf("adaptor not ready %d, %d\n", frameCount, j);
                [NSThread sleepForTimeInterval:0.1];
            }
            j++;
        }
        if (!append_ok) {
            printf("error appending image %d times %d\n", frameCount, j);
            isError = YES;
        }
        frameCount++;
        CVBufferRelease(buffer);

        // drain the pool
        [pool drain];
    }

    [videoWriterInput markAsFinished];
    [videoWriter finishWriting];

    [videoWriterInput release];
    [videoWriter release];

This creates a video of 7 seconds if I have 7 images in an array. That means each image plays for 1 second.

My question is: how can I make a video from the same set of images (say 7), where each image's display time in the video is different?

For example, the first image plays for 9 seconds of the total video duration, the second for 5 seconds, and the third for 20 seconds.

Thanks in advance.

  • `[mArrAudioFileNames objectAtIndex:i] != [NSNull null]` does not do what you think it does. –  Jun 13 '13 at 06:40
  • @H2CO3 Actually I have some audio files too, so I want to get the duration of the audio file at that index and show the image for that audio file's duration. That is, if the audio file at index i is 10 seconds long, the image at that index should show for 10 seconds in the created video. – subhash Amale Jun 13 '13 at 07:06

0 Answers