After recording a video, I want to overlay it with a dynamic UIView that is updated according to the current video frame timestamp. In fact, I am trying to do the same thing as the Vidometer application. Following Apple's sample code, I am able to extract each video frame as a pixel buffer and overlay it with a UIImage rendered from my UIView.

The steps are:

  1. Extract video frame:

    CMSampleBufferRef sampleBuffer = [assetReaderVideoOutput copyNextSampleBuffer];
    // copyNextSampleBuffer returns NULL once the reader has vended every sample
    if (cancelled || sampleBuffer == NULL) {
        completedOrFailed = YES;
        [assetWriterVideoInput markAsFinished];
    } else {
        // CVPixelBufferRef is a typedef of CVImageBufferRef, so no cast is needed
        CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        // ... steps 2-5 run here ...
    }
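
    For reference, this assumes the reader output was configured to vend BGRA pixel buffers, which is what Core Image expects; a minimal sketch of that setup (asset is the recorded movie and assetReader the AVAssetReader, both assumed names):

    NSDictionary *readerSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    assetReaderVideoOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack
                                                                        outputSettings:readerSettings];
    [assetReader addOutput:assetReaderVideoOutput];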
    
  2. Update my UIView's subviews according to the frame timestamp:

    // presentation timestamp in seconds, passed to the widget in milliseconds
    mergeTime = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer));
    [self updateWidget:mergeTime * 1000];
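
    Note that when the read/write loop runs on a background queue (as in Apple's sample), the UIKit work in steps 2 and 3 has to be pushed to the main thread; a sketch, assuming the loop is off the main queue:

    __block UIImage *overlayImage = nil;
    dispatch_sync(dispatch_get_main_queue(), ^{
        [self updateWidget:mergeTime * 1000];     // move the subviews for this timestamp
        overlayImage = [self imageFromSurcouche]; // and snapshot them on the main thread
    });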
    
  3. Get the UIImage of my UIView:

    mSurcoucheImage = [self imageFromSurcouche];
    

    With

    -(UIImage*)imageFromSurcouche{
        CGSize mSize = CGSizeMake(self.mSurcouche.bounds.size.width, self.mSurcouche.bounds.size.height);
        UIGraphicsBeginImageContextWithOptions(mSize, NO, 0.0);
        if (videoOrientation == UIInterfaceOrientationLandscapeRight) {
            CGContextTranslateCTM(UIGraphicsGetCurrentContext(), mSize.width, mSize.height);
            CGContextRotateCTM(UIGraphicsGetCurrentContext(), M_PI);
        }
        [self.mSurcouche.layer renderInContext:UIGraphicsGetCurrentContext()];
        UIImage* image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return image;
    }
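
    As a possible optimization (an assumption on my part, not something I have profiled here), on iOS 7 and later drawViewHierarchyInRect:afterScreenUpdates: is usually faster than renderInContext: for snapshotting a view:

    UIGraphicsBeginImageContextWithOptions(mSize, NO, 0.0);
    // afterScreenUpdates:NO avoids forcing an extra layout and commit pass
    [self.mSurcouche drawViewHierarchyInRect:self.mSurcouche.bounds afterScreenUpdates:NO];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();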
    
  4. Apply a filter that composites my UIImage over the frame while keeping its alpha:

    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    inputImage = [CIImage imageWithCVPixelBuffer:pixelBuffer options:options];
    filter = [CIFilter filterWithName:@"CISourceOverCompositing"];
    [filter setValue:maskImage forKey:kCIInputImageKey];
    [filter setValue:inputImage forKey:kCIInputBackgroundImageKey];
    outputImage = [filter outputImage];
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    

    With

    colorSpace = CGColorSpaceCreateDeviceRGB();
    options = [NSDictionary dictionaryWithObject:(__bridge id)colorSpace forKey:kCIImageColorSpace];
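
    maskImage is the Core Image wrapper around the snapshot from step 3; a minimal version of that conversion would be:

    maskImage = [CIImage imageWithCGImage:mSurcoucheImage.CGImage];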
    
  5. Render my UIImage into the video frame buffer:

    // reuse the colorSpace created in step 4; calling CGColorSpaceCreateDeviceRGB() here would leak one color space per frame
    [ciContext render:outputImage toCVPixelBuffer:pixelBuffer bounds:[outputImage extent] colorSpace:colorSpace];
    

    With

    eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    ciContext = [CIContext contextWithEAGLContext:eaglContext options:@{kCIContextWorkingColorSpace : [NSNull null]}];
    if (eaglContext != [EAGLContext currentContext])
        [EAGLContext setCurrentContext:eaglContext];
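
    Because the result is rendered back into the same pixelBuffer, the loop then hands the original sample buffer (with its original timestamp) to the writer, roughly like this, reusing the names from step 1:

    if ([assetWriterVideoInput isReadyForMoreMediaData]) {
        [assetWriterVideoInput appendSampleBuffer:sampleBuffer];
    }
    CFRelease(sampleBuffer); // copyNextSampleBuffer follows the Create rule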
    

I am trying to match the performance of the Vidometer application, which processes the overlay in less time than the video lasts, but with this method I am far from that.

Question 1: Is this the best-performing method?

Question 2: I also see another method that uses AVMutableComposition, but I don't think I can synchronize my UIView with the video frame timestamps. Can I?
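
For reference, the AVMutableComposition approach I mean drives the overlay with a CALayer through AVVideoCompositionCoreAnimationTool; a minimal sketch (videoSize is an assumed name, and the widget would have to be expressed as CAAnimations keyed to AVCoreAnimationBeginTimeAtZero, i.e. to the video timeline rather than wall-clock time):

    CALayer *overlayLayer = [CALayer layer]; // would host the widget's animated sublayers
    overlayLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);

    CALayer *videoLayer = [CALayer layer];
    videoLayer.frame = overlayLayer.frame;
    CALayer *parentLayer = [CALayer layer];
    parentLayer.frame = overlayLayer.frame;
    [parentLayer addSublayer:videoLayer];
    [parentLayer addSublayer:overlayLayer];

    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool
        videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer
                                                                inLayer:parentLayer];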
