
I'm using this code to create a movie from different UIImages with an AVAssetWriter. The code works great, but the problem is that the alpha channel is gone when I add the images to the writer. I can't figure out whether the alpha doesn't exist in the CVPixelBufferRef or whether the AVAssetWriter isn't able to process it.

My end result isn't supposed to be a movie with an alpha channel, but multiple images layered on top of each other and merged into a movie file. I can put images on top of other images in a single frame, but all the images (pixel buffers) have a black background...

- (CVPixelBufferRef) pixelBufferFromCGImage: (CGImageRef) image andSize:(CGSize) size {
    @autoreleasepool {
        NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                                 [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                                 [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                                 nil];
        CVPixelBufferRef pxbuffer = NULL;

        CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width,
                                              size.height, kCVPixelFormatType_32ARGB,
                                              (__bridge CFDictionaryRef) options,
                                              &pxbuffer);
        NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

        CVPixelBufferLockBaseAddress(pxbuffer, 0);
        void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
        NSParameterAssert(pxdata != NULL);

        CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height, 8,
                                                     4 * size.width, rgbColorSpace,
                                                     (CGBitmapInfo)kCGImageAlphaPremultipliedFirst);

        NSParameterAssert(context);
        CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
        CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
                                               CGImageGetHeight(image)), image);
        CGColorSpaceRelease(rgbColorSpace);
        CGContextRelease(context);

        CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

        return pxbuffer;
    }
}
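
For reference, one way to tell whether the alpha actually survives into the buffer is to inspect the raw bytes after drawing. A minimal sketch, assuming the kCVPixelFormatType_32ARGB layout used above (the helper name AlphaAtPixel is just for illustration):

static uint8_t AlphaAtPixel(CVPixelBufferRef pxbuffer, size_t x, size_t y) {
    // Read one pixel back out of the buffer; in 32ARGB the first byte of
    // each 4-byte pixel is the alpha component.
    CVPixelBufferLockBaseAddress(pxbuffer, kCVPixelBufferLock_ReadOnly);
    const uint8_t *base = CVPixelBufferGetBaseAddress(pxbuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pxbuffer);
    uint8_t alpha = base[y * bytesPerRow + x * 4];
    CVPixelBufferUnlockBaseAddress(pxbuffer, kCVPixelBufferLock_ReadOnly);
    return alpha;
}

If this returns 255 for a pixel that should be transparent, the alpha is already lost while drawing into the context; if it returns 0, the buffer is fine and the loss happens later in the writer.
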
Michiel Timmerman
  • This is a shot in the dark, but what happens if you use `kCGImageAlphaFirst` instead of `kCGImageAlphaPremultipliedFirst` ? – Martin R Jan 19 '14 at 17:27
  • Yes, I've tried all combinations. Some don't work and others need a fix in the CVPixelBufferCreate call. If I use ARGB and change the pixel format to kCVPixelFormatType_32BGRA, the colors of the image are swapped but there is still no transparency – Michiel Timmerman Jan 19 '14 at 18:11
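
For what it's worth, the channel swap described in the last comment is what one would expect when the buffer's pixel format and the bitmap context's flags disagree. A hedged sketch of a matching BGRA pairing, written as a drop-in change to the two calls in the question's method (same variables as there):

// With a BGRA buffer the context must be little-endian with alpha first,
// otherwise the red and blue channels come out swapped when drawing.
CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width, size.height,
                                      kCVPixelFormatType_32BGRA,
                                      (__bridge CFDictionaryRef) options, &pxbuffer);
...
CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height, 8,
                                             CVPixelBufferGetBytesPerRow(pxbuffer),
                                             rgbColorSpace,
                                             (CGBitmapInfo)kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little);

That fixes the swapped colors, but as the answer below explains, it still won't produce a transparent movie, because the alpha is discarded at encode time.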

1 Answer


This is never going to work because H.264 does not support an alpha channel. You cannot encode a movie with an alpha channel using the built-in iOS logic, end of story. It is possible to composite layers before encoding, though. It is also possible to encode with a 3rd party library that does support an alpha channel. See this question for more info.
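
If flattening before encoding is acceptable, it can be done in the same bitmap context that fills the pixel buffer: draw the background frame first, then draw each overlay with alpha on top, and hand the already-composited frame to the writer. A minimal sketch along the lines of the question's method (the method name and the separate background parameter are assumptions for illustration):

// Sketch: flatten a background frame and an overlay with alpha into one opaque
// buffer before handing it to the AVAssetWriterInputPixelBufferAdaptor.
- (CVPixelBufferRef)pixelBufferByCompositingImage:(CGImageRef)overlay
                                   overBackground:(CGImageRef)background
                                             size:(CGSize)size {
    NSDictionary *options = @{ (__bridge id)kCVPixelBufferCGImageCompatibilityKey : @YES,
                               (__bridge id)kCVPixelBufferCGBitmapContextCompatibilityKey : @YES };
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, (size_t)size.width, (size_t)size.height,
                                          kCVPixelFormatType_32ARGB,
                                          (__bridge CFDictionaryRef)options, &pxbuffer);
    if (status != kCVReturnSuccess || pxbuffer == NULL) {
        return NULL;
    }

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(pxbuffer),
                                                 (size_t)size.width, (size_t)size.height, 8,
                                                 CVPixelBufferGetBytesPerRow(pxbuffer),
                                                 rgbColorSpace,
                                                 (CGBitmapInfo)kCGImageAlphaPremultipliedFirst);

    CGRect frame = CGRectMake(0, 0, size.width, size.height);
    // Draw the background first, then the overlay on top; the overlay's alpha is
    // resolved here, so the encoded frame no longer needs an alpha channel.
    CGContextDrawImage(context, frame, background);
    CGContextDrawImage(context, frame, overlay);

    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
    return pxbuffer;
}

Each UIImage's CGImage can be layered this way in a single frame; because everything is blended down to an opaque frame before it reaches the writer, the black background disappears even though H.264 itself still carries no alpha.
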

MoDJ