9

I'm sure something's wrong with my buffer attributes, but it's not clear to me what -- it's not well documented what's supposed to go there, so I'm guessing based on CVPixelBufferPoolCreate -- and Core Foundation is pretty much a closed book to me.

    // "width" and "height" are const ints
    CFNumberRef cfWidth = CFNumberCreate(kCFAllocatorDefault, kCFNumberIntType, &width);
    CFNumberRef cfHeight = CFNumberCreate(kCFAllocatorDefault, kCFNumberIntType, &height);

    CFStringRef keys[] = {
        kCVPixelBufferWidthKey,
        kCVPixelBufferHeightKey,
        kCVPixelBufferCGImageCompatibilityKey
    };
    CFTypeRef values[] = {
        cfWidth,
        cfHeight,
        kCFBooleanTrue
    };
    int numValues = sizeof(keys) / sizeof(keys[0]);

    CFDictionaryRef bufferAttributes = CFDictionaryCreate(kCFAllocatorDefault,
                                                          (const void **)keys,
                                                          (const void **)values,
                                                          numValues,
                                                          &kCFTypeDictionaryKeyCallBacks,
                                                          &kCFTypeDictionaryValueCallBacks);

    AVAssetWriterInputPixelBufferAdaptor *adaptor = [[AVAssetWriterInputPixelBufferAdaptor 
                                                      assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                      sourcePixelBufferAttributes:(NSDictionary*)bufferAttributes] retain];
    CVPixelBufferPoolRef bufferPool = adaptor.pixelBufferPool;
    NSParameterAssert(bufferPool != NULL); // fails
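
For reference, pixel buffer pools generally also want a pixel format in the attributes; here is a sketch of the same dictionary built as an NSDictionary (toll-free bridged to CFDictionaryRef), with an assumed kCVPixelFormatType_32BGRA format added:

    // Sketch only: equivalent attributes as an NSDictionary, with an assumed
    // pixel format added. Pass this as the adaptor's sourcePixelBufferAttributes.
    NSDictionary *bufferAttributes = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], (NSString *)kCVPixelBufferPixelFormatTypeKey,
        [NSNumber numberWithInt:width],                     (NSString *)kCVPixelBufferWidthKey,
        [NSNumber numberWithInt:height],                    (NSString *)kCVPixelBufferHeightKey,
        [NSNumber numberWithBool:YES],                      (NSString *)kCVPixelBufferCGImageCompatibilityKey,
        nil];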
David Moles
  • Hi, I have the same problem, did you find a solution? – TheRonin May 18 '11 at 16:16
  • Not really. I'm just creating a pixel buffer for every frame instead of using the pool. :( – David Moles May 18 '11 at 19:43
  • Ok, we have found the same solution. Thanks! – TheRonin May 19 '11 at 07:06
  • Hi @DavidMoles, did you find a solution, or do you have working code for "creating a pixel buffer for every frame"? – Iraniya Naynesh Sep 22 '19 at 15:06
  • @IraniyaNaynesh Sorry, I haven't looked at this in years. But the [docs for pixelBufferPool](https://developer.apple.com/documentation/avfoundation/avassetwriterinputpixelbufferadaptor/1389662-pixelbufferpool) now say “This property is NULL before the first call to startSessionAtTime: on the associated AVAssetWriter object.” So maybe that was the issue? – David Moles Sep 22 '19 at 17:53

6 Answers

18

When the pixelBufferPool returns NULL, check the following:

    1. the output file of the AVAssetWriter doesn't already exist (a sketch of this and of point 2 follows the list).
    2. only use the pixel buffer pool after calling startSessionAtSourceTime: on the AVAssetWriter.
    3. the settings of the AVAssetWriterInput and AVAssetWriterInputPixelBufferAdaptor are correct.
    4. the presentation times passed to appendPixelBuffer:withPresentationTime: are not all the same.
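
A minimal sketch of points 1 and 2, assuming writerInput and adaptor have already been created and outputURL points at the destination file:

    // Sketch only: remove any leftover output file, otherwise startWriting
    // can fail and the adaptor's pool stays NULL. "outputURL", "writerInput"
    // and "adaptor" are assumed to exist already.
    [[NSFileManager defaultManager] removeItemAtURL:outputURL error:NULL];

    NSError *error = nil;
    AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL
                                                     fileType:AVFileTypeQuickTimeMovie
                                                        error:&error];
    [writer addInput:writerInput];

    [writer startWriting];
    [writer startSessionAtSourceTime:kCMTimeZero];

    // Only after the session has started should the pool be non-NULL.
    NSAssert(adaptor.pixelBufferPool != NULL, @"pixelBufferPool is still NULL");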
roamer
  • #1 did the trick for me too. Such a simple solution! I was really looking too deep. BTW, my failure was that CVPixelBufferPoolCreatePixelBuffer was returning -6661. – VaporwareWolf Mar 24 '16 at 21:14
  • #1 did it for me, too. Thanks! – Trevor Alyn May 23 '16 at 21:39
  • how did you guys fix this with number one? I'm using NSHomeDirectory in swift. Can someone post a github example? – rocky raccoon Apr 17 '17 at 19:35
  • #1 for me as well. – Rocket Garden Aug 30 '18 at 21:46
  • I don't understand #1 - Why would a file have to exist? If I'm recording a new video, it may not exist. I've never had to "create" the file first on any other app I've made. --- UPDATE --- you will get the error if the file DOES exist and you are trying to write over it. Try renaming your destination file. – JCutting8 Jun 14 '20 at 03:03
3

I had the same problem, and I think it is possibly because you have not configured your AVAssetWriterInput correctly. My pool started working after I had done this. In particular, the pool would not give me pixel buffers unless I had provided data for AVVideoCompressionPropertiesKey. First, create and fully configure the AVAssetWriter (look in /Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS4.3.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVVideoSettings.h for the keys and values for outputSettings and compressionSettings):

NSError * err = nil;
AVAssetWriter * outputWriter = [AVAssetWriter
    assetWriterWithURL: [NSURL fileURLWithPath:outputPath]
              fileType: AVFileTypeAppleM4V
                 error: & err];

NSMutableDictionary * outputSettings
    = [[NSMutableDictionary alloc] init];
[outputSettings setObject: AVVideoCodecH264
                   forKey: AVVideoCodecKey];
[outputSettings setObject: [NSNumber numberWithInt: width_]
                   forKey: AVVideoWidthKey];
[outputSettings setObject: [NSNumber numberWithInt: height_]
                   forKey: AVVideoHeightKey];

NSMutableDictionary * compressionProperties
    = [[NSMutableDictionary alloc] init];
[compressionProperties setObject: [NSNumber numberWithInt: 1000000]
                          forKey: AVVideoAverageBitRateKey];
[compressionProperties setObject: [NSNumber numberWithInt: 16]
                          forKey: AVVideoMaxKeyFrameIntervalKey];
[compressionProperties setObject: AVVideoProfileLevelH264Main31
                          forKey: AVVideoProfileLevelKey];

[outputSettings setObject: compressionProperties
                   forKey: AVVideoCompressionPropertiesKey];

AVAssetWriterInput * writerInput = [AVAssetWriterInput
    assetWriterInputWithMediaType: AVMediaTypeVideo
                   outputSettings: outputSettings];

[compressionProperties release];
[outputSettings release];

Create the pixel buffer adaptor:

NSMutableDictionary * pixBufSettings = [[NSMutableDictionary alloc] init];
[pixBufSettings setObject: [NSNumber numberWithInt: kCVPixelFormatType_32BGRA]
                   forKey: (NSString *) kCVPixelBufferPixelFormatTypeKey];
[pixBufSettings setObject: [NSNumber numberWithInt: width_]
                   forKey: (NSString *) kCVPixelBufferWidthKey];
[pixBufSettings setObject: [NSNumber numberWithInt: height_]
                   forKey: (NSString *) kCVPixelBufferHeightKey];

AVAssetWriterInputPixelBufferAdaptor * outputPBA =
    [AVAssetWriterInputPixelBufferAdaptor
    assetWriterInputPixelBufferAdaptorWithAssetWriterInput: writerInput
                               sourcePixelBufferAttributes: pixBufSettings];

[pixBufSettings release];

Then retrieve pixel buffers from its pool using:

CVReturn res = CVPixelBufferPoolCreatePixelBuffer (NULL
    , [outputPBA pixelBufferPool]
    , & outputFrame);
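
Once CVPixelBufferPoolCreatePixelBuffer succeeds, fill the buffer and hand it to the adaptor; a rough sketch of that step (the frameNumber counter and the 30 fps timescale are assumed values, not part of the original code):

    // Sketch: append a pool-allocated buffer with an explicit presentation time.
    if (res == kCVReturnSuccess && outputFrame != NULL)
    {
        // ... lock the buffer and draw or copy your pixels into outputFrame ...

        CMTime presentationTime = CMTimeMake(frameNumber, 30);   // frameNumber is assumed
        if (![outputPBA appendPixelBuffer: outputFrame
                     withPresentationTime: presentationTime])
            NSLog(@"appendPixelBuffer failed: %@", outputWriter.error);

        CVPixelBufferRelease(outputFrame);
    }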
2

According to the documentation:

"This property is NULL before the first call to startSessionAtTime: on the associated AVAssetWriter object."

So if you're trying to access the pool too early, it will be NULL. I'm just learning this stuff myself so I can't really elaborate at the moment.

Bored Astronaut
  • What do you mean by "too early"? I can assure you that I call CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, _pixelBufferAdaptor.pixelBufferPool, &pixelBuffer) after calling the methods that are supposed to be called on the assetWriter, and still it keeps returning an error because apparently no buffer pool has been created – HepaKKes May 14 '14 at 14:36
1

For everyone still looking for a solution: first, make sure your AVAssetWriter is actually working by checking its status. I had this problem, and after checking the status I found that, although I had called start, the writer just hadn't started yet. (In my case, I had pointed the output path at an existing file; after deleting it, everything worked like a charm.)
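
A quick way to see why the pool is NULL is to inspect the writer right after starting it; a small sketch (the videoWriter and adaptor names are assumptions):

    // Sketch: diagnose a NULL pixelBufferPool by checking the writer itself.
    // "videoWriter" and "adaptor" are assumed to be configured already.
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    if (videoWriter.status == AVAssetWriterStatusFailed) {
        // Typical culprit: the output file already exists.
        NSLog(@"writer failed: %@", videoWriter.error);
    } else if (adaptor.pixelBufferPool == NULL) {
        NSLog(@"writer status = %ld but the pool is still NULL", (long)videoWriter.status);
    }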

MatthewLuiHK
0

It works when there is no existing file at the AVAssetWriter's outputURL.

extension FileManager {
    func removeItemIfExist(at url: URL) {
        do {
            if FileManager.default.fileExists(atPath: url.path) {
                try FileManager.default.removeItem(at: url)
            }
        } catch {
            fatalError("\(error)")
        }
    }
}

Usage

let assetWriter = try? AVAssetWriter(outputURL: outputURL, fileType: .mov)
FileManager.default.removeItemIfExist(at: outputURL)
// do something
Changnam Hong
0

I got it all working! With the compatibility options set in the dictionary, it's apparently possible to use the buffer pool. Here is a working sample with code that writes without the pool, but it's a good place to start.

Here is the sample code link

Here is the code you need:

- (void) testCompressionSession
{
    CGSize size = CGSizeMake(480, 320);

    NSString *betaCompressionDirectory = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];

    NSError *error = nil;

    // Remove any file already at the output path, otherwise the writer cannot start.
    unlink([betaCompressionDirectory UTF8String]);

    //----initialize compression engine
    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:betaCompressionDirectory]
                                                           fileType:AVFileTypeQuickTimeMovie
                                                              error:&error];
    NSParameterAssert(videoWriter);
    if (error)
        NSLog(@"error = %@", [error localizedDescription]);

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                                   [NSNumber numberWithInt:size.height], AVVideoHeightKey, nil];
    AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];

    NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
                                                           [NSNumber numberWithInt:kCVPixelFormatType_32ARGB], (NSString *)kCVPixelBufferPixelFormatTypeKey, nil];

    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                                                                                     sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];
    NSParameterAssert(writerInput);
    NSParameterAssert([videoWriter canAddInput:writerInput]);

    if ([videoWriter canAddInput:writerInput])
        NSLog(@"I can add this input");
    else
        NSLog(@"I can't add this input");

    [videoWriter addInput:writerInput];

    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    //---
    // insert demo debugging code to write the same image repeated as a movie

    CGImageRef theImage = [[UIImage imageNamed:@"Lotus.png"] CGImage];

    dispatch_queue_t dispatchQueue = dispatch_queue_create("mediaInputQueue", NULL);
    __block int frame = 0;

    [writerInput requestMediaDataWhenReadyOnQueue:dispatchQueue usingBlock:^{
        while ([writerInput isReadyForMoreMediaData])
        {
            if (++frame >= 120)
            {
                [writerInput markAsFinished];
                [videoWriter finishWriting];
                [videoWriter release];
                break;
            }

            CVPixelBufferRef buffer = (CVPixelBufferRef)[self pixelBufferFromCGImage:theImage size:size];
            if (buffer)
            {
                if (![adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(frame, 20)])
                    NSLog(@"FAIL");
                else
                    NSLog(@"Success:%d", frame);
                CFRelease(buffer);
            }
        }
    }];

    NSLog(@"outside for loop");
}


- (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image size:(CGSize)size
{
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], (NSString *)kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], (NSString *)kCVPixelBufferCGBitmapContextCompatibilityKey, nil];
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width, size.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef)options, &pxbuffer);
    // CVReturn status = CVPixelBufferPoolCreatePixelBuffer(NULL, adaptor.pixelBufferPool, &pxbuffer);

    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    // Use the buffer's actual bytes-per-row rather than assuming 4 * width,
    // since Core Video may pad rows for alignment.
    CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height, 8,
                                                 CVPixelBufferGetBytesPerRow(pxbuffer), rgbColorSpace,
                                                 kCGImageAlphaPremultipliedFirst);
    NSParameterAssert(context);

    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)), image);

    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;
}
Jasper
Orbitus007
  • Could you explain which pixelBuffer this solution is supposed to use? – HepaKKes May 14 '14 at 14:24
  • Apple should fire the whole department that writes their awful, appalling, horrible and disgusting incomplete, vague, misleading documentation. They are a disgrace. – Duck Dec 18 '16 at 03:35