I'm processing raw YUV 4:2:0 bi-planar frame data that I receive over the network, and I need to create a CVPixelBuffer from it so that I can process it with Core Image and also write it to disk using AVAssetWriter.
When I create a CVPixelBuffer with the code below using a width such as 120, 240, or 480, it allocates memory and produces a CVPixelBuffer with the expected bytesPerRow for both planes (for example, a width of 120 yields a bytesPerRow of 120).
However, when I input a frame with a width such as 90, 180, or 360, it produces an unexpected bytesPerRow, e.g. 192 for a frame width of 180. This causes drawing issues later in Core Image and AVAssetWriter.
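The 192 value looks like row padding rather than a wrong answer: Core Video typically rounds each plane's bytesPerRow up to a hardware-friendly alignment, and 192 is the next multiple of 64 above 180. A minimal C sketch of that round-up, with the alignment value as an assumption since the real one is device-specific:

```c
#include <stddef.h>

/* Round a tight row length up to the next multiple of `alignment`.
   Core Video is assumed to do something equivalent internally; the
   actual alignment is device-specific and should not be hard-coded. */
static size_t padded_bytes_per_row(size_t tightWidth, size_t alignment) {
    return (tightWidth + alignment - 1) / alignment * alignment;
}
```

With a 64-byte alignment this maps 180 to 192, matching the observed value. (The reported 120 → 120 case suggests the real rule is more nuanced than a single fixed alignment, which is another reason to always query `CVPixelBufferGetBytesPerRowOfPlane` rather than assume bytesPerRow == width.)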
Please see the code below used to create the CVPixelBuffer:
CGSize frameSize = CGSizeMake(180, 240);
CVPixelBufferRef pixelBuffer = NULL;
NSDictionary *pixelAttributes = @{
    (id)kCVPixelBufferIOSurfaceOpenGLESFBOCompatibilityKey : (id)kCFBooleanTrue,
    (id)kCVPixelBufferIOSurfaceCoreAnimationCompatibilityKey : (id)kCFBooleanTrue,
    (id)kCVPixelBufferIOSurfaceOpenGLESTextureCompatibilityKey : (id)kCFBooleanTrue,
    (id)kCVPixelBufferOpenGLESCompatibilityKey : (id)kCFBooleanTrue
};
CVReturn result = CVPixelBufferCreate(NULL,
                                      frameSize.width,
                                      frameSize.height,
                                      kCVPixelFormatType_420YpCbCr8BiPlanarFullRange,
                                      (__bridge CFDictionaryRef)pixelAttributes,
                                      &pixelBuffer);
Please note that I cannot use CVPixelBufferCreateWithPlanarBytes, which would force me to allocate the memory myself and later causes a memory leak when used with Core Image; that leak is not the subject of this question.
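Whatever bytesPerRow Core Video chooses, the usual remedy is to treat it as opaque: lock the buffer, query `CVPixelBufferGetBytesPerRowOfPlane` for each plane, and copy each tightly packed network row at that stride instead of doing one big memcpy. A plain-C sketch of the per-row copy (the function name, and the use of raw arrays in place of locked plane base addresses, are illustrative assumptions):

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Copy a tightly packed plane (source stride == width) into a
   destination whose rows are padded to dstBytesPerRow, as returned
   by CVPixelBufferGetBytesPerRowOfPlane for a locked pixel buffer.
   Bytes between width and dstBytesPerRow are left untouched. */
static void copy_plane(uint8_t *dst, size_t dstBytesPerRow,
                       const uint8_t *src, size_t width, size_t height) {
    for (size_t row = 0; row < height; row++) {
        memcpy(dst + row * dstBytesPerRow, src + row * width, width);
    }
}
```

In the real code, `dst` would come from `CVPixelBufferGetBaseAddressOfPlane` after `CVPixelBufferLockBaseAddress`, and `dstBytesPerRow` from `CVPixelBufferGetBytesPerRowOfPlane`; the same loop is run once for the luma plane and once for the interleaved CbCr plane (with half the height).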