I have created a ReplayKit Broadcast Extension, so the maximum amount of memory I can use is 50 MB.

I am taking samples of the broadcast stream and sending those images with a CFMessagePortSendRequest call. Since that function only accepts CFData, I need to convert my multi-plane image to Data.

NSKeyedArchiver.archivedData(withRootObject:) seems to exceed this 50 MB limit. Breaking on the line before the call, I see a memory consumption of about 6 MB; then, as soon as the archiving call executes, my extension crashes because it exceeds the memory limit.

Is there a less memory-hungry way to convert the CIImage of a CVPixelBuffer to Data? And then back, of course.
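
For reference, the send side looks roughly like this (a minimal sketch; the port name and message ID are placeholders, and it assumes the host app has registered a local port with the same name):

#import <Foundation/Foundation.h>

// Sends one frame's bytes to the host app over a CFMessagePort.
// "group.example.broadcast.port" and the message ID 1 are placeholders.
static void SendFrameData(NSData *frameData) {
    CFMessagePortRef port = CFMessagePortCreateRemote(kCFAllocatorDefault,
                                                      CFSTR("group.example.broadcast.port"));
    if (port == NULL) {
        return; // host app is not listening yet
    }
    SInt32 status = CFMessagePortSendRequest(port,
                                             1,                             // message ID
                                             (__bridge CFDataRef)frameData,
                                             1.0,                           // send timeout (s)
                                             0.0,                           // receive timeout
                                             NULL,                          // no reply expected
                                             NULL);
    if (status != kCFMessagePortSuccess) {
        NSLog(@"CFMessagePortSendRequest failed: %d", (int)status);
    }
    CFRelease(port);
}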

iSpain17

1 Answer


I was able to convert a CMSampleBufferRef to NSData in the following way. This method uses about 1-5 MB of RAM. I hope this solves your problem.

CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBufferType);

// Lock the buffer before reading its base address, and unlock it again when done.
CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);

UInt8 *bap0 = CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
size_t bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, 0);
size_t height = CVPixelBufferGetHeightOfPlane(imageBuffer, 0);
NSData *data = [NSData dataWithBytes:bap0 length:bytesPerRow * height];

CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
kutay
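
Note that the snippet above only copies plane 0; a multi-plane buffer (ReplayKit typically delivers biplanar 420 YCbCr) needs every plane appended. A minimal sketch of that extension, assuming a planar pixel format (the helper name is mine, not from the answer):

#import <Foundation/Foundation.h>
#import <CoreVideo/CoreVideo.h>

// Serializes every plane of a planar CVPixelBuffer into one NSData blob.
static NSData *DataFromPixelBuffer(CVPixelBufferRef pixelBuffer) {
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    NSMutableData *data = [NSMutableData data];
    size_t planeCount = CVPixelBufferGetPlaneCount(pixelBuffer);
    for (size_t plane = 0; plane < planeCount; plane++) {
        const void *base = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, plane);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, plane);
        size_t height = CVPixelBufferGetHeightOfPlane(pixelBuffer, plane);
        [data appendBytes:base length:bytesPerRow * height];
    }

    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    return data;
}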
  • As mentioned in the question, I have a multi-plane image, but yes, this is the way I had to go; I just had to read all of the planes' bytes. – iSpain17 Feb 13 '20 at 08:13
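
For the reverse direction the question asks about, one option is to recreate a pixel buffer on the receiving side and copy each plane back in. A rough sketch, assuming the width, height, and pixel format are sent alongside the bytes and that the receiver's bytes-per-row matches the sender's (otherwise the rows have to be copied one by one):

#import <Foundation/Foundation.h>
#import <CoreVideo/CoreVideo.h>
#include <string.h>

// Rebuilds a CVPixelBuffer from the serialized plane bytes.
// The caller releases the returned buffer with CVPixelBufferRelease.
static CVPixelBufferRef PixelBufferFromData(NSData *data, size_t width,
                                            size_t height, OSType pixelFormat) {
    CVPixelBufferRef pixelBuffer = NULL;
    if (CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                            pixelFormat, NULL, &pixelBuffer) != kCVReturnSuccess) {
        return NULL;
    }

    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    const UInt8 *src = (const UInt8 *)data.bytes;
    size_t planeCount = CVPixelBufferGetPlaneCount(pixelBuffer);
    for (size_t plane = 0; plane < planeCount; plane++) {
        UInt8 *dst = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, plane);
        size_t length = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, plane) *
                        CVPixelBufferGetHeightOfPlane(pixelBuffer, plane);
        memcpy(dst, src, length);
        src += length;
    }
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    return pixelBuffer;
}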