Questions tagged [cmsamplebufferref]
71 questions
1
vote
1 answer
Dispatch_async in captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer, EXC_BAD_ACCESS error on sample buffer
I am trying to get the sample buffer from captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer, process it, and then append it to an AVAssetWriter. The whole code works; however, it gets really slow and I get low fps on older devices.…

reapf
- 89
- 6
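A minimal sketch of one common fix for the crash described above, assuming the heavy work really has to leave the capture queue. CMSampleBufferRef is a Core Foundation type, so ARC does not retain it for the dispatched block; it must be retained explicitly (self.processingQueue and self.writerInput are illustrative names, not from the question).

#import <AVFoundation/AVFoundation.h>

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // The buffer is only guaranteed to stay alive for the duration of this
    // callback, so retain it before it escapes into an async block.
    CFRetain(sampleBuffer);
    dispatch_async(self.processingQueue, ^{
        if (self.writerInput.isReadyForMoreMediaData) {
            [self.writerInput appendSampleBuffer:sampleBuffer];
        }
        CFRelease(sampleBuffer); // balance the CFRetain above
    });
}

Holding on to too many buffers starves the capture pipeline's internal pool, which by itself causes the slowdown described, so the async work should finish quickly or drop frames.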
1
vote
0 answers
How to increase the speed of audio in CMSampleBuffer
What is the process to speed up audio received via CMSampleBuffer so that it is recorded at 2X speed?

Abhiraj Kumar
- 160
- 1
- 6
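One hedged approach, not taken from the question, is to rewrite each buffer's timing before appending it to an AVAssetWriterInput: dividing the timestamps and durations by 2 records the audio at 2X, although it also raises the pitch; pitch-preserving speed-up needs actual processing of the PCM data.

#import <CoreMedia/CoreMedia.h>

// Sketch: copy `sampleBuffer` with timestamps and durations scaled by
// 1/speed (speed = 2.0 for 2X). Assumes timestamps were already rebased to
// start at zero. The caller owns the returned buffer.
static CMSampleBufferRef CreateRetimedBuffer(CMSampleBufferRef sampleBuffer, Float64 speed)
{
    CMItemCount count = 0;
    CMSampleBufferGetSampleTimingInfoArray(sampleBuffer, 0, NULL, &count);

    CMSampleTimingInfo *timing = malloc(sizeof(CMSampleTimingInfo) * count);
    CMSampleBufferGetSampleTimingInfoArray(sampleBuffer, count, timing, &count);

    for (CMItemCount i = 0; i < count; i++) {
        timing[i].presentationTimeStamp =
            CMTimeMultiplyByFloat64(timing[i].presentationTimeStamp, 1.0 / speed);
        timing[i].duration = CMTimeMultiplyByFloat64(timing[i].duration, 1.0 / speed);
    }

    CMSampleBufferRef retimed = NULL;
    CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, sampleBuffer,
                                          count, timing, &retimed);
    free(timing);
    return retimed;
}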
1
vote
1 answer
How can we use AVSampleBufferDisplayLayer to render CMSampleBufferRef?
I have this delegate method
-(void)airPlayServer:(id)server sampleBufferReceived:(CMSampleBufferRef)sampleBuffer
{
}
which gives me sampleBuffer.
Now I need to know how I can use AVSampleBufferDisplayLayer to render my sampleBuffer. I know we have…

Abid Mehmood
- 145
- 1
- 9
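A hedged sketch of the usual pattern, assuming the buffers carry a valid format description and timing (self.displayLayer is an assumed property):

#import <AVFoundation/AVFoundation.h>

// One-time setup, e.g. in viewDidLoad.
- (void)setUpDisplayLayer
{
    self.displayLayer = [AVSampleBufferDisplayLayer layer];
    self.displayLayer.frame = self.view.bounds;
    self.displayLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    [self.view.layer addSublayer:self.displayLayer];
}

// The delegate method from the question: hand each buffer to the layer.
- (void)airPlayServer:(id)server sampleBufferReceived:(CMSampleBufferRef)sampleBuffer
{
    if (self.displayLayer.status == AVQueuedSampleBufferRenderingStatusFailed) {
        [self.displayLayer flush]; // recover from a decode error
    }
    if (self.displayLayer.isReadyForMoreMediaData) {
        [self.displayLayer enqueueSampleBuffer:sampleBuffer];
    }
}

For immediate display the buffers either need sensible presentation timestamps plus a control timebase, or the kCMSampleAttachmentKey_DisplayImmediately attachment set to true.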
1
vote
1 answer
CMBlockBuffer ownership in CMSampleBuffer
I'm writing code to decompress a native annex-B H.264 stream, and I'm going through the process of parsing the stream, creating a CMVideoFormatDescription from the SPS/PPS NALUs, and wrapping the other NALUs I extract from the stream in…

Jim B.
- 4,512
- 3
- 25
- 53
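For context, a hedged sketch of the wrapping step the question describes. With kCFAllocatorNull as the block allocator the CMBlockBuffer neither copies nor frees the NALU bytes, so the original backing memory must outlive the sample buffer; passing a real allocator instead makes the block buffer take ownership (it will free the bytes, but still without copying them).

#import <CoreMedia/CoreMedia.h>

// Wrap one AVCC-framed NALU (4-byte length prefix already written over the
// Annex-B start code) in a CMSampleBuffer. `formatDesc` comes from
// CMVideoFormatDescriptionCreateFromH264ParameterSets on the SPS/PPS.
static CMSampleBufferRef CreateSampleBufferForNALU(uint8_t *naluData, size_t naluLength,
                                                   CMVideoFormatDescriptionRef formatDesc)
{
    CMBlockBufferRef blockBuffer = NULL;
    // kCFAllocatorNull: no copy, no ownership; naluData must stay valid for
    // the lifetime of the returned sample buffer.
    CMBlockBufferCreateWithMemoryBlock(kCFAllocatorDefault,
                                       naluData, naluLength,
                                       kCFAllocatorNull, NULL,
                                       0, naluLength, 0, &blockBuffer);

    CMSampleBufferRef sampleBuffer = NULL;
    const size_t sampleSize = naluLength;
    CMSampleBufferCreate(kCFAllocatorDefault, blockBuffer, true,
                         NULL, NULL, formatDesc,
                         1,        // one sample
                         0, NULL,  // timing supplied elsewhere in this sketch
                         1, &sampleSize,
                         &sampleBuffer);

    CFRelease(blockBuffer); // the sample buffer retains it
    return sampleBuffer;
}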
1
vote
0 answers
CMSampleBufferRef parameter in delegate callback is empty
I am using this code to create an AVCaptureSession, but when -(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection is called, the sampleBuffer size is…

Heisenbug
- 951
- 6
- 25
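When the buffer appears empty, a quick inspection of what the callback actually delivered usually narrows the problem down to either the output's videoSettings or the session configuration; a small diagnostic sketch:

#import <AVFoundation/AVFoundation.h>

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (imageBuffer == NULL) {
        NSLog(@"No image buffer: check the output's videoSettings pixel format");
        return;
    }
    NSLog(@"valid=%d dataReady=%d %zux%zu",
          CMSampleBufferIsValid(sampleBuffer),
          CMSampleBufferDataIsReady(sampleBuffer),
          CVPixelBufferGetWidth(imageBuffer),
          CVPixelBufferGetHeight(imageBuffer));
}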
1
vote
1 answer
Resizing CMSampleBufferRef provided by captureStillImageBracketAsynchronouslyFromConnection:withSettingsArray:completionHandler:
In the app I'm working on, we're capturing photos which need to have a 4:3 aspect ratio in order to maximize the field of view we capture. Up until now we were using the AVCaptureSessionPreset640x480 preset, but now we need a larger resolution.
As…

Ilija
- 151
- 11
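A hedged Core Image sketch for the resizing part, assuming a BGRA destination buffer is acceptable (the scale factor and helper name are illustrative):

#import <CoreImage/CoreImage.h>
#import <CoreMedia/CoreMedia.h>

// Scale the image in `sampleBuffer` by `scale` into a new CVPixelBuffer.
// Reuse the CIContext; creating one per frame is expensive.
static CVPixelBufferRef CreateScaledPixelBuffer(CMSampleBufferRef sampleBuffer,
                                                CGFloat scale, CIContext *context)
{
    CVImageBufferRef source = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *image = [CIImage imageWithCVPixelBuffer:source];

    CIFilter *lanczos = [CIFilter filterWithName:@"CILanczosScaleTransform"];
    [lanczos setValue:image forKey:kCIInputImageKey];
    [lanczos setValue:@(scale) forKey:kCIInputScaleKey];
    [lanczos setValue:@(1.0) forKey:kCIInputAspectRatioKey];
    CIImage *scaled = lanczos.outputImage;

    CVPixelBufferRef destination = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault,
                        (size_t)scaled.extent.size.width,
                        (size_t)scaled.extent.size.height,
                        kCVPixelFormatType_32BGRA,
                        (__bridge CFDictionaryRef)@{(id)kCVPixelBufferIOSurfacePropertiesKey : @{}},
                        &destination);
    [context render:scaled toCVPixelBuffer:destination];
    return destination; // caller releases with CVPixelBufferRelease
}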
1
vote
2 answers
Creating `CMSampleBufferRef` from a .mov file
My app captures a 3-second video clip, and programmatically I want to create a 15-second clip from the recorded 3-second clip by looping it 5 times. Finally, I have to save the 15-second clip to the Camera Roll.
I have got my 3 sec video clip via…

itsji10dra
- 4,603
- 3
- 39
- 59
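A hedged sketch of the composition route, which avoids touching CMSampleBufferRefs at all: insert the 3-second asset into an AVMutableComposition five times and export the result (sourceURL and exportURL are placeholders).

#import <AVFoundation/AVFoundation.h>

AVURLAsset *clip = [AVURLAsset URLAssetWithURL:sourceURL options:nil];
AVMutableComposition *composition = [AVMutableComposition composition];
CMTimeRange fullRange = CMTimeRangeMake(kCMTimeZero, clip.duration);

// Append the 3-second clip five times, back to back.
NSError *error = nil;
for (int i = 0; i < 5; i++) {
    [composition insertTimeRange:fullRange
                         ofAsset:clip
                          atTime:composition.duration
                           error:&error];
}

AVAssetExportSession *export =
    [AVAssetExportSession exportSessionWithAsset:composition
                                      presetName:AVAssetExportPresetHighestQuality];
export.outputURL = exportURL;
export.outputFileType = AVFileTypeQuickTimeMovie;
[export exportAsynchronouslyWithCompletionHandler:^{
    // On AVAssetExportSessionStatusCompleted, save exportURL to the Camera
    // Roll (e.g. via PHPhotoLibrary).
}];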
1
vote
1 answer
Video stream in AVSampleBufferDisplayLayer doesn't show up in screenshot
I've been using the new Video Toolbox methods to take an H.264 video stream and display it in a view controller using AVSampleBufferDisplayLayer. This all works as intended and the stream looks great. However, when I try to take a screenshot of…

Olivia Stork
- 4,660
- 5
- 27
- 40
1
vote
2 answers
iOS memory building up while creating UIImage from CMSampleBufferRef
I'm creating UIImage objects from CMSampleBufferRefs. I'm doing this in a separate queue (in the background), so I'm wrapping the processing in an @autoreleasepool. The problem is that memory is building up without any leak notification. Below is…

Mihai
- 768
- 2
- 6
- 18
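A hedged sketch of a conversion that releases everything it creates; in this situation the build-up is often a CGImageRef or CGContextRef without a matching release, which @autoreleasepool alone cannot reclaim because they are not Objective-C objects.

#import <UIKit/UIKit.h>
#import <CoreMedia/CoreMedia.h>

// Assumes the capture output delivers kCVPixelFormatType_32BGRA buffers.
static UIImage *UIImageFromSampleBuffer(CMSampleBufferRef sampleBuffer)
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context =
        CGBitmapContextCreate(CVPixelBufferGetBaseAddress(pixelBuffer),
                              CVPixelBufferGetWidth(pixelBuffer),
                              CVPixelBufferGetHeight(pixelBuffer),
                              8,
                              CVPixelBufferGetBytesPerRow(pixelBuffer),
                              colorSpace,
                              kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage *image = [UIImage imageWithCGImage:cgImage];

    // Core Graphics objects are not covered by @autoreleasepool; every
    // Create/Copy needs an explicit release.
    CGImageRelease(cgImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    return image;
}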
1
vote
1 answer
Converting CVImageBufferRef YUV 420 to cv::Mat RGB and displaying it in a CALayer?
Since I can't successfully get anything other than YCbCr 420 from the camera (https://stackoverflow.com/questions/19673770/objective-c-avcapturevideodataoutput-videosettings-how-to-identify-which-pix),
my goal is to display this…

Eduardo Reis
- 1,691
- 1
- 22
- 45
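One hedged route (Objective-C++, assuming OpenCV is linked) is to let Core Image do the YCbCr-to-BGRA conversion on the GPU and only then hand the pixels to OpenCV, instead of converting the two planes by hand:

#import <CoreImage/CoreImage.h>
#import <opencv2/opencv.hpp>

static cv::Mat MatFromImageBuffer(CVImageBufferRef imageBuffer, CIContext *context)
{
    size_t width  = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Render the 420 YpCbCr buffer into a BGRA buffer.
    CVPixelBufferRef bgra = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                        kCVPixelFormatType_32BGRA, NULL, &bgra);
    [context render:[CIImage imageWithCVPixelBuffer:imageBuffer] toCVPixelBuffer:bgra];

    CVPixelBufferLockBaseAddress(bgra, kCVPixelBufferLock_ReadOnly);
    cv::Mat wrapped((int)height, (int)width, CV_8UC4,
                    CVPixelBufferGetBaseAddress(bgra),
                    CVPixelBufferGetBytesPerRow(bgra));
    cv::Mat owned = wrapped.clone(); // own the pixels before the buffer goes away
    CVPixelBufferUnlockBaseAddress(bgra, kCVPixelBufferLock_ReadOnly);
    CVPixelBufferRelease(bgra);
    return owned;
}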
1
vote
0 answers
PixelBuffer not being freed when using AVAssetWriter
I'm using an AVAssetWriter with a single video input to write video frames to a movie.
The mechanics of the write loop aren't a problem, but I'm finding that memory management is.
With regard to the CMSampleBufferRefs I'm appending:
a) I create a…

Neil Clayton
- 100
- 9
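A hedged sketch of the pool-based pattern that usually keeps memory flat: draw each frame into a buffer recycled from the adaptor's pool and release it right after appending (writerInput and presentationTime are assumed to exist already).

#import <AVFoundation/AVFoundation.h>

// Setup, once per recording. Note: adaptor.pixelBufferPool is nil until
// [assetWriter startWriting] has been called.
AVAssetWriterInputPixelBufferAdaptor *adaptor =
    [AVAssetWriterInputPixelBufferAdaptor
        assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
        sourcePixelBufferAttributes:@{
            (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)
        }];

// Per frame: take a pooled buffer, render into it, append, release.
CVPixelBufferRef pixelBuffer = NULL;
CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                   adaptor.pixelBufferPool, &pixelBuffer);
// ... draw the frame into pixelBuffer ...
if (writerInput.isReadyForMoreMediaData) {
    [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:presentationTime];
}
CVPixelBufferRelease(pixelBuffer); // the writer keeps what it still needs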
0
votes
1 answer
How to implement a CMSampleBuffer for MLkit facial detection?
Basically, I'm trying to create a simple real-time facial recognition iOS app that streams the user's face and tells them whether their eyes are closed. I'm following the Google tutorial here -…

Austin R.
- 25
- 5
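A loosely hedged sketch along the lines of the Firebase ML Kit face-detection documentation the question points at; exact class and enum names vary between ML Kit versions, and self.faceDetector is an assumed, already-configured detector.

@import Firebase;

- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Wrap the capture buffer for ML Kit and attach the camera orientation.
    FIRVisionImage *visionImage = [[FIRVisionImage alloc] initWithBuffer:sampleBuffer];
    FIRVisionImageMetadata *metadata = [[FIRVisionImageMetadata alloc] init];
    metadata.orientation = FIRVisionDetectorImageOrientationRightTop; // depends on device/UI orientation
    visionImage.metadata = metadata;

    [self.faceDetector processImage:visionImage
                         completion:^(NSArray<FIRVisionFace *> *faces, NSError *error) {
        for (FIRVisionFace *face in faces) {
            // With classification enabled, these probabilities drive the
            // "eyes closed" check.
            NSLog(@"left eye open p=%f right eye open p=%f",
                  face.leftEyeOpenProbability, face.rightEyeOpenProbability);
        }
    }];
}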
0
votes
1 answer
iOS Broadcast extension crash randomly when I try to convert buffer samples to UIImage
I am trying to develop a broadcast extension to do screen recording. My approach is to get the CMSampleBufferRef from the extension, check its type, convert it to a UIImage if it is video, and then share it with the container app via MMWormhole. So far I am able to do it. But…

kutay
- 11
- 4
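For reference, a hedged sketch of the SampleHandler side. Broadcast upload extensions run under a tight memory cap (on the order of 50 MB), so the conversion should happen inside an autorelease pool and frames should be downscaled or skipped aggressively; self.wormhole and the downscaling helper are illustrative names.

#import <ReplayKit/ReplayKit.h>
#import <UIKit/UIKit.h>

// In the RPBroadcastSampleHandler subclass.
- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer
                   withType:(RPSampleBufferType)sampleBufferType
{
    if (sampleBufferType != RPSampleBufferTypeVideo) {
        return; // ignore audio for this pipeline
    }
    @autoreleasepool {
        // Keep everything short-lived so the extension stays under its limit.
        UIImage *frame = [self downscaledImageFromSampleBuffer:sampleBuffer];
        NSData *payload = UIImageJPEGRepresentation(frame, 0.5);
        [self.wormhole passMessageObject:payload identifier:@"frame"];
    }
}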
0
votes
1 answer
Image from CMSampleBufferRef is always white
I am trying to get each frame from ReplayKit using startCaptureWithHandler.
startCaptureWithHandler returns a CMSampleBufferRef which I need to convert to an image.
I'm using this method to convert to a UIImage, but it's always white.
- (UIImage *)…

Marwan Harb
- 31
- 1
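A common cause of the all-white output is treating the buffer as BGRA when ReplayKit delivers biplanar YCbCr; a hedged sketch that lets Core Image handle the pixel format instead:

#import <CoreImage/CoreImage.h>
#import <UIKit/UIKit.h>

// Reuse a single CIContext; creating one per frame is expensive.
static UIImage *ImageFromSampleBuffer(CMSampleBufferRef sampleBuffer, CIContext *context)
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) {
        return nil;
    }
    // CIImage understands the 420 YpCbCr formats ReplayKit produces, so no
    // manual plane handling is required.
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:ciImage.extent];
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return image;
}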
0
votes
0 answers
Using CIImage or CGImage as direct input videotoolbox
I'm trying to build a feature that applies a filter to frames captured from the camera and streams them over the network.
Because of poor performance when applying the filter to the CMSampleBuffer or CVPixelBuffer on the CPU side, I'm trying to convert data from the CMSampleBuffer (raw from the camera…

Hanh Nguyen
- 1
- 1
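VTCompressionSession only accepts CVPixelBuffers, so the usual route is to render the filtered CIImage into a pixel buffer drawn from the session's own pool and encode that; a hedged sketch, assuming the session and a GPU-backed CIContext are already configured:

#import <VideoToolbox/VideoToolbox.h>
#import <CoreImage/CoreImage.h>

static void EncodeFilteredImage(VTCompressionSessionRef session,
                                CIContext *ciContext,
                                CIImage *filteredImage,
                                CMTime presentationTime,
                                CMTime duration)
{
    CVPixelBufferRef pixelBuffer = NULL;
    CVPixelBufferPoolRef pool = VTCompressionSessionGetPixelBufferPool(session);
    CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &pixelBuffer);

    // GPU render of the filtered image straight into the encoder's buffer,
    // avoiding a CPU-side copy.
    [ciContext render:filteredImage toCVPixelBuffer:pixelBuffer];

    VTCompressionSessionEncodeFrame(session, pixelBuffer,
                                    presentationTime, duration,
                                    NULL, NULL, NULL);
    CVPixelBufferRelease(pixelBuffer);
}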