Questions tagged [cmsamplebuffer]
106 questions
5
votes
1 answer
How to properly address the pixels in a CVPixelBuffer?
The short question is: What's the formula to address pixel values in a CVPixelBuffer?
I'm trying to convert a CVPixelBuffer into a flat byte array and noticed a few odd things: The CVPixelBuffer is obtained from a CMSampleBuffer. Its width and…

SePröbläm
- 5,142
- 6
- 31
- 45
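The usual answer to the addressing question is that a pixel's byte offset is `y * bytesPerRow + x * bytesPerPixel`, where `bytesPerRow` comes from the buffer itself and is often larger than `width * bytesPerPixel` because of row padding. A minimal sketch, assuming a non-planar `kCVPixelFormatType_32BGRA` buffer (the `pixel(at:)` helper name is illustrative):

```swift
import CoreVideo

// Read one pixel from a kCVPixelFormatType_32BGRA buffer.
// Never compute the offset as y * width * 4: bytesPerRow includes
// per-row padding and is the only correct stride.
func pixel(at x: Int, _ y: Int,
           in pixelBuffer: CVPixelBuffer) -> (b: UInt8, g: UInt8, r: UInt8, a: UInt8)? {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return nil }
    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
    let offset = y * bytesPerRow + x * 4   // 4 bytes per BGRA pixel
    let p = base.advanced(by: offset).assumingMemoryBound(to: UInt8.self)
    return (p[0], p[1], p[2], p[3])
}
```

Planar formats (e.g. bi-planar YCbCr) need the per-plane variants (`CVPixelBufferGetBaseAddressOfPlane`, `CVPixelBufferGetBytesPerRowOfPlane`) with the same formula applied per plane.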
5
votes
3 answers
Creating copy of CMSampleBuffer in Swift returns OSStatus -12743 (Invalid Media Format)
I am attempting to perform a deep clone of a CMSampleBuffer to store the output of an AVCaptureSession. I am receiving the error kCMSampleBufferError_InvalidMediaFormat (OSStatus -12743) when I call CMSampleBufferCreateForImageBuffer. I…

Rob
- 4,149
- 5
- 34
- 48
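-12743 typically means the format description passed in does not describe the image buffer being wrapped, which happens when the original buffer's description is reused for a freshly created copy. A sketch of one way around it, assuming a non-planar pixel buffer with matching row layout (planar formats would need a per-plane copy):

```swift
import AVFoundation

// Deep-copy a video CMSampleBuffer: copy the pixels into a new
// CVPixelBuffer, then wrap it with a format description created
// *for the copy* (reusing the source's description triggers -12743).
func deepCopy(_ sampleBuffer: CMSampleBuffer) -> CMSampleBuffer? {
    guard let src = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }

    CVPixelBufferLockBaseAddress(src, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(src, .readOnly) }

    var copy: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault,
                        CVPixelBufferGetWidth(src),
                        CVPixelBufferGetHeight(src),
                        CVPixelBufferGetPixelFormatType(src),
                        nil, &copy)
    guard let dst = copy else { return nil }

    // Assumes both buffers share the same bytes-per-row; a safer
    // version copies row by row using each buffer's own stride.
    CVPixelBufferLockBaseAddress(dst, [])
    memcpy(CVPixelBufferGetBaseAddress(dst),
           CVPixelBufferGetBaseAddress(src),
           CVPixelBufferGetDataSize(src))
    CVPixelBufferUnlockBaseAddress(dst, [])

    var format: CMVideoFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                 imageBuffer: dst,
                                                 formatDescriptionOut: &format)
    var timing = CMSampleTimingInfo()
    CMSampleBufferGetSampleTimingInfo(sampleBuffer, at: 0, timingInfoOut: &timing)

    var result: CMSampleBuffer?
    CMSampleBufferCreateReadyWithImageBuffer(allocator: kCFAllocatorDefault,
                                             imageBuffer: dst,
                                             formatDescription: format!,
                                             sampleTiming: &timing,
                                             sampleBufferOut: &result)
    return result
}
```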
5
votes
1 answer
CMSampleBuffer to byte-array in Swift
I'm trying to implement a video stream for a Multipeer Connectivity app. The captured frame is compressed by VTCompressionSession and my callback is being called.
Now my CMSampleBuffer contains a CMBlockBuffer and I can extract the NALUs etc.…

JeWol
- 51
- 4
4
votes
0 answers
Convert from CMSampleBuffer to NSData image without exceeding 50MB of memory
I exceed the memory allowance when converting a CMSampleBuffer to NSData in an RPBroadcastSampleHandler.
Error code: Thread 7: EXC_RESOURCE RESOURCE_TYPE_MEMORY (limit = 50 MB, unused = 0x0)
I am processing the function of converting a…

thachonline
- 51
- 4
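Broadcast upload extensions run under a hard ~50 MB limit, so full-size frame copies accumulate fast. One hedged sketch of how to stay under it: do all per-frame work inside an `autoreleasepool` and downscale/compress before materialising anything as `Data` (the 0.5 scale and the `jpegData(from:)` name are illustrative choices, not from the question):

```swift
import ReplayKit
import CoreImage

let ciContext = CIContext()

// Convert a frame to compact JPEG data without letting full-size
// bitmaps pile up: the autoreleasepool drains per frame, and the
// frame is halved before encoding.
func jpegData(from sampleBuffer: CMSampleBuffer) -> Data? {
    autoreleasepool {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
        let image = CIImage(cvPixelBuffer: pixelBuffer)
            .transformed(by: CGAffineTransform(scaleX: 0.5, y: 0.5))
        guard let colorSpace = CGColorSpace(name: CGColorSpace.sRGB) else { return nil }
        return ciContext.jpegRepresentation(of: image,
                                            colorSpace: colorSpace,
                                            options: [:])
    }
}
```

Reusing a single `CIContext` matters here: creating one per frame is itself a significant memory cost.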
4
votes
1 answer
Appropriately Release Buffers From AVCaptureDataOutputSynchronizerDelegate Due to Out Of Buffers
I am using AVCaptureDataOutputSynchronizerDelegate to handle capturing data for video, depth and metadata
private let videoDataOutput = AVCaptureVideoDataOutput()
private let depthDataOutput = AVCaptureDepthDataOutput()
private let…

impression7vx
- 1,728
- 1
- 20
- 50
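The "out of buffers" condition usually means sample buffers are being retained past the delegate callback: the capture pipeline recycles a small fixed pool, and holding references starves it. A sketch of the shape of a delegate that avoids this, assuming the outputs shown in the excerpt (the class name is illustrative):

```swift
import AVFoundation

final class SyncedCaptureDelegate: NSObject, AVCaptureDataOutputSynchronizerDelegate {
    let videoDataOutput = AVCaptureVideoDataOutput()
    let depthDataOutput = AVCaptureDepthDataOutput()

    func dataOutputSynchronizer(_ synchronizer: AVCaptureDataOutputSynchronizer,
                                didOutput collection: AVCaptureSynchronizedDataCollection) {
        guard let syncedVideo = collection.synchronizedData(for: videoDataOutput)
                as? AVCaptureSynchronizedSampleBufferData,
              !syncedVideo.sampleBufferWasDropped else { return }

        // Do the work (or copy out the data you need) here, inside the
        // callback, instead of stashing syncedVideo.sampleBuffer in a
        // property — retained buffers starve the capture pool.
        let sampleBuffer = syncedVideo.sampleBuffer
        _ = CMSampleBufferGetImageBuffer(sampleBuffer)
    }
}
```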
4
votes
1 answer
When reading frames from a video on an iPad with AVAssetReader, the images are not properly oriented
A few things I want to establish first:
This works properly on multiple iPhones (iOS 10.3 & 11.x)
This works properly on any iPad simulator (iOS 11.x)
What I am left with is a situation where when I run the following code (condensed from my…

CodeBender
- 35,668
- 12
- 125
- 132
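The usual explanation for this iPad-vs-iPhone difference is that AVAssetReader returns frames in the track's natural (sensor) orientation, and the recording's rotation is carried separately in the track's `preferredTransform`. A minimal sketch of applying it per frame (the helper name is illustrative):

```swift
import AVFoundation
import CoreImage

// Orient a decoded frame by applying the video track's
// preferredTransform, which encodes the recording rotation that
// AVAssetReader does not apply for you.
func orientedImage(for sampleBuffer: CMSampleBuffer, track: AVAssetTrack) -> CIImage? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
    return CIImage(cvPixelBuffer: pixelBuffer)
        .transformed(by: track.preferredTransform)
}
```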
4
votes
1 answer
iOS: convert UIImage to CMSampleBuffer
Some questions address how to convert a CMSampleBuffer to a UIImage, but there are no answers on how to do the reverse, i.e., convert UIImage to CMSampleBuffer.
This question is different from similar ones because the code below provides a starting…

Crashalot
- 33,605
- 61
- 269
- 439
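The reverse conversion has two steps: render the UIImage into a fresh CVPixelBuffer with a CGContext, then wrap that buffer in a CMSampleBuffer. A hedged sketch, leaving the timing info mostly `.invalid` (real code would supply presentation timestamps):

```swift
import AVFoundation
import UIKit

func sampleBuffer(from image: UIImage) -> CMSampleBuffer? {
    guard let cgImage = image.cgImage else { return nil }
    let width = cgImage.width, height = cgImage.height

    // 1. Render the image into a new pixel buffer.
    var pixelBufferOut: CVPixelBuffer?
    let attrs = [kCVPixelBufferCGImageCompatibilityKey: true,
                 kCVPixelBufferCGBitmapContextCompatibilityKey: true] as CFDictionary
    CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                        kCVPixelFormatType_32ARGB, attrs, &pixelBufferOut)
    guard let pixelBuffer = pixelBufferOut else { return nil }

    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    let context = CGContext(data: CVPixelBufferGetBaseAddress(pixelBuffer),
                            width: width, height: height, bitsPerComponent: 8,
                            bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                            space: CGColorSpaceCreateDeviceRGB(),
                            bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)
    context?.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))
    CVPixelBufferUnlockBaseAddress(pixelBuffer, [])

    // 2. Wrap the pixel buffer in a sample buffer.
    var format: CMVideoFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                 imageBuffer: pixelBuffer,
                                                 formatDescriptionOut: &format)
    var timing = CMSampleTimingInfo(duration: .invalid,
                                    presentationTimeStamp: .zero,
                                    decodeTimeStamp: .invalid)
    var result: CMSampleBuffer?
    CMSampleBufferCreateReadyWithImageBuffer(allocator: kCFAllocatorDefault,
                                             imageBuffer: pixelBuffer,
                                             formatDescription: format!,
                                             sampleTiming: &timing,
                                             sampleBufferOut: &result)
    return result
}
```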
4
votes
1 answer
CIContext render: toCVPixelBuffer: bounds: colorSpace: function does not work for images with alpha channel
I'm trying to add a watermark/logo on a video that I'm recording using AVFoundation's AVCaptureVideoDataOutput. The problem I'm having is that the transparent parts of the UIImage are black once written to the video. What I'm doing wrong…

M. Arman
- 43
- 5
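The black-transparency symptom usually comes from rendering the watermark image directly into the pixel buffer: `render(_:to:bounds:colorSpace:)` overwrites pixels rather than blending. A sketch of the usual fix: composite the watermark over the frame first, then render the flattened result back (the `stamp` helper name is illustrative):

```swift
import CoreImage

let context = CIContext()

// Blend a (possibly transparent) watermark over the frame with
// source-over compositing, then write the flattened image back into
// the same pixel buffer.
func stamp(_ watermark: CIImage, onto pixelBuffer: CVPixelBuffer) {
    let frame = CIImage(cvPixelBuffer: pixelBuffer)
    let composited = watermark.composited(over: frame)   // CISourceOverCompositing
    context.render(composited,
                   to: pixelBuffer,
                   bounds: frame.extent,
                   colorSpace: CGColorSpaceCreateDeviceRGB())
}
```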
4
votes
3 answers
Set rate at which AVSampleBufferDisplayLayer renders sample buffers
I am using an AVSampleBufferDisplayLayer to display CMSampleBuffers which are coming over a network connection in the h.264 format. Video playback is smooth and working correctly, however I cannot seem to control the frame rate. Specifically, if I…

Amos Joshua
- 1,601
- 18
- 25
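The layer renders enqueued buffers according to its `controlTimebase`, so attaching a timebase and setting its rate is the standard lever for playback speed. A minimal sketch (the function name and the idea of passing the rate in are illustrative):

```swift
import AVFoundation

// Attach a host-clock timebase to the display layer; rate 1.0 is
// real time, 0.5 half speed, 2.0 double speed. Buffers are then
// presented when their timestamps come due on this timebase.
func attachTimebase(to displayLayer: AVSampleBufferDisplayLayer, rate: Double) {
    var timebase: CMTimebase?
    CMTimebaseCreateWithSourceClock(allocator: kCFAllocatorDefault,
                                    sourceClock: CMClockGetHostTimeClock(),
                                    timebaseOut: &timebase)
    if let timebase = timebase {
        CMTimebaseSetTime(timebase, time: .zero)
        CMTimebaseSetRate(timebase, rate: rate)
        displayLayer.controlTimebase = timebase
    }
}
```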
4
votes
2 answers
Adding filters to video with AVFoundation (OSX) - how do I write the resulting image back to AVWriter?
Setting the scene
I am working on a video processing app that runs from the command line to read in, process and then export video. I'm working with 4 tracks.
Lots of clips that I append into a single track to make one video. Let's call this the…

Tim Bull
- 2,375
- 21
- 25
3
votes
1 answer
How to save encoded CMSampleBuffer samples to mp4 file on iOS
I'm using the VideoToolbox framework to retrieve data from AVCaptureSession and encode it to H.264 and AAC.
I'm at a point where I:
obtain data using the delegate method func captureOutput(_ captureOutput: AVCaptureOutput, didOutput sampleBuffer:…

Grzegorz Aperliński
- 848
- 1
- 10
- 23
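Already-encoded sample buffers do not need to go through AVAssetWriter's own encoder: an input created with `nil` output settings and the compressed stream's format description as a `sourceFormatHint` passes the samples through to the .mp4. A sketch of that setup (helper name and the usage comment are illustrative):

```swift
import AVFoundation

// Pass-through writer input for pre-encoded H.264: nil outputSettings
// plus a sourceFormatHint tells AVAssetWriter not to re-encode.
func makeVideoInput(formatDescription: CMFormatDescription) -> AVAssetWriterInput {
    let input = AVAssetWriterInput(mediaType: .video,
                                   outputSettings: nil,
                                   sourceFormatHint: formatDescription)
    input.expectsMediaDataInRealTime = true
    return input
}

// Usage sketch: add the input to an AVAssetWriter(outputURL: url,
// fileType: .mp4), startWriting()/startSession(atSourceTime:), then
// append(sampleBuffer) whenever input.isReadyForMoreMediaData is true.
```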
3
votes
2 answers
How to manually release CMSampleBuffer
This code leads to memory leak and app crash:
var outputSamples = [Float]()
assetReader.startReading()
while assetReader.status == .reading {
    let trackOutput = assetReader.outputs.first!
    if let sampleBuffer =…

DmitryoN
- 310
- 2
- 11
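CMSampleBuffers are managed by ARC, so there is no manual release call in Swift; the growth in a tight reading loop comes from autoreleased buffers that only get freed when the enclosing pool drains. The usual fix is to wrap each iteration in its own `autoreleasepool`, sketched here against the loop shape from the excerpt:

```swift
import AVFoundation

// Drain autoreleased CMSampleBuffers every iteration so the loop's
// memory stays flat instead of growing until the reader finishes.
func readSamples(from assetReader: AVAssetReader) {
    let trackOutput = assetReader.outputs.first!
    assetReader.startReading()
    while assetReader.status == .reading {
        autoreleasepool {
            if let sampleBuffer = trackOutput.copyNextSampleBuffer() {
                // ...process sampleBuffer here...
                CMSampleBufferInvalidate(sampleBuffer)
            }
        }
    }
}
```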
3
votes
2 answers
Value of type 'CMSampleBuffer' has no member 'imageBuffer'
I am currently working on a project that uses a live camera view in Swift. I used some code I found on GitHub to give me that live camera view, and it works great on my MacBook Pro running Mojave. I have all my files stored on an external HDD, so I…

Thijs van der Heijden
- 1,147
- 1
- 10
- 25
3
votes
1 answer
How to get ImageBuffer correctly in Swift now with CMSampleBufferGetImageBuffer?
'CMSampleBufferGetImageBuffer' has been replaced by property
'CMSampleBuffer.imageBuffer'
CMSampleBufferGetImageBuffer doesn't work anymore :) It seems the parameters have also changed in Swift 4.2.
guard let pixelBuffer: CVPixelBuffer =…

J A S K I E R
- 1,976
- 3
- 24
- 42
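Since the iOS 12 SDK, the Swift overlay surfaces the C getter as a property on CMSampleBuffer; the original function is still callable when targeting older SDKs. A minimal sketch covering both:

```swift
import CoreMedia

// Prefer the Swift property introduced with the iOS 12 SDK; fall back
// to the C function on older deployment targets.
func pixelBuffer(from sampleBuffer: CMSampleBuffer) -> CVPixelBuffer? {
    if #available(iOS 12.0, *) {
        return sampleBuffer.imageBuffer
    } else {
        return CMSampleBufferGetImageBuffer(sampleBuffer)
    }
}
```

The "has no member 'imageBuffer'" error from the earlier question is the mirror image of this: building against a pre-iOS 12 SDK (e.g. an older Xcode on Mojave-era toolchains), where only the C function exists.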
3
votes
0 answers
Any way to convert UIImage to CMSampleBuffer on Swift?
I'm using the ABBYY OCR SDK and it's great at recognizing text directly from the camera via the CMSampleBuffer in:
func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!)…

Dmitrii Z
- 145
- 1
- 5