Questions tagged [cmsamplebufferref]

71 questions
5 votes • 1 answer

Crop CMSampleBufferRef

I am trying to crop the image in a CMSampleBufferRef to a specific size. I am making 5 steps - 1. Getting PixelBuffer from SampleBuffer 2. Converting PixelBuffer to CIImage 3. Cropping CIImage 4. Rendering CIImage back to PixelBuffer 5. Attaching…
Laz • 538 • 7 • 12
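A minimal sketch of steps 1-4 from the question above, assuming a kCVPixelFormatType_32BGRA buffer from the capture output; the function name and crop rect are illustrative, and step 5 (attaching the result back to a CMSampleBuffer) is left out:

import CoreGraphics
import CoreImage
import CoreMedia
import CoreVideo

let ciContext = CIContext()

func cropPixelBuffer(of sampleBuffer: CMSampleBuffer, to rect: CGRect) -> CVPixelBuffer? {
    // 1. Get the pixel buffer backing the sample buffer.
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }

    // 2. Wrap it in a CIImage, 3. crop, and move the origin back to (0, 0).
    let cropped = CIImage(cvPixelBuffer: pixelBuffer)
        .cropped(to: rect)
        .transformed(by: CGAffineTransform(translationX: -rect.origin.x, y: -rect.origin.y))

    // 4. Render into a new pixel buffer of the cropped size.
    var output: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault,
                        Int(rect.width), Int(rect.height),
                        kCVPixelFormatType_32BGRA,
                        nil, &output)
    guard let outputBuffer = output else { return nil }
    ciContext.render(cropped, to: outputBuffer)
    return outputBuffer
}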
4 votes • 1 answer

iOS - Automatically resize CVPixelBufferRef

I am trying to crop and scale a CMSampleBufferRef based on the user's chosen ratio; the code below takes a CMSampleBufferRef, converts it into a CVImageBufferRef, and uses CVPixelBuffer to crop the internal image based on its bytes. The goal of this…
vodkhang • 18,639 • 11 • 76 • 110
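One hedged way to do the byte-level crop the question describes: point a new pixel buffer at an offset inside the original's bytes. This is only a sketch under a 32BGRA assumption; it does no scaling, and the source buffer has to stay locked and alive for as long as the cropped one is used (a real version would unlock in the release callback passed to CVPixelBufferCreateWithBytes):

import CoreGraphics
import CoreVideo

func cropByPointerOffset(_ source: CVPixelBuffer, to rect: CGRect) -> CVPixelBuffer? {
    CVPixelBufferLockBaseAddress(source, .readOnly)
    guard let base = CVPixelBufferGetBaseAddress(source) else { return nil }

    let bytesPerRow = CVPixelBufferGetBytesPerRow(source)
    let bytesPerPixel = 4 // assumes kCVPixelFormatType_32BGRA

    // Address of the top-left pixel of the crop rectangle.
    let start = base + Int(rect.origin.y) * bytesPerRow + Int(rect.origin.x) * bytesPerPixel

    var cropped: CVPixelBuffer?
    CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                 Int(rect.width), Int(rect.height),
                                 kCVPixelFormatType_32BGRA,
                                 start, bytesPerRow,
                                 nil, nil, nil,
                                 &cropped)
    return cropped
}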
4 votes • 2 answers

Saving CMSampleBufferRef for later processing

I am trying to use the AVFoundation framework to capture a 'series' of still images from AVCaptureStillImageOutput quickly, like the burst mode in some cameras. I want to use the completion handler, [stillImageOutput…
NSRover • 932 • 1 • 12 • 29
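A rough sketch of one way to keep burst captures around: convert each buffer to JPEG Data inside the completion handler instead of holding onto the CMSampleBufferRef itself (capture buffers come from a small pool). The names savedFrames and captureStill are placeholders; AVCaptureStillImageOutput matches the question's (now deprecated) API:

import AVFoundation

var savedFrames: [Data] = []

func captureStill(from stillImageOutput: AVCaptureStillImageOutput) {
    guard let connection = stillImageOutput.connection(with: .video) else { return }
    stillImageOutput.captureStillImageAsynchronously(from: connection) { sampleBuffer, error in
        guard let sampleBuffer = sampleBuffer, error == nil,
              let jpegData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
        else { return }
        // Cheap to keep; decode with UIImage(data:) whenever processing happens later.
        savedFrames.append(jpegData)
    }
}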
4 votes • 1 answer

How many frames in CMSampleBuffer?

This might be a dumb question, but I am just starting to learn about media formats and AVFoundation, so bear with me. I've been trying to figure out whether a CMSampleBuffer from AVCaptureVideoDataOutput can have more than one frame in it. From the…
wciu • 1,183 • 2 • 9 • 24
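A quick way to answer this empirically from the AVCaptureVideoDataOutput delegate: CMSampleBufferGetNumSamples reports how many media samples the buffer carries (for uncompressed video it is normally one; compressed audio buffers often carry many):

import AVFoundation
import CoreMedia

// AVCaptureVideoDataOutputSampleBufferDelegate callback.
func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    let sampleCount = CMSampleBufferGetNumSamples(sampleBuffer)
    print("samples in this CMSampleBuffer:", sampleCount)
}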
4 votes • 1 answer

How to use RawDataInput in GPUImage2

I'm using a media capture library called NextLevel which spits out a CMSampleBuffer on each frame. I want to take this buffer and feed it to GPUImage2 through a rawDataInput, pass it over some filters, and read it back from a rawDataOutput at the end…
omarojo • 1,197 • 1 • 13 • 26
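A sketch of only the first half of that pipeline: pulling raw BGRA bytes out of the CMSampleBuffer that NextLevel delivers. The actual hand-off to GPUImage2's rawDataInput is left as a comment, since the exact upload call depends on the GPUImage2 version in use:

import CoreMedia
import CoreVideo

func rawBytes(from sampleBuffer: CMSampleBuffer) -> [UInt8]? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }

    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return nil }
    let byteCount = CVPixelBufferGetBytesPerRow(pixelBuffer) * CVPixelBufferGetHeight(pixelBuffer)

    // Copy the plane into a Swift array (assumes a non-planar BGRA buffer).
    let buffer = UnsafeRawBufferPointer(start: base, count: byteCount)
    return [UInt8](buffer)
    // ...then feed the array into the rawDataInput and read the result back
    // from a RawDataOutput at the end of the filter chain.
}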
4 votes • 2 answers

Understanding Metal usage with AVCaptureSession video output

I'm trying to understand the right way to manipulate the video output (CMPixelBuffer) using Metal. As far as I understand, there is MTKView. Each CMPixelBuffer from the video output is assigned to some kind of Metal texture. So the final…
Roi Mulia • 5,626 • 11 • 54 • 105
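A sketch of the usual bridge from a camera pixel buffer to Metal: a CVMetalTextureCache that wraps each CVPixelBuffer in an MTLTexture you can then draw in an MTKView or feed to a render/compute pass. Assumes a BGRA buffer; error handling is omitted:

import CoreVideo
import Metal

let device = MTLCreateSystemDefaultDevice()!   // sketch: assume Metal is available
var textureCache: CVMetalTextureCache?
CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil, &textureCache)

func makeTexture(from pixelBuffer: CVPixelBuffer) -> MTLTexture? {
    guard let cache = textureCache else { return nil }
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)

    var cvTexture: CVMetalTexture?
    CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                              cache,
                                              pixelBuffer,
                                              nil,
                                              .bgra8Unorm,
                                              width, height,
                                              0,
                                              &cvTexture)
    // Keep cvTexture alive as long as the returned MTLTexture is in use.
    return cvTexture.flatMap { CVMetalTextureGetTexture($0) }
}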
4 votes • 1 answer

Square video output in iOS

Is there a way to get square video output with AVFoundation in iOS? I use OpenGL to process every frame (CMSampleBuffer) of video. Every frame is rotated, so I need to crop and rotate the CMSampleBuffer. But I don't know how to do that, so I believe…
vkalit • 647 • 8 • 19
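A hedged Core Image alternative to doing the rotate-and-crop in the OpenGL pass: orient the frame upright, then center-crop to the shorter side. The .right orientation assumes a portrait capture session; adjust as needed:

import CoreGraphics
import CoreImage
import CoreMedia

func squareImage(from sampleBuffer: CMSampleBuffer) -> CIImage? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }

    // Camera buffers arrive landscape; .right turns a portrait-session frame upright.
    let rotated = CIImage(cvPixelBuffer: pixelBuffer).oriented(.right)

    // Center-crop to a square whose side is the shorter dimension.
    let extent = rotated.extent
    let side = min(extent.width, extent.height)
    let squareRect = CGRect(x: extent.midX - side / 2,
                            y: extent.midY - side / 2,
                            width: side, height: side)
    return rotated.cropped(to: squareRect)
}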
4 votes • 1 answer

AVAssetReader/AVAssetWriter preview of current frame

I'm using AVAssetReader/AVAssetWriter to convert my video on iOS. My question is: what's the most efficient way to show a preview of the current frame during real-time conversion? I was thinking about converting the CMSampleBufferRef to a UIImage, then applying…
user3128673 • 663 • 1 • 5 • 6
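A sketch of one lightweight preview path: reuse a single CIContext, render the occasional frame to a CGImage, and push it to an image view on the main queue. previewImageView is a hypothetical view:

import CoreImage
import CoreMedia
import UIKit

let previewContext = CIContext()

func showPreview(of sampleBuffer: CMSampleBuffer, in previewImageView: UIImageView) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    guard let cgImage = previewContext.createCGImage(ciImage, from: ciImage.extent) else { return }
    let image = UIImage(cgImage: cgImage)
    DispatchQueue.main.async { previewImageView.image = image }
}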
4 votes • 3 answers

Set rate at which AVSampleBufferDisplayLayer renders sample buffers

I am using an AVSampleBufferDisplayLayer to display CMSampleBuffers which are coming over a network connection in the h.264 format. Video playback is smooth and working correctly; however, I cannot seem to control the frame rate. Specifically, if I…
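The layer has no rate property of its own, but its controlTimebase does, and buffers are rendered when their timestamps come due on that timebase, so changing the timebase rate changes the effective playback speed. A sketch using the current Core Media spellings (CMTimebaseCreateWithSourceClock is the newer name for CMTimebaseCreateWithMasterClock):

import AVFoundation
import CoreMedia

func configureTimebase(for displayLayer: AVSampleBufferDisplayLayer, rate: Float64) {
    var timebase: CMTimebase?
    CMTimebaseCreateWithSourceClock(allocator: kCFAllocatorDefault,
                                    sourceClock: CMClockGetHostTimeClock(),
                                    timebaseOut: &timebase)
    guard let controlTimebase = timebase else { return }

    CMTimebaseSetTime(controlTimebase, time: .zero)   // start of the stream
    CMTimebaseSetRate(controlTimebase, rate: rate)    // e.g. 2.0 for double speed
    displayLayer.controlTimebase = controlTimebase
}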
4 votes • 1 answer

Converting AudioBuffer to CMSampleBuffer with accurate CMTime

The goal here is to create an mp4 file with video from AVCaptureDataOutput and audio recorded with Core Audio, then send the CMSampleBuffers of both to an AVAssetWriter which has an accompanying AVAssetWriterInput(AVMediaTypeVideo) and…
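For the "accurate CMTime" part, a common trick is to derive the timestamp from Core Audio's own sample counter rather than the host clock, so the audio track stays aligned with the video in the AVAssetWriter. A small sketch (names are illustrative):

import AudioToolbox
import CoreMedia

func presentationTime(for timeStamp: AudioTimeStamp, sampleRate: Float64) -> CMTime {
    // mSampleTime counts frames since the audio unit started; using the sample
    // rate as the timescale gives a drift-free timestamp.
    return CMTime(value: CMTimeValue(timeStamp.mSampleTime),
                  timescale: CMTimeScale(sampleRate))
}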
4 votes • 3 answers

Convert a CMSampleBuffer into a UIImage

Here's a function (code from Apple documentation) that converts a CMSampleBuffer into a UIImage: func imageFromSampleBuffer(sampleBuffer: CMSampleBuffer) -> UIImage { // Get a CMSampleBuffer's Core Video image buffer for the media data var…
Joe • 2,386 • 1 • 22 • 33
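For reference, a completed version of the kind of function the excerpt quotes, assuming the video output is configured for kCVPixelFormatType_32BGRA; a CIImage round-trip is a shorter alternative:

import CoreMedia
import CoreVideo
import UIKit

func imageFromSampleBuffer(sampleBuffer: CMSampleBuffer) -> UIImage? {
    // Get the CMSampleBuffer's Core Video image buffer for the media data.
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }

    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    let baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)

    // BGRA maps to a little-endian 32-bit context with premultiplied alpha first.
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let bitmapInfo = CGImageAlphaInfo.premultipliedFirst.rawValue
                   | CGBitmapInfo.byteOrder32Little.rawValue
    guard let context = CGContext(data: baseAddress,
                                  width: width, height: height,
                                  bitsPerComponent: 8, bytesPerRow: bytesPerRow,
                                  space: colorSpace, bitmapInfo: bitmapInfo),
          let cgImage = context.makeImage() else { return nil }

    return UIImage(cgImage: cgImage)
}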
4 votes • 1 answer

Core Image - rendering a transparent image on a CMSampleBufferRef results in a black box around it

I'm trying to add a watermark/logo on a video that I'm recording using AVFoundation's AVCaptureVideoDataOutput. My class is set as the sampleBufferDelegate and receives the CMSampleBufferRefs. I already apply some effects to the CMSampleBufferRefs…
Joris Timmerman • 1,482 • 14 • 25
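A sketch of the compositing step, assuming the logo comes from a transparent PNG (watermark.png is a placeholder asset name). Keeping the logo as a CIImage with its alpha intact and compositing it source-over the frame is what avoids the opaque black box:

import CoreImage
import CoreMedia
import UIKit

let watermarkContext = CIContext()
let logo = CIImage(image: UIImage(named: "watermark.png")!)!   // hypothetical asset

func applyWatermark(to sampleBuffer: CMSampleBuffer) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    let frame = CIImage(cvPixelBuffer: pixelBuffer)

    // Source-over compositing preserves the logo's transparency.
    let composited = logo.composited(over: frame)

    // Write the result back into the frame's pixel buffer.
    watermarkContext.render(composited, to: pixelBuffer)
}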
4 votes • 1 answer

Convert NSData to CMSampleBufferRef

Is there any way to create a CMSampleBufferRef from NSData? The NSData object was itself constructed from a CMSampleBufferRef earlier. I need that conversion because I want to save the CMSampleBufferRef frames (as NSData) that are taken from the live camera…
crazywood • 1,455 • 1 • 10 • 17
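A rough sketch of the reverse trip, assuming the NSData holds tightly packed BGRA pixels: rebuild a CVPixelBuffer from the bytes, then wrap it in a CMSampleBuffer. Width, height and timing have to travel alongside the data, since they are not part of the bytes:

import CoreMedia
import CoreVideo
import Foundation

func sampleBuffer(from data: Data, width: Int, height: Int,
                  presentationTime: CMTime) -> CMSampleBuffer? {
    var pixelBuffer: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                        kCVPixelFormatType_32BGRA, nil, &pixelBuffer)
    guard let buffer = pixelBuffer else { return nil }

    // Copy the pixels row by row, respecting the destination's bytesPerRow.
    CVPixelBufferLockBaseAddress(buffer, [])
    let dest = CVPixelBufferGetBaseAddress(buffer)!
    let destBytesPerRow = CVPixelBufferGetBytesPerRow(buffer)
    let srcBytesPerRow = width * 4
    data.withUnsafeBytes { (src: UnsafeRawBufferPointer) in
        for row in 0..<height {
            memcpy(dest + row * destBytesPerRow,
                   src.baseAddress! + row * srcBytesPerRow,
                   srcBytesPerRow)
        }
    }
    CVPixelBufferUnlockBaseAddress(buffer, [])

    // Wrap the pixel buffer in a CMSampleBuffer with the supplied timing.
    var formatDescription: CMVideoFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                 imageBuffer: buffer,
                                                 formatDescriptionOut: &formatDescription)
    var timing = CMSampleTimingInfo(duration: .invalid,
                                    presentationTimeStamp: presentationTime,
                                    decodeTimeStamp: .invalid)
    var newSampleBuffer: CMSampleBuffer?
    CMSampleBufferCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                       imageBuffer: buffer,
                                       dataReady: true,
                                       makeDataReadyCallback: nil,
                                       refcon: nil,
                                       formatDescription: formatDescription!,
                                       sampleTiming: &timing,
                                       sampleBufferOut: &newSampleBuffer)
    return newSampleBuffer
}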
4 votes • 0 answers

How to convert an audio CMSampleBufferRef to data, and play the audio on another device with the received data

I am trying to implement a video conference. I can capture video and audio with an AVCaptureSession and convert the video CMSampleBufferRef to an image and send the data to the other device. But I don't have any idea how I can convert the audio CMSampleBufferRef to data, and…
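For the audio half, a sketch that copies the bytes out of the sample buffer's block buffer into a Data you can send over the network; playing it back on the other device also needs the format description and timestamps, which are not covered here:

import CoreMedia
import Foundation

func audioData(from sampleBuffer: CMSampleBuffer) -> Data? {
    guard let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else { return nil }

    let length = CMBlockBufferGetDataLength(blockBuffer)
    var bytes = [UInt8](repeating: 0, count: length)
    CMBlockBufferCopyDataBytes(blockBuffer,
                               atOffset: 0,
                               dataLength: length,
                               destination: &bytes)
    return Data(bytes)
}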
3 votes • 1 answer

AVAssetWriter not recording audio

I'm having trouble getting audio recorded into a video using AVAssetWriter on the iPhone. I am able to record video from the camera on the phone no problem, but when I try to add audio I get nothing; also, the durations displayed in the video in the…
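Missing audio usually comes down to the writer never getting (or never being fed) an audio input. A sketch of the audio side of an AVAssetWriter setup; the settings values are illustrative:

import AVFoundation
import AudioToolbox

func makeAudioInput() -> AVAssetWriterInput {
    let audioSettings: [String: Any] = [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVNumberOfChannelsKey: 1,
        AVSampleRateKey: 44_100,
        AVEncoderBitRateKey: 64_000
    ]
    let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioSettings)
    audioInput.expectsMediaDataInRealTime = true   // required for live capture
    return audioInput
}

// In the capture delegate, append audio buffers to this input (after the
// writer's startSession(atSourceTime:)) just like the video buffers:
//     if audioInput.isReadyForMoreMediaData { audioInput.append(sampleBuffer) }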