Questions tagged [cmsamplebuffer]

106 questions
1
vote
3 answers

Getting Exposure Time (EXIF) from CMSampleBuffer

I am trying to get the exposure time from an image captured using AVFoundation. When I followed the 2010 WWDC session's instructions about retrieving useful image metadata from a CMSampleBuffer like…
yonasstephen
  • 2,645
  • 4
  • 23
  • 48
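The classic approach from that WWDC era still works: AVFoundation attaches an Exif dictionary to each video sample buffer. A minimal sketch (assuming an AVCaptureVideoDataOutput delegate is already delivering buffers):

```swift
import AVFoundation
import ImageIO

// Read the Exif attachment that AVFoundation places on each sample buffer
// and pull out the exposure time in seconds.
func exposureTime(from sampleBuffer: CMSampleBuffer) -> Double? {
    guard let exif = CMGetAttachment(sampleBuffer,
                                     key: kCGImagePropertyExifDictionary,
                                     attachmentModeOut: nil) as? [String: Any]
    else { return nil }
    return exif[kCGImagePropertyExifExposureTime as String] as? Double
}
```

The same dictionary also carries ISO, f-number, and related keys under their `kCGImagePropertyExif…` names.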
1
vote
0 answers

Need timestamps of each frame captured in AVCaptureSession on AVCam

What I need is to be able to edit AVCam so that I can store, access, or view the timestamps of each frame in a short captured video. After some research, I think it can be done, but I am unsure how. I have seen in Xcode…
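A minimal sketch of collecting per-frame timestamps by adding an AVCaptureVideoDataOutput alongside AVCam's movie output (the class name here is illustrative, not from AVCam itself):

```swift
import AVFoundation

// Records the presentation timestamp of every frame the session delivers.
final class FrameTimestampRecorder: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private(set) var timestamps: [CMTime] = []

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // The timestamp is expressed on the capture session's clock.
        timestamps.append(CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
    }
}
```

Note that AVCaptureMovieFileOutput and AVCaptureVideoDataOutput cannot always run simultaneously on older devices, so recording via AVAssetWriter from the same data output is the usual workaround.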
1
vote
0 answers

Does captureOutput:didOutputSampleBuffer:fromConnection: carry any orientation information?

I apply tons of image processing to a camera live preview layer via OpenGL, and I want to get some information about the input's orientation before I ask OpenGL for the image (to apply the corresponding transformations). The capturing takes place…
Geri Borbás
  • 15,810
  • 18
  • 109
  • 172
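The sample buffer itself carries no orientation, but the AVCaptureConnection passed into the delegate does. A sketch (on recent iOS, `videoOrientation` is superseded by `videoRotationAngle`, but the idea is the same):

```swift
import AVFoundation

// Read orientation state from the connection, not from the buffer.
func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    let orientation = connection.videoOrientation  // .portrait, .landscapeLeft, …
    let mirrored = connection.isVideoMirrored      // true for the front camera
    // Map these to a rotation/flip transform before handing the frame to OpenGL.
    _ = (orientation, mirrored)
}
```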
0
votes
1 answer

CVPixelBuffer with audio

I am using AVFoundation to capture a CMSampleBufferRef from the camera and then convert it into a CVPixelBufferRef to write to the video. What I want to do is modify some pixels inside the video frames. That's why I need to get the CVPixelBufferRef…
vodkhang
  • 18,639
  • 11
  • 76
  • 110
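A sketch of pulling the CVPixelBuffer out of a video CMSampleBuffer and touching its bytes. This assumes the output is configured for BGRA (`kCVPixelFormatType_32BGRA`); planar YUV formats need per-plane access instead:

```swift
import AVFoundation

// Lock the pixel buffer, modify raw bytes, unlock. Audio sample buffers
// have no image buffer, so the guard also filters those out.
func modifyPixels(in sampleBuffer: CMSampleBuffer) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }

    guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return }
    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    let bytes = base.assumingMemoryBound(to: UInt8.self)
    for row in 0..<height {
        bytes[row * bytesPerRow] = 0  // e.g. zero the blue channel of column 0
    }
}
```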
0
votes
0 answers

Using the dav1d framework, rendering on iOS

I am trying to render using the open-source dav1d decoder, which can decode AV1 images. I proceed in the order below: YUV420I -> CVPixelBufferRef -> CMSampleBufferRef -> AVSampleBufferDisplayLayer -> enqueueSampleBuffer. An error is occurring: 2023-07-26…
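The CVPixelBufferRef → CMSampleBufferRef step of that pipeline can be sketched as follows (timing values are illustrative; the decoded YUV planes are assumed to already be in the pixel buffer):

```swift
import AVFoundation

// Wrap a decoded CVPixelBuffer in a CMSampleBuffer that
// AVSampleBufferDisplayLayer can enqueue.
func makeSampleBuffer(from pixelBuffer: CVPixelBuffer, pts: CMTime) -> CMSampleBuffer? {
    var formatDescription: CMVideoFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(
        allocator: kCFAllocatorDefault,
        imageBuffer: pixelBuffer,
        formatDescriptionOut: &formatDescription)
    guard let format = formatDescription else { return nil }

    var timing = CMSampleTimingInfo(duration: .invalid,
                                    presentationTimeStamp: pts,
                                    decodeTimeStamp: .invalid)
    var sampleBuffer: CMSampleBuffer?
    CMSampleBufferCreateReadyWithImageBuffer(
        allocator: kCFAllocatorDefault,
        imageBuffer: pixelBuffer,
        formatDescription: format,
        sampleTiming: &timing,
        sampleBufferOut: &sampleBuffer)
    return sampleBuffer
}
```

If the layer should show frames as they arrive rather than on a timeline, attaching `kCMSampleAttachmentKey_DisplayImmediately` to the buffer's sample attachments is the usual trick.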
0
votes
1 answer

Is it possible render video frames directly to AirPlay from an iOS app?

Is it possible for me to generate CMSampleBuffers in real time and stream them to a TV via AirPlay, similarly to how I am able to render those frames directly to an AVSampleBufferDisplayLayer on the iOS device? I know that I can use AirPlay with the…
Anton
  • 978
  • 8
  • 16
0
votes
0 answers

What exactly does CVBufferSetAttachment do?

I understand that CVBufferSetAttachment simply appends a metadata attachment to the sample buffer as a dictionary entry. But I see that no errors occur when appending metadata that is contradictory in nature. For instance, for the sample buffers received from…
Deepak Sharma
  • 5,577
  • 7
  • 55
  • 131
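That matches the API's behavior: CVBufferSetAttachment stores an arbitrary key/value pair on the buffer, and CoreVideo does not validate it against the pixel data, which is why contradictory metadata is accepted silently. A sketch:

```swift
import CoreVideo

// Tag a pixel buffer with BT.709 primaries. CoreVideo stores the value
// as-is; nothing checks whether the pixels actually match.
func tagAsBT709(_ pixelBuffer: CVPixelBuffer) {
    CVBufferSetAttachment(pixelBuffer,
                          kCVImageBufferColorPrimariesKey,
                          kCVImageBufferColorPrimaries_ITU_R_709_2,
                          .shouldPropagate)
    // Reading it back returns exactly what was stored, validated or not.
    let stored = CVBufferGetAttachment(pixelBuffer,
                                       kCVImageBufferColorPrimariesKey,
                                       nil)
    _ = stored
}
```

Downstream consumers (e.g. VideoToolbox or a display layer) interpret these attachments on trust, so mismatched tags show up as wrong colors rather than errors.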
0
votes
0 answers

Unknown error when appending AVAssetWriterInput

I'm trying to convert the AVAudioPCMBuffer I get from an AVAudioNode's tap block into a CMSampleBuffer to append to the audio input of an AVAssetWriter. I'm also creating a new sample buffer with the correct timing: the delta of the buffer's time and the time the asset…
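One way to sketch the AVAudioPCMBuffer → CMSampleBuffer wrapping (error handling elided; the `pts` value is assumed to be whatever rebased timestamp the writer session expects):

```swift
import AVFoundation

// Wrap PCM audio from a tap block in a CMSampleBuffer for AVAssetWriterInput.
func makeSampleBuffer(from pcmBuffer: AVAudioPCMBuffer, pts: CMTime) -> CMSampleBuffer? {
    let asbd = pcmBuffer.format.streamDescription
    var formatDescription: CMAudioFormatDescription?
    CMAudioFormatDescriptionCreate(allocator: kCFAllocatorDefault,
                                   asbd: asbd,
                                   layoutSize: 0, layout: nil,
                                   magicCookieSize: 0, magicCookie: nil,
                                   extensions: nil,
                                   formatDescriptionOut: &formatDescription)
    guard let format = formatDescription else { return nil }

    var timing = CMSampleTimingInfo(
        duration: CMTime(value: 1, timescale: CMTimeScale(asbd.pointee.mSampleRate)),
        presentationTimeStamp: pts,
        decodeTimeStamp: .invalid)
    var sampleBuffer: CMSampleBuffer?
    CMSampleBufferCreate(allocator: kCFAllocatorDefault,
                         dataBuffer: nil, dataReady: false,
                         makeDataReadyCallback: nil, refcon: nil,
                         formatDescription: format,
                         sampleCount: CMItemCount(pcmBuffer.frameLength),
                         sampleTimingEntryCount: 1, sampleTimingArray: &timing,
                         sampleSizeEntryCount: 0, sampleSizeArray: nil,
                         sampleBufferOut: &sampleBuffer)
    guard let buffer = sampleBuffer else { return nil }

    // Attach the PCM bytes; this also marks the buffer's data as ready.
    CMSampleBufferSetDataBufferFromAudioBufferList(
        buffer,
        blockBufferAllocator: kCFAllocatorDefault,
        blockBufferMemoryAllocator: kCFAllocatorDefault,
        flags: 0,
        bufferList: pcmBuffer.audioBufferList)
    return buffer
}
```

A nonzero status from any of these calls (rather than an exception) is how failures surface, so checking the returned OSStatus values is worthwhile when debugging an "unknown error" on append.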
0
votes
0 answers

Understanding the role of time in a AVCaptureSession regarding CMSampleBuffers

I recently started programming in Swift, as I am trying to work out an iOS camera app idea I've had. The main goal of the project is to save the 10 seconds of video prior to the record button being tapped. So the app is actually always capturing and…
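The usual shape of such a pre-roll is a rolling window keyed on presentation timestamps. A sketch (class name illustrative):

```swift
import AVFoundation

// Keep only sample buffers whose presentation time is within 10 seconds
// of the newest frame.
final class RollingBuffer {
    private var frames: [CMSampleBuffer] = []
    private let window = CMTime(seconds: 10, preferredTimescale: 600)

    func append(_ sampleBuffer: CMSampleBuffer) {
        frames.append(sampleBuffer)
        let newest = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        let cutoff = CMTimeSubtract(newest, window)
        frames.removeAll { CMSampleBufferGetPresentationTimeStamp($0) < cutoff }
    }
}
```

One caveat: AVCaptureVideoDataOutput recycles its buffers from a small pool, so retaining raw camera buffers for 10 seconds will usually stall the pipeline; in practice the frames are copied or pre-encoded (e.g. into rolling AVAssetWriter segments) before being held.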
0
votes
1 answer

iOS PictureInPicture with AVSampleBufferDisplayLayer seek forward disabled

I have an AVPictureInPictureController that should display an image while playing audio. I have created an AVSampleBufferDisplayLayer that has a CMSampleBuffer with an image, and the two necessary delegates…
ksysha
  • 312
  • 6
  • 8
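A hedged sketch of the playback delegate that sample-buffer PiP requires. Reporting an indefinite time range marks the content as live, which is one reason the skip/seek controls appear disabled; returning a finite range for the audio yields a scrubber instead:

```swift
import AVKit

// Minimal AVPictureInPictureSampleBufferPlaybackDelegate (iOS 15+).
final class PiPPlaybackDelegate: NSObject, AVPictureInPictureSampleBufferPlaybackDelegate {
    func pictureInPictureController(_ controller: AVPictureInPictureController,
                                    setPlaying playing: Bool) {
        // Start or pause the audio player here.
    }

    func pictureInPictureControllerTimeRangeForPlayback(
        _ controller: AVPictureInPictureController) -> CMTimeRange {
        // Indefinite range => live content => no seeking.
        CMTimeRange(start: .negativeInfinity, duration: .positiveInfinity)
    }

    func pictureInPictureControllerIsPlaybackPaused(
        _ controller: AVPictureInPictureController) -> Bool {
        false
    }

    func pictureInPictureController(_ controller: AVPictureInPictureController,
                                    didTransitionToRenderSize newRenderSize: CMVideoDimensions) {}

    func pictureInPictureController(_ controller: AVPictureInPictureController,
                                    skipByInterval skipInterval: CMTime,
                                    completion completionHandler: @escaping () -> Void) {
        completionHandler()
    }
}
```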
0
votes
0 answers

AudioBufferList with two Buffers

I am receiving this ASBD from a CMSampleBuffer that comes from a ReplayKit broadcast session on iOS: mFormatID = kAudioFormatLinearPCM, mFormatFlags = 14, mChannelsPerFrame = 2, mBytesPerPacket = 4, mFramesPerPacket = 1, mBytesPerFrame = …
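A sketch of walking the AudioBufferList carried by such a sample buffer. Two buffers with one channel each typically means non-interleaved (planar) audio, versus one buffer with `mNumberChannels == 2` for interleaved stereo:

```swift
import AVFoundation

// Two-pass extraction: query the needed AudioBufferList size, then fetch it
// together with a retained block buffer that owns the audio bytes.
func inspectAudio(_ sampleBuffer: CMSampleBuffer) {
    var sizeNeeded = 0
    CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
        sampleBuffer, bufferListSizeNeededOut: &sizeNeeded,
        bufferListOut: nil, bufferListSize: 0,
        blockBufferAllocator: nil, blockBufferMemoryAllocator: nil,
        flags: 0, blockBufferOut: nil)

    let raw = UnsafeMutableRawPointer.allocate(
        byteCount: sizeNeeded, alignment: MemoryLayout<AudioBufferList>.alignment)
    defer { raw.deallocate() }
    let abl = raw.assumingMemoryBound(to: AudioBufferList.self)

    var blockBuffer: CMBlockBuffer?
    CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
        sampleBuffer, bufferListSizeNeededOut: nil,
        bufferListOut: abl, bufferListSize: sizeNeeded,
        blockBufferAllocator: nil, blockBufferMemoryAllocator: nil,
        flags: kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
        blockBufferOut: &blockBuffer)

    for buffer in UnsafeMutableAudioBufferListPointer(abl) {
        print(buffer.mNumberChannels, buffer.mDataByteSize)
    }
}
```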
0
votes
0 answers

How to match RPScreenRecorder screenshot times to application logs

I'm trying to align the screenshots emitted by RPScreenRecorder's startCapture method to logs saved elsewhere in my code. I was hoping that I could just match CMSampleBuffer's presentationTimeStamp to the timestamp reported by…
ryanipete
  • 419
  • 5
  • 15
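Assuming the timestamps are expressed on the host time clock (which is what screen capture generally uses), they can be anchored to wall-clock time by offsetting against "now" on the same clock. A sketch:

```swift
import AVFoundation

// Convert a buffer's presentation timestamp into a Date for log matching.
func wallClockDate(for sampleBuffer: CMSampleBuffer) -> Date {
    let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    let hostNow = CMClockGetTime(CMClockGetHostTimeClock())
    // Frame's offset relative to "now" on the host clock, applied to Date().
    let delta = CMTimeSubtract(pts, hostNow).seconds
    return Date().addingTimeInterval(delta)
}
```

Computing the host-time-to-Date offset once and reusing it avoids jitter from calling `Date()` per frame.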
0
votes
0 answers

Force GPU usage with copying video sample buffers via AVAssetReader/AVSampleBufferDisplayLayer

I am looping 1-second mp4/h264 videos with no audio on an M1 Mac Mini. AVPlayer was causing hitches while scrolling. Now I read the videos using AVAssetReader and feed those CMSampleBuffers into an AVSampleBufferDisplayLayer. To get it to seamlessly loop…
jnorris
  • 51
  • 1
  • 6
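The reader-to-layer feeding loop can be sketched as follows (restart logic for seamless looping omitted; an AVAssetReader cannot be rewound, so looping means recreating it at end of asset):

```swift
import AVFoundation

// Pull decoded frames from an AVAssetReaderTrackOutput and enqueue them
// whenever the display layer is ready for more.
func pump(_ output: AVAssetReaderTrackOutput,
          into layer: AVSampleBufferDisplayLayer) {
    layer.requestMediaDataWhenReady(on: DispatchQueue(label: "feeder")) {
        while layer.isReadyForMoreMediaData {
            guard let sampleBuffer = output.copyNextSampleBuffer() else {
                layer.stopRequestingMediaData()
                return  // end of asset: recreate the reader here to loop
            }
            layer.enqueue(sampleBuffer)
        }
    }
}
```

Whether decoding runs on the GPU/media engine is decided by VideoToolbox from the track's codec and the requested output pixel format, not by this loop; requesting an IOSurface-compatible format in the track output's `outputSettings` keeps the path zero-copy.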
0
votes
0 answers

Convert CMSampleBuffer to m3u8 and save it to a path?

I'm using ReplayKit and a broadcast upload extension to get the device's screen recording. override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) { switch sampleBufferType { case .video: …
Sam KC
  • 61
  • 1
  • 8
0
votes
1 answer

Presentation time of Audio buffer and Video buffer are not equal

I'm trying to create an app that does live video & audio recording using AVFoundation. Using AVAssetWriter, I'm also writing the buffers to a local file. For the video CMSampleBuffer, I'm using the AVCaptureVideoDataOutputSampleBufferDelegate output…
YYfim
  • 1,402
  • 1
  • 9
  • 24
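Audio and video buffers legitimately carry different presentation timestamps; what matters for AVAssetWriter is that the session starts at the first buffer's time so both tracks share one timeline. A sketch (delegate boilerplate omitted):

```swift
import AVFoundation

// Start the writer session at the first arriving buffer's timestamp,
// then append each buffer with its original (unequal) timing intact.
func append(_ sampleBuffer: CMSampleBuffer,
            to writer: AVAssetWriter,
            input: AVAssetWriterInput) {
    let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    if writer.status == .unknown {
        writer.startWriting()
        writer.startSession(atSourceTime: pts)  // anchor both tracks here
    }
    if writer.status == .writing, input.isReadyForMoreMediaData {
        input.append(sampleBuffer)
    }
}
```

Buffers whose timestamps precede the session start are dropped by the writer, which is why the session is usually anchored on the first video frame rather than the first audio buffer.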