Questions tagged [cvpixelbuffer]

159 questions
3
votes
3 answers

AVAssetWriterInput append fails with error code -11800 AVErrorUnknown -12780

I am trying to capture camera video in memory using AVCaptureSession so that I can later write the video data to a movie file. While I have been able to successfully start a capture session, I am not able to successfully write the CMSampleBuffers I've…
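A minimal defensive-append sketch (function and names are illustrative, not the asker's code): -11800 with underlying -12780 usually means the writer failed earlier, and the real cause is in `AVAssetWriter.error`.

```swift
import AVFoundation
import CoreMedia

// Sketch (not the asker's code): -11800 (AVErrorUnknown) with underlying
// error -12780 usually means the writer moved to .failed earlier; the real
// cause lives in writer.error, so inspect it before every append.
func safeAppend(_ sampleBuffer: CMSampleBuffer,
                to input: AVAssetWriterInput,
                of writer: AVAssetWriter) -> Bool {
    guard writer.status == .writing else {
        print("writer not writing (status \(writer.status.rawValue)): "
              + String(describing: writer.error))
        return false
    }
    guard input.isReadyForMoreMediaData else { return false } // drop the frame
    let ok = input.append(sampleBuffer)
    if !ok { print("append failed: \(String(describing: writer.error))") }
    return ok
}
```

Common root causes surfaced this way are appending before `startSession(atSourceTime:)` was called, or sample buffers whose presentation timestamps run backwards.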
3
votes
0 answers

Is it possible to use a PixelBufferPool with AVPlayerItemVideoOutput.copyPixelBuffer?

From my recent experience, using a PixelBufferPool to store pixel buffers instead of creating new ones all the time can yield a substantial performance improvement. In my app I'm reading buffers from a video using copyPixelBuffer. From the doc it…
Guig
  • 9,891
  • 7
  • 64
  • 126
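For reference, `copyPixelBuffer(forItemTime:itemTimeForDisplay:)` vends buffers from the output's own internal recycling, so there is no public hook to plug a custom pool into it; a pool pays off where you allocate buffers yourself. A sketch with illustrative attribute choices:

```swift
import CoreVideo

// Sketch: a reusable pool of BGRA, IOSurface-backed buffers. The pool
// recycles a buffer once its last reference is released.
func makePool(width: Int, height: Int) -> CVPixelBufferPool? {
    let pixelAttrs: [CFString: Any] = [
        kCVPixelBufferPixelFormatTypeKey: kCVPixelFormatType_32BGRA,
        kCVPixelBufferWidthKey: width,
        kCVPixelBufferHeightKey: height,
        kCVPixelBufferIOSurfacePropertiesKey: [:] as CFDictionary,
    ]
    var pool: CVPixelBufferPool?
    let status = CVPixelBufferPoolCreate(kCFAllocatorDefault,
                                         nil,                        // pool attrs
                                         pixelAttrs as CFDictionary, // buffer attrs
                                         &pool)
    return status == kCVReturnSuccess ? pool : nil
}

func dequeue(from pool: CVPixelBufferPool) -> CVPixelBuffer? {
    var buffer: CVPixelBuffer?
    CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &buffer)
    return buffer
}
```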
3
votes
0 answers

How to create several mp4 files with AVAssetWriter at the same time

I am trying to save four video streams with AVAssetWriter on the iPhone as .mp4. With three streams everything works fine, but the 4th mp4 file is always empty. Here is a piece of my code: -(void)writeImagesToMovie:(CVPixelBufferRef) buffer :(int)…
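A hedged sketch of the per-stream setup (names hypothetical): each stream needs its own writer/input pair, and inputs cannot be shared between writers. If the fourth simultaneous recording comes up empty, one hypothesis worth ruling out first is a device cap on concurrent hardware H.264 encode sessions.

```swift
import AVFoundation

// Sketch (names hypothetical): one writer/input pair per stream. An empty
// fourth file is consistent with hitting a device limit on concurrent
// hardware H.264 encode sessions, which is worth testing before debugging
// the writer code itself.
func makeStreamWriter(url: URL, width: Int, height: Int) throws
        -> (AVAssetWriter, AVAssetWriterInput) {
    let writer = try AVAssetWriter(outputURL: url, fileType: .mp4)
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: width,
        AVVideoHeightKey: height,
    ])
    input.expectsMediaDataInRealTime = true
    writer.add(input)
    return (writer, input)
}
```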
3
votes
0 answers

invalid data bytes/row: CGBitmapContextCreate: CGContextDrawImage: invalid context 0x0

I'm trying to convert an array of images into a video file. In the process I have to fill a pixel buffer from the selected images. Here is the code snippet: CVPixelBufferLockBaseAddress(pixelBuffer, 0) let pixelData =…
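The `invalid context 0x0` message means `CGContext(data:…)` returned nil, most often because `bytesPerRow` is not the buffer's real stride or the colorspace/bitmapInfo pair is one Core Graphics won't accept. A sketch for a 32BGRA buffer (format assumed, not from the question):

```swift
import CoreGraphics
import CoreVideo

// Sketch for a 32BGRA buffer: "invalid context 0x0" means this CGContext
// init returned nil; the usual culprits are a bytesPerRow that isn't the
// buffer's real stride or a colorspace/bitmapInfo pair CG rejects.
func fill(_ pixelBuffer: CVPixelBuffer, with image: CGImage) {
    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }

    guard let context = CGContext(
        data: CVPixelBufferGetBaseAddress(pixelBuffer),
        width: CVPixelBufferGetWidth(pixelBuffer),
        height: CVPixelBufferGetHeight(pixelBuffer),
        bitsPerComponent: 8,
        bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer), // real stride
        space: CGColorSpaceCreateDeviceRGB(),
        bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
            | CGBitmapInfo.byteOrder32Little.rawValue          // BGRA layout
    ) else {
        return // nil context: log the parameters above rather than drawing
    }
    context.draw(image, in: CGRect(x: 0, y: 0,
                                   width: context.width,
                                   height: context.height))
}
```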
3
votes
1 answer

I want to convert a CIImage to a UIImage but it fails

I have a CIImage with format kCVPixelFormatType_420YpCbCr8BiPlanarFullRange (420f or NV12). Now I want to see what the image looks like, so I convert it to a UIImage using this method: CIImage *ciImage = [CIImage…
Klein Mioke
  • 1,261
  • 2
  • 14
  • 23
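A common reason this conversion "fails" is that `UIImage(ciImage:)` never forces a render of the biplanar data; going through a `CIContext` does. A sketch (the shared-context name is illustrative):

```swift
import CoreImage

// Sketch: UIImage(ciImage:) can show nothing because a CIImage is a recipe,
// not pixels; rendering through a CIContext materializes it and converts
// the 420f (NV12) planes to RGB. Reuse one context — they're expensive.
let sharedCIContext = CIContext()

func renderToCGImage(_ image: CIImage) -> CGImage? {
    sharedCIContext.createCGImage(image, from: image.extent)
}
// On iOS, wrap the result: renderToCGImage(ciImage).map { UIImage(cgImage: $0) }
```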
3
votes
1 answer

The contiguity of a YUV-NV12 video buffer's Y plane and UV plane

I've got a (CMSampleBufferRef)imageBuffer of type YUV NV12 (4:2:0). Now I run the following code and find the result confusing. UInt8 *baseSRC = (UInt8 *)CVPixelBufferGetBaseAddress(imageBuffer); UInt8 *yBaseSRC = (UInt8…
Li Fumin
  • 1,383
  • 2
  • 15
  • 31
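The confusing result is expected: the Y and UV planes are not guaranteed to be contiguous, since each plane has its own base address and row padding. A sketch that checks this with the per-plane accessors:

```swift
import CoreVideo

// Sketch: NV12 planes each have their own base address and row padding,
// so there may be a gap between the end of Y and the start of UV; address
// the planes with the per-plane accessors, never by offsetting plane 0.
func planesAreContiguous(_ buffer: CVPixelBuffer) -> Bool {
    CVPixelBufferLockBaseAddress(buffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(buffer, .readOnly) }

    let yBase  = CVPixelBufferGetBaseAddressOfPlane(buffer, 0)!
    let uvBase = CVPixelBufferGetBaseAddressOfPlane(buffer, 1)!
    let yEnd = yBase + CVPixelBufferGetBytesPerRowOfPlane(buffer, 0)
                     * CVPixelBufferGetHeightOfPlane(buffer, 0)
    return yEnd == uvBase
}
```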
3
votes
1 answer

Why won't AVFoundation accept my planar pixel buffers on an iOS device?

I've been struggling to figure out what the problem is with my code. I'm creating a planar CVPixelBufferRef to write to an AVAssetWriter. This pixel buffer is created manually through some other process (i.e., I'm not getting these samples from the…
CIFilter
  • 8,647
  • 4
  • 46
  • 66
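One hypothesis that fits this symptom: buffers from `CVPixelBufferCreateWithPlanarBytes` wrap caller-owned memory and are not IOSurface-backed, which the on-device encoder path tends to reject. A sketch that lets CoreVideo allocate the planes instead (you then copy your plane data in):

```swift
import CoreVideo

// Sketch of one hypothesis: letting CoreVideo allocate an IOSurface-backed
// NV12 buffer, instead of wrapping your own memory with
// CVPixelBufferCreateWithPlanarBytes, usually satisfies AVAssetWriter.
func makeEncoderFriendlyNV12(width: Int, height: Int) -> CVPixelBuffer? {
    let attrs = [kCVPixelBufferIOSurfacePropertiesKey: [:] as CFDictionary]
    var buffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(
        kCFAllocatorDefault, width, height,
        kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
        attrs as CFDictionary, &buffer)
    return status == kCVReturnSuccess ? buffer : nil
}
```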
3
votes
0 answers

SpriteKit -- can I get underlying pixel buffers?

I want to get access to the underlying OpenGL context of a SpriteKit scene so I can do a glReadPixels on it at 30/60 fps in order to obtain an RGB32 pixel-buffer representation of the scene as it's rendered. Ideally, I'd love access to something…
zzyzy
  • 973
  • 6
  • 21
2
votes
0 answers

how to convert CIFilter output to CMSampleBuffer

I want to apply a filter to a CMSampleBuffer using CIFilter, then convert it back to a CMSampleBuffer. I have a filter like this: let filter = YUCIHighPassSkinSmoothing() filter.inputImage = CIImage(cvImageBuffer: pixelBufferFromCMSampleBuffer) …
famfamfam
  • 396
  • 3
  • 8
  • 31
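A sketch of one way to round-trip (names illustrative): render the filter output back into the sample buffer's own pixel buffer, then rewrap it with the original timing info.

```swift
import CoreImage
import CoreMedia

// Sketch (names illustrative): overwrite the sample buffer's own pixel
// buffer with the filter output, then rewrap it with the original timing.
let filterContext = CIContext()

func replacingImage(of sampleBuffer: CMSampleBuffer,
                    with filtered: CIImage) -> CMSampleBuffer? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
        return nil
    }
    filterContext.render(filtered, to: pixelBuffer) // render in place

    var timing = CMSampleTimingInfo()
    CMSampleBufferGetSampleTimingInfo(sampleBuffer, at: 0, timingInfoOut: &timing)

    var format: CMVideoFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(
        allocator: kCFAllocatorDefault, imageBuffer: pixelBuffer,
        formatDescriptionOut: &format)

    var output: CMSampleBuffer?
    CMSampleBufferCreateReadyWithImageBuffer(
        allocator: kCFAllocatorDefault, imageBuffer: pixelBuffer,
        formatDescription: format!, sampleTiming: &timing,
        sampleBufferOut: &output)
    return output
}
```

Note the render writes into the capture buffer in place; if the original frame must survive, render into a separate (pooled) pixel buffer instead.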
2
votes
0 answers

Memory leak converting CVPixelBuffer to UIImage

I am encountering a memory-usage problem in my app that occasionally causes it to crash. I noticed from the memory inspector that it is caused by converting the CVPixelBuffer (from my camera) to UIImage. Below you can see the…
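Two usual suspects in this pattern are a fresh `CIContext` per frame and autoreleased temporaries piling up until the run loop drains. A hedged sketch of the conversion with both addressed:

```swift
import CoreImage
import CoreVideo
import Foundation

// Sketch: one long-lived context plus an explicit autoreleasepool per
// conversion typically flattens the memory curve when converting every
// camera frame.
let conversionContext = CIContext()

func cgImage(from pixelBuffer: CVPixelBuffer) -> CGImage? {
    autoreleasepool { () -> CGImage? in
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        return conversionContext.createCGImage(ciImage, from: ciImage.extent)
    }
}
// On iOS, wrap the result in UIImage(cgImage:) outside the hot path if needed.
```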
2
votes
0 answers

Merge CAShapeLayer into CVPixelBuffer

I'm capturing the output of a playing video using AVPlayerItemVideoOutput.copyPixelBuffer. I'm able to convert the pixel buffer into a CIImage, then render it back into a pixel buffer again, and then an AVAssetWriter writes the buffer stream out to a…
Steve Macdonald
  • 1,745
  • 2
  • 20
  • 34
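An alternative that skips the CIImage round trip: draw the layer straight into the locked pixel buffer through a CGContext over its base address (32BGRA assumed). Render the video frame into the buffer first, then overlay the layer.

```swift
import QuartzCore
import CoreVideo

// Sketch (32BGRA assumed): draw the layer directly into the locked pixel
// buffer via a CGContext over its base address — no CIImage round trip.
func render(_ layer: CALayer, into pixelBuffer: CVPixelBuffer) {
    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }

    guard let ctx = CGContext(
        data: CVPixelBufferGetBaseAddress(pixelBuffer),
        width: CVPixelBufferGetWidth(pixelBuffer),
        height: CVPixelBufferGetHeight(pixelBuffer),
        bitsPerComponent: 8,
        bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
        space: CGColorSpaceCreateDeviceRGB(),
        bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
            | CGBitmapInfo.byteOrder32Little.rawValue) else { return }

    // Core Graphics draws with a bottom-left origin; flip so the layer
    // lands the right way up over the video frame.
    ctx.translateBy(x: 0, y: CGFloat(ctx.height))
    ctx.scaleBy(x: 1, y: -1)
    layer.render(in: ctx)
}
```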
2
votes
2 answers

LiDAR Depth + Vision Hand Tracking for 3D Hand Tracking

I want to use Vision 2D hand tracking input coupled with ARKit > People Occlusion > Body Segmentation With Depth, which leverages LiDAR, to get the 3D world coordinates of the tip of the index finger. Steps I am doing: 1 - The 2D screen location of the finger…
Oscar Falmer
  • 1,771
  • 1
  • 24
  • 38
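A sketch of the 2D-point-to-depth step, under two assumptions: Vision's normalized points use a lower-left origin, and the depth map is `kCVPixelFormatType_DepthFloat32` (one Float32 per pixel). The vertical flip and the clamp are the usual gotchas.

```swift
import CoreGraphics
import CoreVideo

// Sketch: sample the LiDAR depth map at a Vision-normalized point.
// Assumes lower-left-origin normalized coordinates and DepthFloat32 data.
func depth(at visionPoint: CGPoint, in depthMap: CVPixelBuffer) -> Float32 {
    CVPixelBufferLockBaseAddress(depthMap, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

    let w = CVPixelBufferGetWidth(depthMap)
    let h = CVPixelBufferGetHeight(depthMap)
    let x = min(w - 1, max(0, Int(visionPoint.x * CGFloat(w))))
    let y = min(h - 1, max(0, Int((1 - visionPoint.y) * CGFloat(h)))) // flip
    let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
    let base = CVPixelBufferGetBaseAddress(depthMap)!
    return (base + y * rowBytes + x * MemoryLayout<Float32>.stride)
        .assumingMemoryBound(to: Float32.self).pointee
}
```

With the depth (in meters) at the fingertip, unprojecting to world space via the camera intrinsics/transform is the remaining, device-specific step.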
2
votes
0 answers

iOS camera cpu usage

I've written a custom camera. I get a pixelBuffer from the AVCaptureVideoDataOutput delegate, create an MTLTexture from it, and display it using Metal. I get ~20% CPU usage. Apple's public AVCamFilter project, built on the same principle, has the same CPU…
2
votes
0 answers

Deep copy CVPixelBuffer for depth data in Swift

I'm getting a stream of depth data from AVCaptureSynchronizedDataCollection and trying to do some processing on the depthDataMap asynchronously. I tried to deep copy the CVPixelBuffer since I don't want to block the camera while processing, but it…
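A sketch of a plane-by-plane deep copy: allocate a destination with identical geometry and format, then copy row by row, since the two buffers' strides are not guaranteed to match.

```swift
import CoreVideo
import Foundation

// Sketch: duplicate a pixel buffer so the original can go back to the
// camera. Depth maps (DepthFloat32) are non-planar and take the else branch.
func deepCopy(_ src: CVPixelBuffer) -> CVPixelBuffer? {
    var out: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault,
                        CVPixelBufferGetWidth(src),
                        CVPixelBufferGetHeight(src),
                        CVPixelBufferGetPixelFormatType(src),
                        nil, &out)
    guard let dst = out else { return nil }

    CVPixelBufferLockBaseAddress(src, .readOnly)
    CVPixelBufferLockBaseAddress(dst, [])
    defer {
        CVPixelBufferUnlockBaseAddress(dst, [])
        CVPixelBufferUnlockBaseAddress(src, .readOnly)
    }

    // Row-by-row copy: source and destination strides can differ.
    func copyRows(_ s0: UnsafeMutableRawPointer, _ d0: UnsafeMutableRawPointer,
                  _ sStride: Int, _ dStride: Int, _ rows: Int) {
        var s = s0, d = d0
        for _ in 0..<rows {
            memcpy(d, s, min(sStride, dStride))
            s += sStride
            d += dStride
        }
    }

    if CVPixelBufferIsPlanar(src) {
        for plane in 0..<CVPixelBufferGetPlaneCount(src) {
            copyRows(CVPixelBufferGetBaseAddressOfPlane(src, plane)!,
                     CVPixelBufferGetBaseAddressOfPlane(dst, plane)!,
                     CVPixelBufferGetBytesPerRowOfPlane(src, plane),
                     CVPixelBufferGetBytesPerRowOfPlane(dst, plane),
                     CVPixelBufferGetHeightOfPlane(src, plane))
        }
    } else {
        copyRows(CVPixelBufferGetBaseAddress(src)!,
                 CVPixelBufferGetBaseAddress(dst)!,
                 CVPixelBufferGetBytesPerRow(src),
                 CVPixelBufferGetBytesPerRow(dst),
                 CVPixelBufferGetHeight(src))
    }
    return dst
}
```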
2
votes
2 answers

Do I have to lock a CVPixelBuffer produced from AVCaptureVideoDataOutput?

I have an AVCaptureVideoDataOutput producing CMSampleBuffer instances passed into my AVCaptureVideoDataOutputSampleBufferDelegate function. I want to efficiently convert the pixel buffers into CGImage instances for use elsewhere in my app. I have…
willbattel
  • 1,040
  • 1
  • 10
  • 37
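Short answer per the Core Video docs: yes, lock before any CPU access, and pass `.readOnly` when only reading so Core Video can skip a writeback on unlock; GPU-only paths (e.g. a Metal texture cache) don't need the lock. A small wrapper sketch (name illustrative):

```swift
import CoreVideo

// Sketch: scope the lock to the CPU read. Anything returned from the
// closure must not retain the raw pointer past the unlock.
func withLockedBytes<T>(_ buffer: CVPixelBuffer,
                        _ body: (UnsafeRawPointer, Int) -> T) -> T? {
    guard CVPixelBufferLockBaseAddress(buffer, .readOnly) == kCVReturnSuccess
    else { return nil }
    defer { CVPixelBufferUnlockBaseAddress(buffer, .readOnly) }
    guard let base = CVPixelBufferGetBaseAddress(buffer) else { return nil }
    return body(base, CVPixelBufferGetBytesPerRow(buffer))
}
```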