Questions tagged [cvpixelbuffer]
159 questions
0
votes
0 answers
Manipulate CVPixelBuffer, vImage, CGImage and UIImage for a rotation
From the AR session, I get the current frame with self.sceneView.session.currentFrame?.capturedImage, so I get a CVPixelBuffer with my image information.
I followed this link to convert my CVPixelBuffer to a CGImage. (I previously used the createCGImage method…

A.Ozda
- 39
- 9
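A minimal sketch of the usual Core Image route for this kind of conversion (the helper name and the default .right orientation are my own illustrations; a real app should create one CIContext and reuse it across frames):

```swift
import CoreImage
import CoreVideo
import ImageIO

// Hypothetical helper: convert a CVPixelBuffer (e.g. from ARFrame.capturedImage)
// into a CGImage via Core Image, applying a rotation on the way.
func makeCGImage(from pixelBuffer: CVPixelBuffer,
                 orientation: CGImagePropertyOrientation = .right) -> CGImage? {
    // Wrap the pixel buffer; .oriented() rotates without copying pixel data.
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer).oriented(orientation)
    let context = CIContext() // reuse this in production code
    return context.createCGImage(ciImage, from: ciImage.extent)
}
```

With .right the output dimensions are swapped relative to the input buffer, which is typically what you want for a portrait-orientation ARKit frame.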
0
votes
1 answer
Video filtering code works on iOS, but not on macOS
I am working on a video filter for iOS and macOS, which captures video input from the default camera, applies a filter (MPSImageGaussianBlur), and renders it using MTKView.
It works fine on iOS (13 on iPhone 6s and iPhone 11), but I see just a red…

Satoshi Nakajima
- 1,863
- 18
- 29
0
votes
1 answer
How to iterate over pixels in a YUV NV12 buffer from camera and set color in Obj-c?
I need to iterate over the pixels of a YUV NV12 buffer and set their color. I think the conversion for the NV12 format should be easy, but I can't figure it out. If I could set the top 50x50 pixels at 0,0 to white, I'd be set. Thank you in advance.

znelson
- 919
- 1
- 10
- 24
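A sketch in Swift rather than Obj-C, but the plane arithmetic is identical. It assumes a video-range NV12 buffer (kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange), where white is Y=235, Cb=Cr=128; the function name and the 50-pixel default are illustrative:

```swift
import CoreVideo

// Paint the top-left side x side region of an NV12 buffer white.
// NV12 has two planes: full-resolution Y, and half-resolution interleaved CbCr.
func paintWhiteSquare(in pixelBuffer: CVPixelBuffer, side: Int = 50) {
    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }

    // Plane 0: luma (Y), one byte per pixel. Video-range white luma is 235.
    let yBase = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0)!
        .assumingMemoryBound(to: UInt8.self)
    let yStride = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0)
    for row in 0..<side {
        for col in 0..<side {
            yBase[row * yStride + col] = 235
        }
    }

    // Plane 1: interleaved CbCr at half resolution; 128 is the neutral chroma value.
    let uvBase = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1)!
        .assumingMemoryBound(to: UInt8.self)
    let uvStride = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1)
    for row in 0..<(side / 2) {
        for col in 0..<(side / 2) {
            uvBase[row * uvStride + col * 2] = 128     // Cb
            uvBase[row * uvStride + col * 2 + 1] = 128 // Cr
        }
    }
}
```

Note the use of bytes-per-row rather than width when computing offsets: pixel buffers are often row-padded.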
0
votes
1 answer
Is there a way to utilize Swift commands within a low-level C method call?
I am maintaining a set of data obtained from an AVCaptureSynchronizedData. One of the methods I use modifies the CVPixelBuffers obtained from the AVCaptureSynchronizedData. While modifying the CVPixelBuffer, I create a copy of the CVPixelBuffer via…

impression7vx
- 1,728
- 1
- 20
- 50
0
votes
1 answer
Change Buffer format of DJIVideoPreviewer
I am creating an app that takes video frames from a DJI aircraft and runs them through a TensorFlow Lite object detection model.
I managed to get my app to receive frames from the aircraft.
However, the frame type is VPFrameTypeYUV420Planer. I want to…

中山雄貴
- 1
- 1
0
votes
1 answer
Convert pixelBuffer(kCVPixelFormatType_420YpCbCr8Planar) to CIImage
I am trying to convert a pixelBuffer to a CIImage, but it fails.
When the pixelBuffer type is kCVPixelFormatType_420YpCbCr8BiPlanarFullRange, the following code executes without errors.
let sourceImage = CIImage.init(cvPixelBuffer: imageBuffer,…

中山雄貴
- 1
- 1
0
votes
1 answer
What is the difference between CVPixelBufferGetWidth(pixelbuffer) and CVPixelBufferGetWidthOfPlane(pixelbuffer, 0)?
Is there any difference between CVPixelBufferGetWidth(pixelbuffer) and CVPixelBufferGetWidthOfPlane(pixelbuffer, 0), or between CVPixelBufferGetHeight(pixelbuffer) and CVPixelBufferGetHeightOfPlane(pixelbuffer, 0)?
I searched the wiki, and it says:
As with most Y′UV…

leizh007
- 71
- 1
- 7
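A short experiment suggests the answer for 4:2:0 biplanar buffers: plane 0 is the full-resolution luma plane, so the plane-less getters match plane 0, while plane 1 (the interleaved CbCr plane) is half the size in each dimension:

```swift
import CoreVideo

// Create a 1920x1080 NV12-style buffer and compare the whole-buffer getters
// against the per-plane getters.
var pb: CVPixelBuffer?
CVPixelBufferCreate(kCFAllocatorDefault, 1920, 1080,
                    kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, nil, &pb)
let buffer = pb!

print(CVPixelBufferGetWidth(buffer))            // full buffer width
print(CVPixelBufferGetWidthOfPlane(buffer, 0))  // luma plane: same as above
print(CVPixelBufferGetWidthOfPlane(buffer, 1))  // chroma plane: half width (4:2:0)
print(CVPixelBufferGetHeightOfPlane(buffer, 1)) // chroma plane: half height
```

For a non-planar format (e.g. 32BGRA) there is only one plane, and the plane-0 getters again agree with the plane-less ones.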
0
votes
1 answer
Does it matter which pixel format type we choose for AVCaptureSession to run CoreML model using Vision?
I am using Apple's sample app Breakfast Finder, which does detection and uses a biplanar YUV pixel format type. I changed it to ARGB, and it runs on my own model trained with Turi Create. Now I have no idea whether changing the pixel format type…

user2096064
- 108
- 1
- 9
0
votes
0 answers
How to convert image to cvPixelImage in coreml Swift 4
I want to convert an image to the model's input image size. I'm getting the images using an API; here is the code.
let model = GoogLeNetPlaces()
var selectImage: BaseModel!
override func viewDidLoad() {
    super.viewDidLoad()
    let url = URL(string:…

Prithvi Raj
- 11
- 7
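A hedged sketch of the common CGImage-to-CVPixelBuffer conversion (the helper name and the 224x224 default are assumptions based on GoogLeNetPlaces' usual input size; adjust to your model's declared input):

```swift
import CoreGraphics
import CoreVideo

// Hypothetical helper: render a CGImage, scaled, into a 32ARGB CVPixelBuffer
// of the model's input size.
func pixelBuffer(from image: CGImage,
                 width: Int = 224, height: Int = 224) -> CVPixelBuffer? {
    var pb: CVPixelBuffer?
    let attrs: [String: Any] = [
        kCVPixelBufferCGImageCompatibilityKey as String: true,
        kCVPixelBufferCGBitmapContextCompatibilityKey as String: true
    ]
    guard CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                              kCVPixelFormatType_32ARGB,
                              attrs as CFDictionary, &pb) == kCVReturnSuccess,
          let buffer = pb else { return nil }

    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }

    // Draw straight into the pixel buffer's backing memory.
    guard let context = CGContext(data: CVPixelBufferGetBaseAddress(buffer),
                                  width: width, height: height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)
    else { return nil }
    context.draw(image, in: CGRect(x: 0, y: 0, width: width, height: height))
    return buffer
}
```

Alternatively, on recent SDKs, Vision's VNCoreMLRequest can take the image directly and handle scaling and format conversion for you.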
0
votes
0 answers
Why can't I read from more than 12 CVPixelBuffers at a time?
I'm making a video effect on iOS (using Metal) that requires accessing pixel data from the current video frame as well as some number of previous frames. To do this I'm storing pixel buffers in an Array property that behaves like a stack. When…

Ian Pearce
- 143
- 1
- 10
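One commonly cited explanation is that the capture pipeline recycles a small, fixed pool of buffers, so retaining more than that many frames stalls the camera. A sketch of the usual workaround, deep-copying each frame before pushing it onto the stack (the helper name is mine; planar formats would need a per-plane copy):

```swift
import CoreVideo
import Foundation

// Deep-copy a non-planar CVPixelBuffer so the driver's own buffer can be
// returned to its pool immediately.
func copyPixelBuffer(_ src: CVPixelBuffer) -> CVPixelBuffer? {
    var dst: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault,
                        CVPixelBufferGetWidth(src),
                        CVPixelBufferGetHeight(src),
                        CVPixelBufferGetPixelFormatType(src),
                        nil, &dst)
    guard let copy = dst else { return nil }

    CVPixelBufferLockBaseAddress(src, .readOnly)
    CVPixelBufferLockBaseAddress(copy, [])
    defer {
        CVPixelBufferUnlockBaseAddress(copy, [])
        CVPixelBufferUnlockBaseAddress(src, .readOnly)
    }
    guard let srcBase = CVPixelBufferGetBaseAddress(src),
          let dstBase = CVPixelBufferGetBaseAddress(copy) else { return nil }

    // Copy row by row, since the two buffers may have different row padding.
    let srcBPR = CVPixelBufferGetBytesPerRow(src)
    let dstBPR = CVPixelBufferGetBytesPerRow(copy)
    let rowBytes = min(srcBPR, dstBPR)
    for row in 0..<CVPixelBufferGetHeight(src) {
        memcpy(dstBase + row * dstBPR, srcBase + row * srcBPR, rowBytes)
    }
    return copy
}
```

Allocating the copies from your own CVPixelBufferPool would avoid per-frame allocation overhead.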
0
votes
0 answers
Getting a runtime exception when giving a coreML model an image
I am trying to learn Apple's CoreML framework, and to do so I have created a very simple CoreML model that will tell whether an image is showing an apple or a banana. To do so, I have an image of an apple in the Assets.xcassets directory and when I…

J. Doe
- 1
- 1
0
votes
1 answer
CMSampleBufferGetImageBuffer return nil for captured JPEG stillImage
I am working on capturing the Mac screen in JPEG format, and then getting the pixelBuffer and imageBuffer of the captured JPEG sample buffer.
But the pixelBuffer is always nil, while when I convert the JPEG buffer to an NSImage, the image can be obtained and displayed…

onTheWay
- 128
- 1
- 8
0
votes
1 answer
TokBox: 'consumeFrame' crashes when using a modified pixel buffer
I'm trying to modify the pixel buffer from a live AVFoundation video feed to stream through OpenTok's API. But whenever I try to do so and feed it through OpenTok's consumeFrame, it crashes.
I am doing this so I can apply different live video…

thorng
- 93
- 1
- 4
0
votes
0 answers
iOS - AVAssetWriter outputting a file of 0 size and not giving errors
I'm still very new to Swift (and programming) and I'm trying to output the CVPixelBuffer I get from ARFrame to a video in real time (without the AR stuff on top).
I've set up the AVAssetWriter and Input, and on each frame I try to append the…
0
votes
1 answer
Retrieving CVPixelBuffer from AVCapturePhotoDelegate methods
I would like to obtain a pixelBuffer from the didFinishProcessingPhoto delegate method, but it's nil.
func capturePhoto() {
    let format = [AVVideoCodecKey: AVVideoCodecType.jpeg]
    let settings = AVCapturePhotoSettings(format: format)
    …

mikro098
- 2,173
- 2
- 32
- 48
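A plausible explanation, stated as an assumption: requesting AVVideoCodecType.jpeg produces compressed photo data, so photo.pixelBuffer is nil; requesting an uncompressed pixel-buffer format instead should make it non-nil (the helper name is illustrative):

```swift
import AVFoundation

// Build AVCapturePhotoSettings that ask for uncompressed 32BGRA output,
// so the delivered AVCapturePhoto carries a pixelBuffer instead of JPEG data.
func uncompressedPhotoSettings() -> AVCapturePhotoSettings {
    let format: [String: Any] =
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    return AVCapturePhotoSettings(format: format)
}
```

In practice you would check the photo output's availablePhotoPixelFormatTypes first and pick a supported format from that list.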