Questions tagged [cvpixelbuffer]
159 questions
4
votes
1 answer
Correct way to draw/edit a CVPixelBuffer in Swift in iOS
Is there a standard, performant way to edit/draw on a CVImageBuffer/CVPixelBuffer in Swift?
All the video editing demos I've found online overlay the drawing (rectangles or text) on the screen and don't directly edit the CVPixelBuffer.
UPDATE I tried…

bias
- 1,467
- 3
- 19
- 39
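A common approach for the question above is to lock the buffer's base address and wrap its memory in a CGContext, then draw with Core Graphics. A minimal sketch, assuming a kCVPixelFormatType_32BGRA buffer (the helper name draw(into:drawing:) is just for illustration):

```swift
import CoreGraphics
import CoreVideo

// Sketch: draw directly into a BGRA CVPixelBuffer by wrapping its memory in a CGContext.
// Assumes kCVPixelFormatType_32BGRA; the helper name is illustrative.
func draw(into pixelBuffer: CVPixelBuffer, drawing: (CGContext) -> Void) {
    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }

    guard let base = CVPixelBufferGetBaseAddress(pixelBuffer),
          let context = CGContext(data: base,
                                  width: CVPixelBufferGetWidth(pixelBuffer),
                                  height: CVPixelBufferGetHeight(pixelBuffer),
                                  bitsPerComponent: 8,
                                  bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue |
                                              CGBitmapInfo.byteOrder32Little.rawValue)  // BGRA layout
    else { return }

    drawing(context)  // e.g. context.setFillColor(UIColor.red.cgColor); context.fill(someRect)
}
```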
4
votes
0 answers
How to get image data based on ARFaceAnchor or node position?
I'm trying to create a 2D face unwrap image from a live camera feed from an ARKit session.
I see there is another question about mapping an image onto a mesh. My question is different - it is about generating an image (or many smaller images) from…

Alex Stone
- 46,408
- 55
- 231
- 407
4
votes
1 answer
RGB values of CIImage pixel
I want to access the average colour value of a specific area of a CVPixelBuffer that I get from ARFrame in real time. I managed to crop the image, use a filter to calculate the average colour, and after converting to CGImage I get the value from the pixel…

RealUglyDuck
- 338
- 2
- 13
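For the question above, the usual route is the CIAreaAverage filter rendered into a 1x1 bitmap. A sketch, with an illustrative helper name, and with the caveat that the CIContext should be created once and reused rather than per frame:

```swift
import CoreImage
import CoreVideo

// Sketch: read the average colour of a region of a CVPixelBuffer with CIAreaAverage.
// The region is expressed in CIImage coordinates (origin at the bottom-left).
func averageColor(of pixelBuffer: CVPixelBuffer,
                  in region: CGRect,
                  context: CIContext) -> (r: UInt8, g: UInt8, b: UInt8, a: UInt8)? {
    let image = CIImage(cvPixelBuffer: pixelBuffer)
    guard let filter = CIFilter(name: "CIAreaAverage",
                                parameters: [kCIInputImageKey: image,
                                             kCIInputExtentKey: CIVector(cgRect: region)]),
          let output = filter.outputImage else { return nil }

    // CIAreaAverage produces a 1x1 image; read that single pixel.
    var pixel = [UInt8](repeating: 0, count: 4)
    context.render(output,
                   toBitmap: &pixel,
                   rowBytes: 4,
                   bounds: CGRect(x: 0, y: 0, width: 1, height: 1),
                   format: .RGBA8,
                   colorSpace: CGColorSpaceCreateDeviceRGB())
    return (pixel[0], pixel[1], pixel[2], pixel[3])
}
```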
4
votes
1 answer
UIImage obtaining CVPixelBuffer Removes Alpha
The function below takes in a UIImage and returns a CVPixelBuffer from it, but it removes the alpha channel.
class func pixelBufferFromImage(image: UIImage, pixelBufferPool: CVPixelBufferPool, size: CGSize) -> CVPixelBuffer {
var…

impression7vx
- 1,728
- 1
- 20
- 50
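The alpha channel generally survives only if both the pixel buffer format and the CGContext's alpha info carry alpha. A sketch along those lines, assuming a 32BGRA buffer and premultiplied alpha (the helper name is illustrative):

```swift
import UIKit
import CoreVideo

// Sketch: render a UIImage into a CVPixelBuffer while keeping the alpha channel.
// Both the buffer format (32BGRA) and the context alpha info (premultipliedFirst) carry alpha.
func pixelBufferKeepingAlpha(from image: UIImage, size: CGSize) -> CVPixelBuffer? {
    let attrs: [CFString: Any] = [kCVPixelBufferCGImageCompatibilityKey: true,
                                  kCVPixelBufferCGBitmapContextCompatibilityKey: true]
    var buffer: CVPixelBuffer?
    guard CVPixelBufferCreate(kCFAllocatorDefault,
                              Int(size.width), Int(size.height),
                              kCVPixelFormatType_32BGRA,
                              attrs as CFDictionary,
                              &buffer) == kCVReturnSuccess,
          let pixelBuffer = buffer,
          let cgImage = image.cgImage else { return nil }

    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }

    guard let context = CGContext(data: CVPixelBufferGetBaseAddress(pixelBuffer),
                                  width: Int(size.width), height: Int(size.height),
                                  bitsPerComponent: 8,
                                  bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue |
                                              CGBitmapInfo.byteOrder32Little.rawValue)  // alpha kept, not skipped
    else { return nil }

    context.draw(cgImage, in: CGRect(origin: .zero, size: size))
    return pixelBuffer
}
```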
4
votes
2 answers
Retrieve the last frame of live camera preview in swift
I have an AR app where the view constantly shows what the back camera sees and sends each frame for analysis to VisionRequest.
When an object is identified, I would like to capture that particular last frame and save it as a regular…

Ayrad
- 3,996
- 8
- 45
- 86
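One way to approach the question above is to keep a reference to the most recent frame in the capture callback and convert it to a UIImage only when Vision reports a match. A sketch, assuming an AVCaptureVideoDataOutput delegate (class and property names are illustrative):

```swift
import AVFoundation
import UIKit

// Sketch: keep the most recent camera frame so it can be saved once an object is recognised.
// Note: holding frames from the capture pool for long can stall the camera; copy if needed.
final class FrameKeeper: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let ciContext = CIContext()
    private(set) var lastFrame: CVPixelBuffer?

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        lastFrame = CMSampleBufferGetImageBuffer(sampleBuffer)
    }

    // Call this when the Vision request reports a match.
    func snapshotOfLastFrame() -> UIImage? {
        guard let pixelBuffer = lastFrame else { return nil }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else { return nil }
        return UIImage(cgImage: cgImage)
    }
}
```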
4
votes
0 answers
How to turn a CALayer into a CVPixelBuffer efficiently?
I want to find a way to replace the CALayer function "layer.render(in: context)" with something more efficient.
if let context = CGContext(data: pixelData, width: Int(contextSize.width), height: Int(contextSize.height), bitsPerComponent: 8,…

coderLee
- 41
- 2
4
votes
2 answers
CMSampleBuffer frame converted to vImage has wrong colors
I’m trying to convert a CMSampleBuffer from the camera output to vImage and later apply some processing. Unfortunately, even without any further editing, the frame I get from the buffer has the wrong colors:
Implementation (Memory management and errors are not…

Dawid
- 715
- 2
- 10
- 17
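Swapped colours at this stage usually mean the vImage_CGImageFormat handed to Accelerate does not match the BGRA layout of the camera buffer. A sketch of importing the buffer with an explicit BGRA description, assuming a kCVPixelFormatType_32BGRA video output and the iOS 13 Accelerate initializers:

```swift
import Accelerate
import CoreVideo

// Sketch: import a camera CVPixelBuffer into vImage, telling vImage the memory is 8-bit BGRA.
// The caller is responsible for free(buffer.data) when done with the returned buffer.
func makeVImageBuffer(from pixelBuffer: CVPixelBuffer) -> vImage_Buffer? {
    // Describe the layout we want in memory: 8-bit BGRA.
    guard var cgFormat = vImage_CGImageFormat(
        bitsPerComponent: 8,
        bitsPerPixel: 32,
        colorSpace: CGColorSpaceCreateDeviceRGB(),
        bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.first.rawValue |
                                           CGBitmapInfo.byteOrder32Little.rawValue)) else { return nil }

    // Describe how the pixel buffer itself is laid out.
    guard let cvFormat = vImageCVImageFormat_CreateWithCVPixelBuffer(pixelBuffer)?
        .takeRetainedValue() else { return nil }
    _ = vImageCVImageFormat_SetColorSpace(cvFormat, CGColorSpaceCreateDeviceRGB())

    var buffer = vImage_Buffer()
    let error = vImageBuffer_InitWithCVPixelBuffer(&buffer, &cgFormat, pixelBuffer,
                                                   cvFormat, nil, vImage_Flags(kvImageNoFlags))
    return error == kvImageNoError ? buffer : nil
}
```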
4
votes
2 answers
ios / CoreML - The input type is MultiArray when keras model is converted to CoreML
I am trying to train a Keras model and convert it to a Core ML model using Keras 1.2.2 and the TensorFlow backend. This is for a classification task. The input to Core ML is being shown as MultiArray. I need this to be Image or something like…

skr
- 914
- 3
- 18
- 35
4
votes
2 answers
Resize a CVPixelBuffer
I'm trying to resize a CVPixelBuffer to a size of 128x128. I'm working with one that is 750x750. I'm currently using the CVPixelBuffer to create a new CGImage, which I resize and then convert back into a CVPixelBuffer. Here is my code:
func…

enjoysturtles
- 61
- 1
- 6
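For the question above, one way to avoid the CGImage round trip is to scale with Core Image and render straight into a freshly created buffer. A sketch, assuming a BGRA buffer and a reusable CIContext (the helper name is illustrative):

```swift
import CoreImage
import CoreVideo

// Sketch: resize a CVPixelBuffer by scaling with Core Image and rendering into a new buffer.
// Reuse the CIContext; creating one per frame is expensive.
func resizePixelBuffer(_ source: CVPixelBuffer,
                       to size: CGSize,
                       context: CIContext) -> CVPixelBuffer? {
    let attrs: [CFString: Any] = [kCVPixelBufferCGImageCompatibilityKey: true,
                                  kCVPixelBufferCGBitmapContextCompatibilityKey: true]
    var output: CVPixelBuffer?
    guard CVPixelBufferCreate(kCFAllocatorDefault,
                              Int(size.width), Int(size.height),
                              kCVPixelFormatType_32BGRA,
                              attrs as CFDictionary,
                              &output) == kCVReturnSuccess,
          let resized = output else { return nil }

    let scaleX = size.width / CGFloat(CVPixelBufferGetWidth(source))
    let scaleY = size.height / CGFloat(CVPixelBufferGetHeight(source))
    let scaled = CIImage(cvPixelBuffer: source)
        .transformed(by: CGAffineTransform(scaleX: scaleX, y: scaleY))

    context.render(scaled, to: resized)
    return resized
}
```

With the sizes from the question this would be called as, for example, resizePixelBuffer(source, to: CGSize(width: 128, height: 128), context: sharedContext).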
4
votes
0 answers
create local movie out of `CVPixelBufferRef` which gets trimmed at the beginning after some time
I have lots of CVPixelBufferRefs that I would like to append to a movie in "real time", i.e. I get 50-60 CVPixelBufferRefs per second (as they are frames) and would like to create a local video out of them.
Even better would be if I could have…

swalkner
- 16,679
- 31
- 123
- 210
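The standard building blocks for the question above are AVAssetWriter plus AVAssetWriterInputPixelBufferAdaptor. A sketch that appends incoming buffers as they arrive, assuming roughly 60 fps and H.264 output (class name, timescale, and settings are illustrative; the trimming part of the question is not addressed here):

```swift
import AVFoundation

// Sketch: append incoming CVPixelBuffers to a local .mov file in real time.
final class MovieRecorder {
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor
    private var frameCount: Int64 = 0

    init(outputURL: URL, width: Int, height: Int) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
        input = AVAssetWriterInput(mediaType: .video,
                                   outputSettings: [AVVideoCodecKey: AVVideoCodecType.h264,
                                                    AVVideoWidthKey: width,
                                                    AVVideoHeightKey: height])
        input.expectsMediaDataInRealTime = true
        adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input,
                                                       sourcePixelBufferAttributes: nil)
        writer.add(input)
        writer.startWriting()
        writer.startSession(atSourceTime: .zero)
    }

    func append(_ pixelBuffer: CVPixelBuffer) {
        guard input.isReadyForMoreMediaData else { return }   // drop the frame if the writer is busy
        let time = CMTime(value: frameCount, timescale: 60)   // assumes ~60 fps input
        if adaptor.append(pixelBuffer, withPresentationTime: time) {
            frameCount += 1
        }
    }

    func finish(completion: @escaping () -> Void) {
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}
```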
4
votes
2 answers
iOS CVPixelBufferCreate leaking memory in swift 2
I'm trying to convert an image into a video, and the right way seems to be to use an AVAssetWriter with an AVAssetWriterInputPixelBufferAdaptor; it works well, but it leaks memory.
When I convert the CGImage to a CVPixelBuffer, I call…

Reuben Crimp
- 311
- 4
- 12
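One common mitigation for this kind of leak is to stop calling CVPixelBufferCreate per frame and instead draw into buffers taken from the adaptor's pixelBufferPool. A sketch, assuming the adaptor was created with source pixel buffer attributes and the writer session has already started (the pool is nil before that):

```swift
import AVFoundation
import CoreVideo

// Sketch: vend pixel buffers from the adaptor's pool instead of allocating one per frame.
func pooledPixelBuffer(from adaptor: AVAssetWriterInputPixelBufferAdaptor) -> CVPixelBuffer? {
    guard let pool = adaptor.pixelBufferPool else { return nil }
    var buffer: CVPixelBuffer?
    let status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &buffer)
    return status == kCVReturnSuccess ? buffer : nil
}
```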
4
votes
4 answers
- copyPixelBufferForItemTime:itemTimeForDisplay: null value
The issue I have is that when I compile my app with the iOS 9 SDK and my app tries to get a CVPixelBufferRef from an AVPlayerItemVideoOutput with the -copyPixelBufferForItemTime:itemTimeForDisplay: function, I get a null value from time to time when the video…

user1241006
- 181
- 2
- 10
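A frequently suggested guard for the question above is to ask the output whether a new frame actually exists for the requested time before copying. A sketch (the helper name is illustrative):

```swift
import AVFoundation
import QuartzCore

// Sketch: only copy a pixel buffer when the output has a new frame for the current item time;
// copyPixelBuffer(forItemTime:) returns nil when no new frame is available.
func currentPixelBuffer(from output: AVPlayerItemVideoOutput) -> CVPixelBuffer? {
    let itemTime = output.itemTime(forHostTime: CACurrentMediaTime())
    guard output.hasNewPixelBuffer(forItemTime: itemTime) else { return nil }
    return output.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: nil)
}
```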
3
votes
1 answer
Bounding Box from VNDetectRectangleRequest is not correct size when used as child VC
I am trying to use VNDetectRectangleRequest from Apple's Vision framework to automatically grab a picture of a card. However, when I convert the points to draw the rectangle, it is misshapen and does not follow the rectangle as it should. I have been…

user
- 105
- 1
- 11
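Vision's bounding boxes are normalized with a bottom-left origin, so they need scaling and a Y flip before being drawn in UIKit. A sketch, assuming the image fills the view's bounds exactly (no aspect-fit letterboxing, which the child view controller setup may introduce):

```swift
import Vision
import UIKit

// Sketch: convert a Vision observation's normalized bounding box into UIKit view coordinates.
func viewRect(for observation: VNRectangleObservation, in view: UIView) -> CGRect {
    let width = view.bounds.width
    let height = view.bounds.height
    // Scale the normalized rect up to the view's dimensions...
    let imageRect = VNImageRectForNormalizedRect(observation.boundingBox,
                                                 Int(width), Int(height))
    // ...then flip the Y axis: Vision's origin is bottom-left, UIKit's is top-left.
    return CGRect(x: imageRect.origin.x,
                  y: height - imageRect.origin.y - imageRect.height,
                  width: imageRect.width,
                  height: imageRect.height)
}
```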
3
votes
1 answer
How to convert UIImage to CVPixelBufferRef using YUV color space?
I am doing video recording. I need to snapshot a view to a UIImage and then convert it to a CVPixelBufferRef. It works fine with the RGBA color space, but the CVPixelBufferRef I need should use the YUV color space.
Anyone have any ideas? Thanks.
+…

guojing
- 129
- 14
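One possible route for the question above is to let Core Image perform the RGB-to-YCbCr conversion by rendering into a 420 bi-planar buffer; whether full range or video range is wanted depends on the consumer (e.g. the video encoder). A sketch under that assumption (the helper name is illustrative):

```swift
import UIKit
import CoreImage
import CoreVideo

// Sketch: produce a YUV (420 bi-planar, full range) pixel buffer from a UIImage by
// rendering through Core Image, which performs the RGB -> YCbCr conversion.
func yuvPixelBuffer(from image: UIImage, context: CIContext) -> CVPixelBuffer? {
    guard let cgImage = image.cgImage else { return nil }
    let attrs: [CFString: Any] = [kCVPixelBufferIOSurfacePropertiesKey: [CFString: Any]()]
    var buffer: CVPixelBuffer?
    guard CVPixelBufferCreate(kCFAllocatorDefault,
                              cgImage.width, cgImage.height,
                              kCVPixelFormatType_420YpCbCr8BiPlanarFullRange,
                              attrs as CFDictionary,
                              &buffer) == kCVReturnSuccess,
          let yuvBuffer = buffer else { return nil }

    context.render(CIImage(cgImage: cgImage), to: yuvBuffer)
    return yuvBuffer
}
```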
3
votes
1 answer
Reliable access and modify captured camera frames under SceneKit
I'm trying to add a B&W filter to the camera images of an ARSCNView, then render colored AR objects over it.
I'm almost there with the following code added to the beginning of - (void)renderer:(id)aRenderer…

diviaki
- 428
- 2
- 15
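A sketch of the kind of per-frame filtering the question above is attempting: wrap the captured buffer in a CIImage, drop its saturation, and render the result back into the same buffer. Whether ARSCNView then reliably picks up the modified buffer is exactly what the question is probing; the helper name and the reuse of a single CIContext are assumptions.

```swift
import ARKit
import CoreImage

// Sketch: desaturate a captured ARKit camera frame in place.
// Reuse the CIContext; creating one per frame is expensive.
func desaturate(_ frame: ARFrame, context: CIContext) {
    let pixelBuffer = frame.capturedImage   // 420f YCbCr buffer vended by ARKit
    let grayscale = CIImage(cvPixelBuffer: pixelBuffer)
        .applyingFilter("CIColorControls", parameters: [kCIInputSaturationKey: 0])
    context.render(grayscale, to: pixelBuffer)
}
```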