Questions tagged [cvpixelbuffer]

159 questions
0 votes • 1 answer

Swift: CVPixelBuffer video is darker than original image

My app takes a snapshot of a view, a ZStack (an image with opacity 0.4, a white rectangle with opacity 0.25, then text), and saves it as an image. It then lets the user generate a video from that image and some audio. I followed…
user2814778 • 296 • 6 • 14
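One common cause of darkened output at this step is rendering the snapshot into the pixel buffer without an explicit color space. A minimal sketch of that step, assuming a 32BGRA buffer (the helper name is illustrative, not from the question):

    import CoreVideo
    import CoreGraphics

    // Render a CGImage into a BGRA pixel buffer with an explicit sRGB space;
    // letting the context default to a device color space can shift brightness.
    func makePixelBuffer(from image: CGImage) -> CVPixelBuffer? {
        var buffer: CVPixelBuffer?
        let attrs = [kCVPixelBufferCGImageCompatibilityKey: true,
                     kCVPixelBufferCGBitmapContextCompatibilityKey: true] as CFDictionary
        CVPixelBufferCreate(kCFAllocatorDefault, image.width, image.height,
                            kCVPixelFormatType_32BGRA, attrs, &buffer)
        guard let pixelBuffer = buffer else { return nil }
        CVPixelBufferLockBaseAddress(pixelBuffer, [])
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }
        guard let context = CGContext(
            data: CVPixelBufferGetBaseAddress(pixelBuffer),
            width: image.width, height: image.height,
            bitsPerComponent: 8,
            bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
            space: CGColorSpace(name: CGColorSpace.sRGB)!,
            bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
                        | CGBitmapInfo.byteOrder32Little.rawValue) else { return nil }
        context.draw(image, in: CGRect(x: 0, y: 0, width: image.width, height: image.height))
        return pixelBuffer
    }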
0 votes • 0 answers

Video frames appear to be rendering with incorrect format using WebRTC

In my project, I use the Zoom Video SDK to receive frames in a YUV 420i format. I'm doing an experiment trying to render these frames using the WebRTC RTCMTLNSVideoView, but I seem to have a format conversion issue, as the frames rendered on the…
jnpdx • 45,847 • 6 • 64 • 94
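Not specific to the Zoom SDK or WebRTC; a sketch of repacking three I420 planes into the NV12-style biplanar CVPixelBuffer that Metal-backed views generally expect. The plane pointers and strides are assumed inputs:

    import CoreVideo
    import Foundation

    func makeNV12Buffer(y: UnsafePointer<UInt8>, u: UnsafePointer<UInt8>, v: UnsafePointer<UInt8>,
                        width: Int, height: Int, yStride: Int, chromaStride: Int) -> CVPixelBuffer? {
        var buffer: CVPixelBuffer?
        CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                            kCVPixelFormatType_420YpCbCr8BiPlanarFullRange, nil, &buffer)
        guard let pb = buffer else { return nil }
        CVPixelBufferLockBaseAddress(pb, [])
        defer { CVPixelBufferUnlockBaseAddress(pb, []) }
        // Copy the luma plane row by row (destination rows may be padded).
        let dstY = CVPixelBufferGetBaseAddressOfPlane(pb, 0)!.assumingMemoryBound(to: UInt8.self)
        let dstYStride = CVPixelBufferGetBytesPerRowOfPlane(pb, 0)
        for row in 0..<height {
            memcpy(dstY + row * dstYStride, y + row * yStride, width)
        }
        // Interleave the separate U and V planes into the single CbCr plane.
        let dstUV = CVPixelBufferGetBaseAddressOfPlane(pb, 1)!.assumingMemoryBound(to: UInt8.self)
        let dstUVStride = CVPixelBufferGetBytesPerRowOfPlane(pb, 1)
        for row in 0..<(height / 2) {
            for col in 0..<(width / 2) {
                dstUV[row * dstUVStride + col * 2]     = u[row * chromaStride + col]
                dstUV[row * dstUVStride + col * 2 + 1] = v[row * chromaStride + col]
            }
        }
        return pb
    }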
0 votes • 0 answers

How to get a specific pixel color from a real-time camera CVPixelBuffer

I am using a CVPixelBuffer as the data from the camera, and I want to get a specific pixel, e.g. (0, 100), and its corresponding colour as RGB. I have tried multiple approaches, such as reading the data via the base address, with code as below: var address =…
7wiCey • 3 • 2
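A minimal sketch, assuming the capture output is configured for kCVPixelFormatType_32BGRA; the key detail is offsetting rows by bytesPerRow rather than width * 4:

    import CoreVideo

    func color(at x: Int, y: Int, in pixelBuffer: CVPixelBuffer) -> (r: UInt8, g: UInt8, b: UInt8)? {
        CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }
        guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return nil }
        let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
        let pixel = base.assumingMemoryBound(to: UInt8.self) + y * bytesPerRow + x * 4
        // BGRA byte order: blue, green, red, alpha.
        return (r: pixel[2], g: pixel[1], b: pixel[0])
    }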
0 votes • 1 answer

Objective-C: crop vImage PixelBuffer

How can I access the existing crop capabilities of vImage, which are only documented for Swift, from Objective-C? https://developer.apple.com/documentation/accelerate/vimage/pixelbuffer/3951652-cropped?changes=_7_1&language=objc (just for reference). I…
cguenther • 1,579 • 1 • 10 • 14
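The Swift-only vImage.PixelBuffer.cropped(to:) boils down to pointer arithmetic over a vImage_Buffer, and the C-level vImage_Buffer type is equally usable from Objective-C. A sketch of the same idea, shown in Swift against the C API and assuming an 8-bit, 4-channel buffer:

    import Accelerate
    import CoreGraphics

    // No pixels are copied: the result points into the source buffer and
    // keeps the source's row stride, exactly as the Swift-only API does.
    func cropped(_ src: vImage_Buffer, to rect: CGRect, bytesPerPixel: Int = 4) -> vImage_Buffer {
        let offset = Int(rect.origin.y) * src.rowBytes + Int(rect.origin.x) * bytesPerPixel
        return vImage_Buffer(data: src.data.advanced(by: offset),
                             height: vImagePixelCount(rect.height),
                             width: vImagePixelCount(rect.width),
                             rowBytes: src.rowBytes)
    }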
0 votes • 2 answers

How to achieve better performance in converting UIView to CVPixelBuffer?

I'm wondering if it's possible to achieve better performance when converting a UIView into a CVPixelBuffer. My app converts a sequence of UIViews first into UIImages and then into CVPixelBuffers, as shown below. In the end, I record all these…
aibek • 161 • 8
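One option worth trying is to drop the intermediate UIImage entirely and render the view's layer straight into a CGContext backed by the pixel buffer's memory; reusing buffers from a CVPixelBufferPool rather than allocating per frame also tends to help. A sketch, assuming a 32BGRA buffer:

    import UIKit
    import CoreVideo

    func render(_ view: UIView, into pixelBuffer: CVPixelBuffer) {
        CVPixelBufferLockBaseAddress(pixelBuffer, [])
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }
        guard let context = CGContext(
            data: CVPixelBufferGetBaseAddress(pixelBuffer),
            width: CVPixelBufferGetWidth(pixelBuffer),
            height: CVPixelBufferGetHeight(pixelBuffer),
            bitsPerComponent: 8,
            bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
            space: CGColorSpaceCreateDeviceRGB(),
            bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
                        | CGBitmapInfo.byteOrder32Little.rawValue) else { return }
        // Core Graphics has a bottom-left origin; flip to match UIKit.
        context.translateBy(x: 0, y: CGFloat(CVPixelBufferGetHeight(pixelBuffer)))
        context.scaleBy(x: 1, y: -1)
        view.layer.render(in: context)
    }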
0 votes • 0 answers

How to match RPScreenRecorder screenshot times to application logs

I'm trying to align the screenshots emitted by RPScreenRecorder's startCapture method to logs saved elsewhere in my code. I was hoping that I could just match CMSampleBuffer's presentationTimeStamp to the timestamp reported by…
ryanipete • 419 • 5 • 15
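Assuming (as is typical for ReplayKit capture) that presentationTimeStamp is expressed on the host time clock, it can be mapped to a wall-clock Date by measuring its offset from "now" on that same clock. A sketch:

    import CoreMedia
    import Foundation

    func wallClockDate(of sampleBuffer: CMSampleBuffer) -> Date {
        let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer).seconds
        let hostNow = CMClockGetTime(CMClockGetHostTimeClock()).seconds
        // pts lies slightly in the past relative to hostNow, so the offset is negative.
        return Date(timeIntervalSinceNow: pts - hostNow)
    }

Dates produced this way can then be compared against log timestamps directly.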
0 votes • 1 answer

Convert YUV data to CVPixelBufferRef and play in AVSampleBufferDisplayLayer

I have a video stream in IYUV (4:2:0) format and am trying to convert it into a CVPixelBufferRef, then into a CMSampleBufferRef, and play it in an AVSampleBufferDisplayLayer (required for AVPictureInPictureController). I've tried several versions of…
Tj3n • 9,837 • 2 • 24 • 35
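Once the IYUV data has been copied into a CVPixelBuffer, wrapping it for AVSampleBufferDisplayLayer is the same regardless of pixel format. A sketch of that wrapping step:

    import CoreMedia
    import CoreVideo

    func makeSampleBuffer(from pixelBuffer: CVPixelBuffer,
                          presentationTime: CMTime) -> CMSampleBuffer? {
        var formatDescription: CMVideoFormatDescription?
        CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                     imageBuffer: pixelBuffer,
                                                     formatDescriptionOut: &formatDescription)
        guard let format = formatDescription else { return nil }
        var timing = CMSampleTimingInfo(duration: .invalid,
                                        presentationTimeStamp: presentationTime,
                                        decodeTimeStamp: .invalid)
        var sampleBuffer: CMSampleBuffer?
        CMSampleBufferCreateReadyWithImageBuffer(allocator: kCFAllocatorDefault,
                                                 imageBuffer: pixelBuffer,
                                                 formatDescription: format,
                                                 sampleTiming: &timing,
                                                 sampleBufferOut: &sampleBuffer)
        return sampleBuffer
    }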
0 votes • 1 answer

CVPixelBufferRef: Video Buffer and Depth Buffer have different orientations

Right now I'm working with the depth camera on iOS, since I want to measure the distance from the camera to certain points in the frame. I did all the necessary setup in my camera solution, and now I have two CVPixelBufferRefs in hand: one with pixel data and…
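A sketch of normalising one buffer to the other's orientation through Core Image; the .right value is an assumption and depends on the device and connection orientation:

    import CoreImage
    import CoreVideo
    import ImageIO

    // Rotate the depth buffer so it matches the video buffer's orientation.
    func orientedDepthImage(from depthBuffer: CVPixelBuffer) -> CIImage {
        CIImage(cvPixelBuffer: depthBuffer).oriented(.right)
    }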
0 votes • 1 answer

300x300 pixel PNG image to byte array

My objective is to extract a 300x300 pixel frame from a CVImageBuffer (camera stream) and convert it into a UInt8 byte array. Technically the array size should be 90,000. However, I'm getting a much larger value. Any help would be much appreciated to…
danu • 1,079 • 5 • 16 • 48
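The size mismatch has a likely arithmetic explanation: a BGRA frame carries 4 bytes per pixel, so a 300x300 crop is 300 * 300 * 4 = 360,000 bytes, not 90,000, and each row may also carry padding. A sketch that copies row by row, assuming a BGRA buffer:

    import CoreVideo

    func bgraBytes(from pixelBuffer: CVPixelBuffer) -> [UInt8] {
        CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }
        let width = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)
        let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
        let base = CVPixelBufferGetBaseAddress(pixelBuffer)!.assumingMemoryBound(to: UInt8.self)
        var bytes = [UInt8]()
        bytes.reserveCapacity(width * height * 4)
        for row in 0..<height {
            // Take only the visible width * 4 bytes of each row, skipping padding.
            bytes.append(contentsOf: UnsafeBufferPointer(start: base + row * bytesPerRow,
                                                         count: width * 4))
        }
        return bytes
    }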
0 votes • 1 answer

CVPixelBuffer created from CGImage has different attributes

EDIT: I worked around this issue, but to rephrase my question: is it possible to create the CVPixelBuffer to match the CGImage? For example, I would like to have 16 bits per component instead of 8. As the title says, when I NSLog my…
jontelang • 589 • 4 • 17
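It is possible to ask CoreVideo for a wide format up front rather than accepting the 8-bit default; kCVPixelFormatType_64RGBAHalf (16-bit float per channel) is one option. Which CGContext parameter combinations can then render a CGImage into it varies by platform, so treat this as a sketch to verify:

    import CoreVideo

    func make16BitBuffer(width: Int, height: Int) -> CVPixelBuffer? {
        var buffer: CVPixelBuffer?
        let attrs = [kCVPixelBufferCGBitmapContextCompatibilityKey: true] as CFDictionary
        let status = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                                         kCVPixelFormatType_64RGBAHalf, attrs, &buffer)
        return status == kCVReturnSuccess ? buffer : nil
    }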
0 votes • 1 answer

Save the depth data with UIImageWriteToSavedPhotosAlbum

I am using ARKit 4 (+MetalKit) with a new iPhone, and I am trying to access the depth data (from the LiDAR) and save it as a depth map along with the actual RGB image. While I have seen many demos showing it, none showed the code for…
Valeria • 1,508 • 4 • 20 • 44
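A sketch of one route: flatten the LiDAR depth map to a displayable image through Core Image and hand it to UIImageWriteToSavedPhotosAlbum. Note this saves a visualisation; preserving true depth alongside the RGB photo requires the Photos/AVDepthData APIs instead:

    import UIKit
    import CoreImage
    import CoreVideo

    // depthMap: e.g. ARFrame.sceneDepth?.depthMap (kCVPixelFormatType_DepthFloat32).
    // Whether CIImage renders a raw Float32 map usefully without normalisation
    // is an assumption; scaling values into 0...1 first may be needed.
    func saveDepthVisualisation(_ depthMap: CVPixelBuffer) {
        let ciImage = CIImage(cvPixelBuffer: depthMap)
        let context = CIContext()
        guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return }
        UIImageWriteToSavedPhotosAlbum(UIImage(cgImage: cgImage), nil, nil, nil)
    }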
0 votes • 0 answers

CVPixelBuffer to CGImage conversion only gives black-and-white image

I am trying to convert raw camera sensor data to a color image. The data is first provided in a [UInt16] array and subsequently converted to a CVPixelBuffer. The following Swift 5 code "only" creates a black-and-white image and disregards the…
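A single-channel pixel buffer can only ever render as grayscale; turning raw Bayer sensor data into colour requires a demosaicing pass (e.g. a custom Metal or vImage stage) before or instead of this wrapping step. A sketch of the wrapping described, assuming kCVPixelFormatType_OneComponent16:

    import CoreVideo

    func makeGrayBuffer(from samples: [UInt16], width: Int, height: Int) -> CVPixelBuffer? {
        var buffer: CVPixelBuffer?
        CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                            kCVPixelFormatType_OneComponent16, nil, &buffer)
        guard let pb = buffer else { return nil }
        CVPixelBufferLockBaseAddress(pb, [])
        defer { CVPixelBufferUnlockBaseAddress(pb, []) }
        let dst = CVPixelBufferGetBaseAddress(pb)!.assumingMemoryBound(to: UInt16.self)
        let dstStride = CVPixelBufferGetBytesPerRow(pb) / MemoryLayout<UInt16>.stride
        samples.withUnsafeBufferPointer { src in
            // Copy row by row; destination rows may be padded.
            for row in 0..<height {
                (dst + row * dstStride).update(from: src.baseAddress! + row * width, count: width)
            }
        }
        return pb
    }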
0 votes • 1 answer

iOS ARKit how to save ARFrame .capturedImage to file?

I'm trying to save the camera image from an ARFrame to a file. The image is given as a CVPixelBuffer. I have the following code, which produces an image with the wrong aspect ratio. I tried different approaches, including a CIFilter to scale down the image, but…
Alex Stone • 46,408 • 55 • 231 • 407
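ARFrame.capturedImage is sensor-sized, so its aspect ratio legitimately differs from the screen's; writing it out at its native size avoids distortion. A sketch using Core Image's JPEG encoder (the .right orientation is an assumption for portrait use):

    import ARKit
    import CoreImage

    func saveCapturedImage(_ frame: ARFrame, to url: URL) throws {
        let image = CIImage(cvPixelBuffer: frame.capturedImage).oriented(.right)
        let context = CIContext()
        let colorSpace = CGColorSpace(name: CGColorSpace.sRGB)!
        guard let jpeg = context.jpegRepresentation(of: image, colorSpace: colorSpace) else {
            throw CocoaError(.fileWriteUnknown)
        }
        try jpeg.write(to: url)
    }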
0 votes • 1 answer

Equivalent of MPSImageLanczosScale before macOS 10.13

I am using MPSImageLanczosScale to scale an image texture (created from a CVPixelBufferRef) using the Metal framework. The issue is that MPSImageLanczosScale is only available from macOS 10.13, but my application supports 10.11 and later. I cannot drop support for the…
prabhu • 1,158 • 1 • 12 • 27
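One fallback behind an if #available(macOS 10.13, *) check is Core Image's CILanczosScaleTransform, which has shipped since long before 10.13 and also performs Lanczos resampling. A sketch:

    import CoreImage

    // Use this branch on 10.11/10.12; keep MPSImageLanczosScale on 10.13+.
    func lanczosScaled(_ input: CIImage, by scale: CGFloat) -> CIImage? {
        let filter = CIFilter(name: "CILanczosScaleTransform")!
        filter.setValue(input, forKey: kCIInputImageKey)
        filter.setValue(scale, forKey: kCIInputScaleKey)
        filter.setValue(1.0, forKey: kCIInputAspectRatioKey)
        return filter.outputImage  // render into a CVPixelBuffer via CIContext
    }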
0 votes • 1 answer

Converting TrueDepth data to grayscale image produces distorted image

I'm getting the depth data from the TrueDepth camera and converting it to a grayscale image. (I realize I could pass the AVDepthData to a CIImage initializer; however, for testing purposes, I want to make sure my array is populated correctly,…
Senseful • 86,719 • 67 • 308 • 465
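A common cause of exactly this kind of distortion is iterating width * height values while ignoring row padding; rows must be indexed by bytesPerRow. A sketch, assuming a kCVPixelFormatType_DepthFloat32 map:

    import CoreVideo

    func depthValues(from depthMap: CVPixelBuffer) -> [Float32] {
        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let bytesPerRow = CVPixelBufferGetBytesPerRow(depthMap)
        let base = CVPixelBufferGetBaseAddress(depthMap)!
        var values = [Float32]()
        values.reserveCapacity(width * height)
        for row in 0..<height {
            // Advance by bytesPerRow, not by width * 4, to skip row padding.
            let rowPtr = (base + row * bytesPerRow).assumingMemoryBound(to: Float32.self)
            values.append(contentsOf: UnsafeBufferPointer(start: rowPtr, count: width))
        }
        return values
    }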