I'm getting slightly different pixel values when reading the same video in Python with OpenCV and in Swift on iOS. In Python, reading the first frame with vidcap.read() gives BGR [79, 82, 90] for the first pixel. In Swift, reading the video through AVURLAsset gives BGR [80, 84, 92] for the same pixel. The values are almost the same, so maybe there's some rounding going on? How can I deal with it? I really appreciate your help.
Here is some code to reproduce:
Python:
import cv2

vidcap = cv2.VideoCapture('test_video.mp4')
success, image = vidcap.read()  # image is a BGR uint8 array
if success:
    print(image[0, 0])          # first pixel: [79 82 90]
Swift:
DispatchQueue.global().async {
    while reader?.status == AVAssetReader.Status.reading {
        // Copy the buffer once and bind it; calling copyNextSampleBuffer()
        // both in the check and in the body would silently drop every other frame
        if let sampleBuffer = rout.copyNextSampleBuffer() {
            // Buffer of the frame to perform processing on
            let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
        }
        usleep(10000)
    }
}
Extracting the first pixel from the first pixelBuffer gives BGR [80, 84, 92].
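For what it's worth, a difference of one or two steps per channel is exactly what you can get when the same decoded YUV sample is converted to RGB with different luma matrices (for example BT.601 vs BT.709) and then rounded to 8 bits. A minimal sketch of that effect (the YUV values and the full-range conversion below are made up for illustration, not taken from the actual video):

import cv2  # noqa: F401  (shown only to match the question's setup; not used below)

def yuv_to_rgb(y, u, v, kr, kb):
    # Full-range YCbCr -> RGB conversion parameterized by the
    # luma coefficients kr and kb (kg follows from kr + kg + kb = 1)
    kg = 1.0 - kr - kb
    r = y + 2.0 * (1.0 - kr) * (v - 128)
    b = y + 2.0 * (1.0 - kb) * (u - 128)
    g = y - (2.0 * kb * (1.0 - kb) * (u - 128)
             + 2.0 * kr * (1.0 - kr) * (v - 128)) / kg
    # Clamp to [0, 255] and round to 8 bits, as a decoder would
    return tuple(round(max(0.0, min(255.0, c))) for c in (r, g, b))

# The same (hypothetical) decoded YUV sample through two common matrices:
bt601 = yuv_to_rgb(84, 131, 125, kr=0.299, kb=0.114)    # BT.601 coefficients
bt709 = yuv_to_rgb(84, 131, 125, kr=0.2126, kb=0.0722)  # BT.709 coefficients
print(bt601, bt709)  # → (80, 85, 89) (79, 85, 90)

R and B land one step apart purely because of the matrix choice, which is the same order of magnitude as the mismatch above; so it may be worth checking which color matrix each pipeline (OpenCV's decoder vs Core Video) applies.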