I am creating an app that takes video frames from a DJI aircraft and runs them through a TensorFlow Lite object detection model.
I managed to get my app to receive frames from the aircraft.
However, the frame type is VPFrameTypeYUV420Planer. I want to receive frames of type VPFrameTypeYUV420SemiPlaner
instead, because I want to create a kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
CVPixelBuffer from the frame.
I tried to change the videoPreviewer property as follows:
DJIVideoPreviewer.instance()?.frameOutputType = VPFrameTypeYUV420SemiPlaner
However, I got an error.
I also tried to create a kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
CVPixelBuffer directly from the YUV420Planer frame, but I don't know how to convert the separate chromaR and chromaB planes into a single interleaved UV plane.
func createPixelBuffer(fromFrame frame: VideoFrameYUV) -> CVPixelBuffer? {
    var initialPixelBuffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault, Int(frame.width), Int(frame.height), kCVPixelFormatType_420YpCbCr8BiPlanarFullRange, nil, &initialPixelBuffer)
    guard status == kCVReturnSuccess,
          let pixelBuffer = initialPixelBuffer,
          CVPixelBufferLockBaseAddress(pixelBuffer, []) == kCVReturnSuccess
    else {
        return nil
    }
    let yPlaneWidth = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0)
    let yPlaneHeight = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0)
    let uvPlaneWidth = CVPixelBufferGetWidthOfPlane(pixelBuffer, 1)
    let uvPlaneHeight = CVPixelBufferGetHeightOfPlane(pixelBuffer, 1)
    // Copy the luma (Y) plane as-is.
    let yDestination = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0)
    memcpy(yDestination, frame.luma, yPlaneWidth * yPlaneHeight)
    let uvDestination = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1)
    // let chrB = frame.chromaB.pointee
    // let chrR = frame.chromaR.pointee
    // I don't know how to fill the interleaved UV plane here.
    CVPixelBufferUnlockBaseAddress(pixelBuffer, [])
    return pixelBuffer
}
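For reference, my understanding is that the biplanar (NV12) format's second plane is just the U and V samples interleaved byte by byte (Cb first, then Cr). So one sketch of filling the UV plane, assuming frame.chromaB and frame.chromaR point to tightly packed, half-resolution planar U and V buffers (an assumption about DJI's VideoFrameYUV layout), would be:

```swift
// Sketch: interleave planar U (chromaB) and V (chromaR) into the NV12 UV plane.
// Assumes chromaB/chromaR are tightly packed (width/2) x (height/2) planes.
let uvBytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1)
let uvDest = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1)!
    .assumingMemoryBound(to: UInt8.self)
let chromaWidth = Int(frame.width) / 2
let chromaHeight = Int(frame.height) / 2
for row in 0..<chromaHeight {
    let dstRow = uvDest + row * uvBytesPerRow
    for col in 0..<chromaWidth {
        // NV12 byte order within the UV plane: Cb (U) first, then Cr (V).
        dstRow[2 * col]     = frame.chromaB[row * chromaWidth + col]
        dstRow[2 * col + 1] = frame.chromaR[row * chromaWidth + col]
    }
}
```

I am not sure whether this is correct, or whether there is a way to get the SDK to deliver semi-planar frames directly so the copy is not needed.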
Is there a good way to solve this problem?