
I am trying to convert raw camera sensor data to a color image. The data are first provided as a [UInt16] array and subsequently wrapped in a CVPixelBuffer.

The following Swift 5 code "only" creates a black-and-white image and disregards the color filter array of the RGGB pixel data.

I also tried VTCreateCGImageFromCVPixelBuffer, to no avail: it returns nil.

   // imgRawData is a [UInt16] array

   var pixelBuffer: CVPixelBuffer?
   // kCVPixelBufferPixelFormatTypeKey expects the OSType constant, not the "rgg4" FourCC string
   let attrs: [CFString: Any] = [kCVPixelBufferPixelFormatTypeKey: kCVPixelFormatType_14Bayer_RGGB]

   CVPixelBufferCreateWithBytes(kCFAllocatorDefault, width, height, kCVPixelFormatType_14Bayer_RGGB, &imgRawData, 2 * width, nil, nil, attrs as CFDictionary, &pixelBuffer)
   
   // This creates a black-and-white image, not color

   let ciimg = CIImage(cvPixelBuffer: pixelBuffer! )

   let context = CIContext(options: [.workingFormat: CIFormat.RGBA16])
   guard let cgi = context.createCGImage(ciimg, from: ciimg.extent, format: CIFormat.RGBA16,
                                         colorSpace: CGColorSpace(name: CGColorSpace.sRGB), deferred: false)
   else { return dummyImg! }


   // This function returns nil
   var cgI: CGImage?
   VTCreateCGImageFromCVPixelBuffer(pixelBuffer!, options: nil, imageOut: &cgI)
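
A side note I noticed while debugging (separate from the color problem): the pointer passed via `&imgRawData` is only guaranteed valid for the duration of the `CVPixelBufferCreateWithBytes` call, so a variant that lets CoreVideo own the backing memory might be safer. A sketch, untested; `width`, `height`, and `imgRawData` are as above:

```swift
import Foundation
import CoreVideo

// Sketch: create an empty buffer that CoreVideo owns, then copy the
// sensor rows into it (the buffer's rows may be padded, hence the loop).
var pixelBuffer: CVPixelBuffer?
let status = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                                 kCVPixelFormatType_14Bayer_RGGB, nil, &pixelBuffer)
guard status == kCVReturnSuccess, let buffer = pixelBuffer else { fatalError("buffer creation failed") }

CVPixelBufferLockBaseAddress(buffer, [])
let dst = CVPixelBufferGetBaseAddress(buffer)!
let dstBytesPerRow = CVPixelBufferGetBytesPerRow(buffer)   // may be larger than 2 * width
let srcBytesPerRow = width * MemoryLayout<UInt16>.stride
imgRawData.withUnsafeBytes { src in
    for row in 0..<height {
        memcpy(dst + row * dstBytesPerRow,
               src.baseAddress! + row * srcBytesPerRow,
               srcBytesPerRow)
    }
}
CVPixelBufferUnlockBaseAddress(buffer, [])
```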

Any hint is highly appreciated.

As for the demosaicing, I want Core Image or Core Graphics to take care of the RGGB pixel-color interpolation.
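
To make explicit what I mean by that interpolation, here is a minimal CPU sketch of a nearest-neighbor RGGB demosaic in pure Swift. It is for illustration only (`demosaicRGGB` is a hypothetical helper name, and Core Image's own demosaicing is far more sophisticated), but it shows how the 2x2 color filter array maps to full RGB pixels:

```swift
// Nearest-neighbor demosaic of an RGGB Bayer mosaic (pure Swift sketch).
// Each 2x2 cell of the sensor is laid out as:  R G
//                                              G B
func demosaicRGGB(_ mosaic: [UInt16], width: Int, height: Int) -> [(r: UInt16, g: UInt16, b: UInt16)] {
    precondition(mosaic.count == width * height, "mosaic size must match dimensions")
    precondition(width % 2 == 0 && height % 2 == 0, "RGGB cells require even dimensions")
    var rgb: [(r: UInt16, g: UInt16, b: UInt16)] = []
    rgb.reserveCapacity(width * height)
    for y in 0..<height {
        for x in 0..<width {
            // Snap to the top-left corner of the 2x2 RGGB cell containing (x, y).
            let cx = x & ~1
            let cy = y & ~1
            let r  = mosaic[cy * width + cx]             // R sample (even col, even row)
            let g1 = mosaic[cy * width + cx + 1]         // G sample (odd col, even row)
            let g2 = mosaic[(cy + 1) * width + cx]       // G sample (even col, odd row)
            let b  = mosaic[(cy + 1) * width + cx + 1]   // B sample (odd col, odd row)
            // Average the two green samples of the cell.
            rgb.append((r: r, g: UInt16((UInt32(g1) + UInt32(g2)) / 2), b: b))
        }
    }
    return rgb
}
```

For a 2x2 mosaic [100, 200, 300, 400], every output pixel gets r = 100, g = 250, b = 400.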

  • Have you tried using `CIFilter` for RAW processing? See here: https://developer.apple.com/documentation/coreimage/cifilter/1437879-init – Frank Rupprecht Mar 01 '21 at 11:16
  • Thank you for the hint... this had been actually my very first approach, but either I get an image with extent nan with the original data or I do not know what properties to set for the pixelbuffer: `let cirfo = [ CIRAWFilterOption(rawValue: String(kCGImageSourceTypeIdentifierHint)) : "com.sony.raw-image" ]` `let ciRawFilter : CIFilter = CIFilter(imageData: myData , options: cirfo)` - `let ciPBRawFilter : CIFilter = CIFilter(cvPixelBuffer: pixelBuffer, properties: ??? , options: cirfo)` – Planetoid30 Mar 01 '21 at 20:57
  • ... and then of course this command: `let ciiraw = ciRawFilter.outputImage`. It returns (lldb) po ciiraw ▿ Optional - some : tagcolorspace AdobeRGB1998 extent=[null][nan nan nan nan] affine hiQ [-inf 0 0 -inf 0 0] extent=[null][nan nan nan nan] fill clear extent=[null][0 0 1 1] – Planetoid30 Mar 01 '21 at 21:07
  • @FrankSchlegel, how can I let CIFilter "know" that these are actual raw pixel data? Do you have any sample code, besides what is "available" on the Apple Developer site? I would appreciate your help. Thank you. – Planetoid30 Mar 02 '21 at 09:25
  • Hi Frank, this is what I realized and how I understand I need to solve this: the CFA layout has to be provided as tagged metadata in the raw data for CIRAWFilter/CIFilter. Raw image data are tagged! This is what I had been missing before. TIFF is a tagged format and is also the container used for raw images. I would need to convert the image-sensor properties to tags and combine the tagged metadata with the raw pixel data in a single data variable, before reading and converting them with CIRAWFilter to a CIImage. – Planetoid30 Jul 09 '22 at 17:05

0 Answers