
I'm developing a custom Flutter plugin in which I send the camera image from Flutter to Swift and create a UIImage there, using the Flutter camera plugin (https://pub.dev/packages/camera). I send the camera image bytes with this method:

startImageStream((CameraImage img) {
  sendFrameBytes(bytesList: img.planes.map((plane) {
    return plane.bytes;
  }).toList());
});

planes contains a single plane holding the RGBA bytes of the image. In the Swift code, I receive the RGBA bytes as an NSArray and create a UIImage like this:

import Flutter
import UIKit

func detectFromFrame1(args: NSDictionary, result: FlutterResult) {
    // The Dart side sends a list of planes; on iOS there is a single RGBA plane.
    let rgbaPlane = args["bytesList"] as! NSArray
    let rgbaTypedData = rgbaPlane[0] as! FlutterStandardTypedData
    let rgbaUint8 = [UInt8](rgbaTypedData.data)
    let data = NSData(bytes: rgbaUint8, length: rgbaUint8.count)
    let uiimage = UIImage(data: data as Data)
    print(uiimage) // always prints nil
}
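For context, this handler is presumably reached through a standard FlutterMethodChannel registration; a minimal sketch of that wiring (the channel, class, and method names here are hypothetical, not from the plugin):

import Flutter
import UIKit

public class SwiftMyPlugin: NSObject, FlutterPlugin {
    public static func register(with registrar: FlutterPluginRegistrar) {
        let channel = FlutterMethodChannel(name: "my_plugin",
                                           binaryMessenger: registrar.messenger())
        registrar.addMethodCallDelegate(SwiftMyPlugin(), channel: channel)
    }

    public func handle(_ call: FlutterMethodCall, result: @escaping FlutterResult) {
        guard call.method == "detectFromFrame1",
              let args = call.arguments as? NSDictionary else {
            result(FlutterMethodNotImplemented)
            return
        }
        // Route the raw frame bytes to the handler above.
        detectFromFrame1(args: args, result: result)
    }
}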

The problem: rgbaTypedData, rgbaUint8, and data are all non-empty, yet the created uiimage is always nil. I don't understand where the problem is.
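For reference, UIImage(data:) only decodes encoded image containers such as JPEG or PNG; handing it a raw RGBA/BGRA pixel buffer yields nil, which matches the behavior above. A minimal sketch of building a UIImage straight from raw bytes via CGContext instead, assuming a BGRA byte layout and that width and height are sent alongside the bytes (both are assumptions, not part of the original code):

import UIKit

// Sketch only: wraps raw BGRA pixels in a CGContext and snapshots a UIImage.
func imageFromRawBGRA(_ bytes: Data, width: Int, height: Int) -> UIImage? {
    let bytesPerRow = width * 4
    guard bytes.count >= bytesPerRow * height else { return nil }
    var pixels = [UInt8](bytes)
    // BGRA little-endian with premultiplied alpha, the layout iOS cameras typically emit.
    let bitmapInfo = CGBitmapInfo.byteOrder32Little.rawValue
        | CGImageAlphaInfo.premultipliedFirst.rawValue
    return pixels.withUnsafeMutableBytes { (buffer) -> UIImage? in
        guard let context = CGContext(data: buffer.baseAddress,
                                      width: width,
                                      height: height,
                                      bitsPerComponent: 8,
                                      bytesPerRow: bytesPerRow,
                                      space: CGColorSpaceCreateDeviceRGB(),
                                      bitmapInfo: bitmapInfo),
              let cgImage = context.makeImage() else { return nil }
        return UIImage(cgImage: cgImage)
    }
}

This avoids an encode/decode round trip, at the cost of having to know the exact pixel layout the camera delivers.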

1 Answer


I have the same issue. A workaround I use is to convert the image to JPEG in Flutter and hand those bytes to the iOS/native code. The downside is that it's slow and not usable for real-time processing.

Update: code sample (Flutter & TFLite). Packages: https://pub.dev/packages/image and https://pub.dev/packages/tflite

CODE:

import 'package:image/image.dart' as imglib;

_cameraController.startImageStream((CameraImage availableImage) {
  // Wrap the raw plane bytes in an image and encode it as JPEG.
  imglib.Image img = imglib.Image.fromBytes(
      availableImage.planes[0].width,
      availableImage.planes[0].height,
      availableImage.planes[0].bytes);
  Uint8List imgByte = imglib.encodeJpg(img);
  // Pass the encoded JPEG bytes (not the raw plane) to the native side.
  Tfliteswift.detectObjectOnBinary(binary: imgByte);
});
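On the native side, the JPEG bytes can then be decoded directly, since UIImage(data:) understands encoded formats. A minimal Swift counterpart sketch (the function name and the "binary" argument key are assumptions mirroring the call above, not the tflite package's API):

import Flutter
import UIKit

func detectFromJpeg(args: NSDictionary, result: FlutterResult) {
    guard let typedData = args["binary"] as? FlutterStandardTypedData,
          let image = UIImage(data: typedData.data) else {
        result(FlutterError(code: "decode_failed",
                            message: "Could not decode JPEG bytes",
                            details: nil))
        return
    }
    print(image.size) // a valid UIImage this time
    result(nil)
}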