
I am trying to do sign recognition using a TFLite model. I have imported the TFLite model and passed the bitmap, but when I run the app, nothing happens. To find where the problem is, I added 5 toasts, but only the 1st toast shows. I can't figure out what is wrong, and I also don't know how to handle the output and at least show it in a toast. The code is as follows:

private void processImage() {
    imageConverter.run();
    rgbFrameBitmap = Bitmap.createBitmap(previewWidth, previewHeight, Bitmap.Config.ARGB_8888);
    rgbFrameBitmap.setPixels(rgbBytes, 0, previewWidth, 0, 0, previewWidth, previewHeight);
    //Do your work here


    try {
        Model model = Model.newInstance(getApplicationContext());

        Toast.makeText(this, "Toast-1", Toast.LENGTH_SHORT).show();
        // Creates inputs for reference.
        TensorBuffer inputFeature0 = TensorBuffer.createFixedSize(new int[]{1, 1, 1, 3}, DataType.UINT8);

        TensorImage tensorImage = new TensorImage(DataType.UINT8);
        tensorImage.load(rgbFrameBitmap);
        ByteBuffer byteBuffer = tensorImage.getBuffer();
        inputFeature0.loadBuffer(byteBuffer);

        Toast.makeText(this, "Toast-2", Toast.LENGTH_SHORT).show();
        // Runs model inference and gets result.
        Model.Outputs outputs = model.process(inputFeature0);
        TensorBuffer outputFeature0 = outputs.getOutputFeature0AsTensorBuffer();
        TensorBuffer outputFeature1 = outputs.getOutputFeature1AsTensorBuffer();
        TensorBuffer outputFeature2 = outputs.getOutputFeature2AsTensorBuffer();
        TensorBuffer outputFeature3 = outputs.getOutputFeature3AsTensorBuffer();
        TensorBuffer outputFeature4 = outputs.getOutputFeature4AsTensorBuffer();
        TensorBuffer outputFeature5 = outputs.getOutputFeature5AsTensorBuffer();
        TensorBuffer outputFeature6 = outputs.getOutputFeature6AsTensorBuffer();
        TensorBuffer outputFeature7 = outputs.getOutputFeature7AsTensorBuffer();

        
        Toast.makeText(this, outputFeature0.toString(), Toast.LENGTH_LONG).show();
        
        Toast.makeText(this, "Toast-3 Model Run Successfully", Toast.LENGTH_SHORT).show();
        // Releases model resources if no longer used.
        model.close();
    } catch (IOException e) {
        // TODO Handle the exception
        Toast.makeText(this, "Toast-4 Scan Failed,Again Trying..", Toast.LENGTH_SHORT).show();
    }

    Toast.makeText(this, "Toast-5 Again startinng", Toast.LENGTH_SHORT).show();
    postInferenceCallback.run();
}
PRAJWAL
  • Please trim your code to make it easier to find your problem. Follow these guidelines to create a [minimal reproducible example](https://stackoverflow.com/help/minimal-reproducible-example). – Community Jun 08 '22 at 14:08

1 Answer


Please check the answer in this thread: How to pass image to tflite model in android. He used the ImageProcessor object from the TensorFlow Lite Support library in his code; I think it will be helpful to you. Please reply if it doesn't work or if you have any questions. Good luck.
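
As a rough sketch of what that looks like (the 300x300 input size, the prepareInput helper name, and the UINT8 data type are my assumptions here; check your model's actual input shape and type in the .tflite metadata or the model binding screen in Android Studio):

import android.graphics.Bitmap;
import org.tensorflow.lite.DataType;
import org.tensorflow.lite.support.image.ImageProcessor;
import org.tensorflow.lite.support.image.TensorImage;
import org.tensorflow.lite.support.image.ops.ResizeOp;
import org.tensorflow.lite.support.tensorbuffer.TensorBuffer;

// Resize the camera frame to the size the model expects and copy it into the input buffer.
private TensorBuffer prepareInput(Bitmap bitmap) {
    ImageProcessor imageProcessor = new ImageProcessor.Builder()
            .add(new ResizeOp(300, 300, ResizeOp.ResizeMethod.BILINEAR)) // assumed 300x300 input
            .build();

    TensorImage tensorImage = new TensorImage(DataType.UINT8);
    tensorImage.load(bitmap);
    tensorImage = imageProcessor.process(tensorImage);

    // The shape must match the model exactly, e.g. {1, 300, 300, 3} for a 300x300 RGB model.
    TensorBuffer inputFeature0 =
            TensorBuffer.createFixedSize(new int[]{1, 300, 300, 3}, DataType.UINT8);
    inputFeature0.loadBuffer(tensorImage.getBuffer());
    return inputFeature0;
}

You would then call model.process(prepareInput(rgbFrameBitmap)) instead of loading the full-size bitmap into a {1, 1, 1, 3} buffer, which very likely does not match your model's input shape.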

  • I may use ImageProcessor, but I am stuck on the part where I need to take the outputs, so how can I get the output as a string (code for it)? Also, I am confused about why there are 8 TensorBuffer elements for the output. Please help if possible. – PRAJWAL Jun 08 '22 at 14:51
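
For showing the output in a toast, a minimal sketch (assuming the outputs are plain numeric tensors; what each of the 8 outputs means depends entirely on your model, e.g. detection models typically export boxes, class indices, scores and counts, so check the model's metadata):

// Convert one output tensor into a printable string.
// getFloatArray() works for both FLOAT32 and UINT8 TensorBuffers.
float[] values = outputFeature0.getFloatArray();
String readable = java.util.Arrays.toString(values);

// If processImage() runs on a background thread, post the toast to the UI thread.
runOnUiThread(() -> Toast.makeText(this, readable, Toast.LENGTH_LONG).show());

runOnUiThread is only available if this code lives in an Activity; otherwise post the Runnable with a Handler bound to Looper.getMainLooper().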