I'm making an object detection app for Android, and I got good results training with the ssd_mobilenet_v1_fpn model.
I exported the frozen inference graph, converted it to TFLite, and quantized it to improve performance. But when I try it in the TensorFlow Lite Object Detection Android Demo, the app crashes.
The app works perfectly with the default model (ssd_mobilenet_v1), but unfortunately that model isn't good at detecting and classifying small objects.
Here is my quantized ssd_mobilenet_v1_fpn model:
Google Drive: https://drive.google.com/file/d/1rfc64nUJzHQjxigD6hZ6FqxyGhLRbyB1/view?usp=sharing
Here is the unquantized model:
Google Drive: https://drive.google.com/file/d/11c_PdgobP0jvzTnssOkmcjp19DZoBAAQ/view?usp=sharing
For quantization, I used this command:
bazel run -c opt tensorflow/lite/toco:toco -- \
  --input_file=tflite_graph.pb \
  --output_file=detect_quant.tflite \
  --input_shapes=1,640,480,3 \
  --input_arrays=normalized_input_image_tensor \
  --output_arrays=TFLite_Detection_PostProcess,TFLite_Detection_PostProcess:1,TFLite_Detection_PostProcess:2,TFLite_Detection_PostProcess:3 \
  --inference_type=QUANTIZED_UINT8 \
  --mean_values=128 \
  --std_values=128 \
  --change_concat_input_ranges=false \
  --allow_custom_ops \
  --default_ranges_min=0 \
  --default_ranges_max=6
I also tried the TFLite converter Python API, but it doesn't work for this model.
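For reference, this is a sketch of how that conversion might look with the TF 1.x Python API, mirroring the toco flags above. The file paths, tensor names, and the square 640x640 input shape are assumptions (the model zoo's ssd_mobilenet_v1_fpn config uses a 640x640 fixed-shape resizer; a training config with a different resizer would need a different shape):

```python
import os

# Assumed filenames, matching the toco command above.
GRAPH_DEF_FILE = "tflite_graph.pb"     # produced by export_tflite_ssd_graph.py
OUTPUT_FILE = "detect_quant.tflite"
INPUT_ARRAYS = ["normalized_input_image_tensor"]
OUTPUT_ARRAYS = [
    "TFLite_Detection_PostProcess",
    "TFLite_Detection_PostProcess:1",
    "TFLite_Detection_PostProcess:2",
    "TFLite_Detection_PostProcess:3",
]
# Assumption: 640x640, as in the zoo's ssd_mobilenet_v1_fpn config
# (note the toco command above uses 640x480 instead).
INPUT_SHAPES = {"normalized_input_image_tensor": [1, 640, 640, 3]}


def convert():
    """Quantized conversion with the TF 1.x converter (tf.compat.v1 in TF 2.x)."""
    import tensorflow as tf

    converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
        GRAPH_DEF_FILE, INPUT_ARRAYS, OUTPUT_ARRAYS, input_shapes=INPUT_SHAPES)
    converter.inference_type = tf.uint8
    # (mean, std_dev) per input, same as --mean_values/--std_values.
    converter.quantized_input_stats = {"normalized_input_image_tensor": (128, 128)}
    # Fallback range for tensors without recorded min/max, same as
    # --default_ranges_min/--default_ranges_max.
    converter.default_ranges_stats = (0, 6)
    # TFLite_Detection_PostProcess is a custom op.
    converter.allow_custom_ops = True
    with open(OUTPUT_FILE, "wb") as f:
        f.write(converter.convert())


if __name__ == "__main__" and os.path.exists(GRAPH_DEF_FILE):
    convert()
```

If the Python API fails, the error message it prints would help narrow down whether the problem is in the graph itself or only in the quantization settings.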
Here are the Android logcat errors:
2020-09-16 18:54:06.363 29747-29747/org.tensorflow.lite.examples.detection E/Minikin: Could not get cmap table size!
2020-09-16 18:54:06.364 29747-29767/org.tensorflow.lite.examples.detection E/MemoryLeakMonitorManager: MemoryLeakMonitor.jar is not exist!
2020-09-16 18:54:06.871 29747-29747/org.tensorflow.lite.examples.detection E/BufferQueueProducer: [] Can not get hwsched service
2020-09-16 18:54:21.033 29747-29786/org.tensorflow.lite.examples.detection A/libc: Fatal signal 6 (SIGABRT), code -6 in tid 29786 (inference)
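Since the crash is a SIGABRT on the inference thread, one thing I want to rule out is a mismatch between the .tflite file's actual tensor shapes/types and what the demo app is configured for (e.g. `TF_OD_API_INPUT_SIZE` and `TF_OD_API_IS_QUANTIZED` in its `DetectorActivity`). A minimal sketch for inspecting the converted model from Python, assuming the file is named detect_quant.tflite:

```python
import os

MODEL_PATH = "detect_quant.tflite"  # assumed filename


def describe(details):
    """Format interpreter tensor details as 'name shape dtype' lines."""
    lines = []
    for d in details:
        dtype = d["dtype"]
        dtype_name = dtype.__name__ if hasattr(dtype, "__name__") else str(dtype)
        lines.append("{} {} {}".format(d["name"], list(d["shape"]), dtype_name))
    return lines


if __name__ == "__main__" and os.path.exists(MODEL_PATH):
    import tensorflow as tf

    # Load the converted model and print its input/output tensor signatures.
    interpreter = tf.lite.Interpreter(model_path=MODEL_PATH)
    interpreter.allocate_tensors()
    for line in describe(interpreter.get_input_details()):
        print("input: ", line)
    for line in describe(interpreter.get_output_details()):
        print("output:", line)
```

If the reported input shape or dtype differs from what the Android demo feeds the model, that alone can abort inference.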
Has anyone managed to use an FPN model on Android, or any model other than ssd_mobilenet_v1?