
I'd like to swap out the multibox_model.pb used by TensorFlowMultiBoxDetector.java in Google's Tensorflow Detect Sample App with the MobileNet frozen_inference_graph.pb included in the Object Detection API's model zoo.

I've run the optimize_for_inference script on it, but TensorFlowInferenceInterface can't parse the optimized model. It can, however, parse the original frozen_inference_graph.pb. I still think I need to modify this graph somehow to take a square input image, such as the 224x224 input that multibox_model.pb takes.
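For reference, the optimized graph came from an invocation along these lines (a minimal sketch of the library equivalent of the optimize_for_inference script; the node names and the uint8 dtype are what I understand the Object Detection API's export script to produce, so treat them as assumptions):

    import tensorflow as tf
    from tensorflow.python.tools import optimize_for_inference_lib

    # Load the frozen detection graph (TF 1.x-era API).
    graph_def = tf.GraphDef()
    with tf.gfile.GFile('frozen_inference_graph.pb', 'rb') as f:
        graph_def.ParseFromString(f.read())

    # Node names below are the ones export_inference_graph.py is documented
    # to produce; adjust if your export differs.
    optimized = optimize_for_inference_lib.optimize_for_inference(
        graph_def,
        ['image_tensor'],                          # input node
        ['detection_boxes', 'detection_scores',
         'detection_classes', 'num_detections'],   # output nodes
        tf.uint8.as_datatype_enum)                 # image_tensor is uint8, not float32

    with tf.gfile.GFile('optimized_inference_graph.pb', 'wb') as f:
        f.write(optimized.SerializeToString())

Note that image_tensor is uint8; if the script is run with the default float32 placeholder type, that mismatch alone could explain the unparseable output.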

– matteo411

1 Answer

I'm one of the developers. Just FYI, we'll be releasing an update to the Android detection demo in the next few weeks to make it compatible with the TensorFlow Object Detection API, so please stay tuned.

– Jonathan Huang
  • Thanks for the update! I was thinking the frozen graph needed to be modified to work on Android, but instead it's the app? – matteo411 Jul 02 '17 at 06:22
  • Yes, the app itself currently is incompatible (e.g. in the shapes and names of the output tensors it expects to get) and needs to be updated just a bit. – Jonathan Huang Jul 03 '17 at 06:57
  • We did! Announcement here: https://github.com/tensorflow/models/tree/master/object_detection#august-11-2017 – Jonathan Huang Aug 13 '17 at 17:02
  • When I try to run my own frozen model I still get "No OpKernel was registered to support Op 'Round'". Could you explain how to optimize the model for inference to make it run on Android? (One way to see which ops a graph uses is shown below.) – Sistr Sep 06 '17 at 10:01
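Regarding the 'Round' error in the last comment: the TensorFlow Android library registers only a subset of op kernels, so loading a graph that uses an unregistered op fails with exactly that message. A minimal TF 1.x sketch for listing which ops a frozen graph actually contains, so they can be checked against what the Android build supports:

    import tensorflow as tf

    # Load the frozen graph and list the distinct ops it uses (TF 1.x-era API).
    graph_def = tf.GraphDef()
    with tf.gfile.GFile('frozen_inference_graph.pb', 'rb') as f:
        graph_def.ParseFromString(f.read())

    # Any op printed here that isn't registered in the Android build
    # (e.g. Round) will fail at graph-load time on the device.
    for op in sorted({node.op for node in graph_def.node}):
        print(op)

If an unsupported op shows up, options include re-exporting the model in a way that avoids the op, or compiling a custom TensorFlow Android library with the missing kernel registered.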