
I want to ship multiple custom models with the app. All of the models serve the same purpose. I don't want to host them for now.

Custom model code (GitHub):

FirebaseLocalModelSource localModelSource =
        new FirebaseLocalModelSource.Builder(LOCAL_MODEL_NAME)
                .setAssetFilePath(LOCAL_MODEL_PATH).build();
// add multiple

FirebaseModelManager manager = FirebaseModelManager.getInstance();
manager.registerLocalModelSource(localModelSource);

// access multiple
  1. How can I access multiple models?
  2. Even if I host them, how can I access different custom models for the same purpose?

If the above cannot be achieved using ML Kit, is there any other approach to combine the results of all the models?

Anurag Singh

2 Answers


I think we do not support running multiple models in one inference yet. If you want to run different models in different scenarios, you could assign them different names and use different model sources to trigger them.
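As a rough sketch of that approach (the names model_a / model_b and the .tflite asset paths below are placeholders, not anything from your project), you would register each bundled model under its own name and later pick one by that name:

import com.google.firebase.ml.custom.FirebaseModelManager;
import com.google.firebase.ml.custom.model.FirebaseLocalModelSource;

// Register each bundled model under its own name; the name is what
// you use later in FirebaseModelOptions to choose which model to run.
FirebaseLocalModelSource sourceA =
        new FirebaseLocalModelSource.Builder("model_a")          // placeholder name
                .setAssetFilePath("model_a.tflite")               // placeholder asset path
                .build();
FirebaseLocalModelSource sourceB =
        new FirebaseLocalModelSource.Builder("model_b")
                .setAssetFilePath("model_b.tflite")
                .build();

FirebaseModelManager manager = FirebaseModelManager.getInstance();
manager.registerLocalModelSource(sourceA);
manager.registerLocalModelSource(sourceB);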

If you could describe your use case in more detail, we can see how to support it in the future.

Shiyu

The LOCAL_MODEL_NAME variable in the snippet above is the name you register for the model file bundled with your app. You would just change that value (and the asset file path) to point to whichever bundled model you want to use.
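For example, selecting one of two registered models by name might look like this (model_a is a placeholder name; the classes are from the same firebase-ml-model-interpreter library as the snippet above):

import com.google.firebase.ml.common.FirebaseMLException;
import com.google.firebase.ml.custom.FirebaseModelInterpreter;
import com.google.firebase.ml.custom.FirebaseModelOptions;

// Choose a model by the name it was registered under.
FirebaseModelOptions options =
        new FirebaseModelOptions.Builder()
                .setLocalModelName("model_a")   // change to "model_b" to use the other model
                .build();
try {
    FirebaseModelInterpreter interpreter = FirebaseModelInterpreter.getInstance(options);
    // interpreter.run(inputs, ioOptions) as usual
} catch (FirebaseMLException e) {
    // could not create an interpreter for that model name
}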

It works similarly when the models are hosted. Each model will have a different name, so you'd just pass the name of the one you want to use.
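A hosted model would look roughly like this (cloud_model_a and the Wi-Fi download condition are just example choices, not anything from your setup):

import com.google.firebase.ml.custom.FirebaseModelManager;
import com.google.firebase.ml.custom.FirebaseModelOptions;
import com.google.firebase.ml.custom.model.FirebaseCloudModelSource;
import com.google.firebase.ml.custom.model.FirebaseModelDownloadConditions;

// Register the hosted model under its own name.
FirebaseModelDownloadConditions conditions =
        new FirebaseModelDownloadConditions.Builder().requireWifi().build();
FirebaseCloudModelSource cloudSource =
        new FirebaseCloudModelSource.Builder("cloud_model_a")    // placeholder hosted-model name
                .enableModelUpdates(true)
                .setInitialDownloadConditions(conditions)
                .setUpdatesDownloadConditions(conditions)
                .build();
FirebaseModelManager.getInstance().registerCloudModelSource(cloudSource);

// Select it by name, optionally keeping a bundled model as a fallback.
FirebaseModelOptions options =
        new FirebaseModelOptions.Builder()
                .setCloudModelName("cloud_model_a")
                .setLocalModelName("model_a")                    // optional local fallback
                .build();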

Sachin K
  • Thank you for the suggestion. This is not the answer I was expecting; I am well aware of the approach you suggested. However, I am interested in knowing how to use multiple models at once. – Anurag Singh Aug 11 '18 at 05:16
  • Ah sorry. I understand now. I don't think that's supported, but let me confirm with the team. I'm assuming dynamically switching models wouldn't work for you (i.e. registering one, running an inference, then registering the other and running an inference)? – Sachin K Aug 11 '18 at 14:00
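A sketch of the dynamic-switching idea from the comment above: run each registered model in turn on the same input and combine the outputs (here by averaging). It assumes the models were registered as local models with the same input/output shapes and return a float[1][N] output; runAndCombine is just an illustrative helper, not an ML Kit API, and Tasks.await blocks, so call it off the main thread.

import com.google.android.gms.tasks.Tasks;
import com.google.firebase.ml.custom.FirebaseModelInputOutputOptions;
import com.google.firebase.ml.custom.FirebaseModelInputs;
import com.google.firebase.ml.custom.FirebaseModelInterpreter;
import com.google.firebase.ml.custom.FirebaseModelOptions;
import com.google.firebase.ml.custom.FirebaseModelOutputs;

// Runs every named (already registered) local model on the same input
// and averages their float outputs into a single result.
float[] runAndCombine(FirebaseModelInputs inputs,
                      FirebaseModelInputOutputOptions ioOptions,
                      String... modelNames) throws Exception {
    float[] combined = null;
    for (String name : modelNames) {
        FirebaseModelOptions options =
                new FirebaseModelOptions.Builder().setLocalModelName(name).build();
        FirebaseModelInterpreter interpreter = FirebaseModelInterpreter.getInstance(options);
        FirebaseModelOutputs result = Tasks.await(interpreter.run(inputs, ioOptions));
        float[][] output = result.getOutput(0);   // assumes a [1][N] float output
        if (combined == null) {
            combined = new float[output[0].length];
        }
        for (int i = 0; i < combined.length; i++) {
            combined[i] += output[0][i] / modelNames.length;
        }
    }
    return combined;
}

Usage would be something like runAndCombine(inputs, ioOptions, "model_a", "model_b"), again with placeholder names.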