
For the past four months I have been working on a project using Mobile Vision for Android [MVA], which only requires Play Services and this Codelab tutorial. However, at the beginning of this month Google released a new version, the Machine Learning Kit [MLK]:

> with new capabilities.

and they:

> strongly encourage us to try it out

My problem is that the new MLK is based on Firebase. That is to say, we have to use a Google developer account, go through this Setup, and accept a lot of annoying things that strongly tie our project to Google (in my mind; tell me if I'm wrong).

My first question [answered by @Ian Barber] is: is there a way to use MLK without all this Firebase setup? Or to use it the way I use MVA, by just adding a dependency and that's all?
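To make the comparison concrete, here is roughly what the two setups look like in Gradle (artifact versions are illustrative, not exact):

```gradle
// app/build.gradle

dependencies {
    // MVA: one Play Services artifact, no Firebase project required
    implementation 'com.google.android.gms:play-services-vision:15.0.0' // version illustrative

    // MLK: the Firebase ML artifact, which expects a Firebase-configured app
    implementation 'com.google.firebase:firebase-ml-vision:16.0.0'      // version illustrative
}

// MLK additionally expects the google-services plugin (and its
// google-services.json config file) to be applied:
apply plugin: 'com.google.gms.google-services'
```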

EDIT: My application was built from the [MVA] Codelab, which means I was able to detect text in a video stream (from the camera). All the optimisation of frame capture, processing, etc. was taken care of by several well-constructed threads. But now there is no example of video processing with [MLK]. Implementing a camera source and preview looks almost impossible with MLK alone, without the MVA capabilities.

My second question (regarding the migration) is: how do we use CameraSource and CameraSourcePreview, as we did in MVA, to run text detection on a camera source?
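From what I understand, the piece MVA's CameraSource handled for me is converting each preview frame into a detector input. A minimal sketch of that bridge with firebase-ml-vision (names follow the later releases of the artifact; the initial release used getVisionTextDetector()/detectInImage() instead, and the camera callback plumbing, rotation handling, and frame-dropping logic that MVA provided are assumed to exist around this):

```java
import com.google.firebase.ml.vision.FirebaseVision;
import com.google.firebase.ml.vision.common.FirebaseVisionImage;
import com.google.firebase.ml.vision.common.FirebaseVisionImageMetadata;
import com.google.firebase.ml.vision.text.FirebaseVisionText;

// Called from android.hardware.Camera.PreviewCallback#onPreviewFrame with the
// raw NV21 bytes; width/height are the preview size chosen for the camera.
void detectTextOnFrame(byte[] nv21, int width, int height) {
    FirebaseVisionImageMetadata metadata = new FirebaseVisionImageMetadata.Builder()
            .setFormat(FirebaseVisionImageMetadata.IMAGE_FORMAT_NV21)
            .setWidth(width)
            .setHeight(height)
            .setRotation(FirebaseVisionImageMetadata.ROTATION_0) // adjust for device orientation
            .build();

    FirebaseVisionImage image = FirebaseVisionImage.fromByteArray(nv21, metadata);

    FirebaseVision.getInstance()
            .getOnDeviceTextRecognizer()          // on-device: no cloud call
            .processImage(image)
            .addOnSuccessListener(result -> {
                for (FirebaseVisionText.TextBlock block : result.getTextBlocks()) {
                    // e.g. draw block.getBoundingBox() / use block.getText()
                }
            });
}
```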

Arnauld Alex
  • Hi! I'm facing the same issue with cameraSource right now. How did you solve yours? I tried using the cameraSource code from the ML Kit example, but it seems to be very far from perfect (tested on a couple of devices). Thanks in advance :) – Paktalin Aug 02 '18 at 16:53

2 Answers


ML Kit has a broader set of features than Mobile Vision, so some of them depend on a Firebase project. That said, you shouldn't be tied to Google any more than you are now if you just want the on-device functionality: while there are more setup steps, you don't have to actually use any of the other Firebase or ML Kit services if you don't want to!

The only extra app change for Firebase (other than adding dependencies) is setting up the plugin. This is really just a convenience tool that processes the config file from Firebase into resource files. You can see what it's doing in the documentation, and if you'd like, you can simply hardcode the resource values yourself.
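For illustration, the plugin essentially turns google-services.json into ordinary Android string resources along these lines (key names as documented for the google-services plugin; all values below are placeholders):

```xml
<!-- res/values/values.xml (generated by the plugin; can be hand-written instead) -->
<resources>
    <string name="google_app_id" translatable="false">1:0123456789:android:abcdef123456</string>
    <string name="google_api_key" translatable="false">YOUR_API_KEY</string>
    <string name="gcm_defaultSenderId" translatable="false">0123456789</string>
    <string name="project_id" translatable="false">your-project-id</string>
</resources>
```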

Ian Barber
  • Thanks for your answer, it's clear and answers the **first part** of my problem. Indeed, I don't need all the other stuff present in Firebase. – Arnauld Alex May 30 '18 at 07:57

On the second part of your question:

> how do we use CameraSource and CameraSourcePreview, as we did in MVA, to run text detection on a camera source?

Can you please take a look at the Android ML Kit Quickstart app? It contains sample code for using a camera source and preview with ML Kit.
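For reference, the wiring in that quickstart looks roughly like this (CameraSource, CameraSourcePreview and TextRecognitionProcessor are the sample's own classes, copied from the quickstart repo; treat exact signatures as approximate):

```java
// Inside the quickstart's LivePreviewActivity, with `preview` (CameraSourcePreview)
// and `graphicOverlay` (GraphicOverlay) bound from the layout:
CameraSource cameraSource = new CameraSource(this, graphicOverlay);
cameraSource.setMachineLearningFrameProcessor(new TextRecognitionProcessor());

try {
    // CameraSourcePreview owns the SurfaceView and the start/stop lifecycle,
    // much like the old MVA preview did
    preview.start(cameraSource, graphicOverlay);
} catch (IOException e) {
    cameraSource.release();
}
```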

Pannag Sanketi