
Can I export a model from AutoML Tables and use it locally with TensorFlow?

I've discovered that you can export some of the Vision models in multiple formats, but for Tables I can only export a TF Edge model to Cloud Storage. Can I run that locally somehow to perform predictions? Or is there another way to export/convert a model into a format that can be used locally?


John
  • You can also try MLJAR AutoML, an open-source Python package. All models are transparent, with automatic documentation. GitHub repo: https://github.com/mljar/mljar-supervised ; performance comparison of MLJAR vs. GCP Tables: https://mljar.com/automl-compare/ – pplonski Apr 09 '21 at 06:53

1 Answer


It is possible for AutoML Tables to export a TF saved model (.pb format). You can do this by sending a request using curl. See Exporting Models in AutoML Tables for detailed instructions on how to use models.export.

NOTE: The exported model used for testing is from the AutoML Tables quickstart.

JSON request (use tf_saved_model to export a TensorFlow model in SavedModel format):

{
  "outputConfig": {
    "modelFormat": "tf_saved_model",
    "gcsDestination": {
      "outputUriPrefix": "your-gcs-destination"
    }
  }
}

Curl command (assuming that the model is in us-central1):

curl -X POST \
-H "Authorization: Bearer "$(gcloud auth application-default print-access-token) \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
https://automl.googleapis.com/v1beta1/projects/your-project-id/locations/us-central1/models/your-model-id:export
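
The same call can be made from Python with only the standard library. This is a minimal sketch, not the official client: the project ID, model ID, and GCS prefix below are placeholders, and the access token would come from `gcloud auth application-default print-access-token`.

```python
import json
import urllib.request


def build_export_request(project_id: str, model_id: str, token: str,
                         gcs_prefix: str) -> urllib.request.Request:
    """Build (but do not send) the models.export POST request."""
    url = (f"https://automl.googleapis.com/v1beta1/projects/{project_id}"
           f"/locations/us-central1/models/{model_id}:export")
    body = json.dumps({
        "outputConfig": {
            "modelFormat": "tf_saved_model",
            "gcsDestination": {"outputUriPrefix": gcs_prefix},
        }
    }).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json; charset=utf-8",
        },
        method="POST",
    )


# Sending it would be: urllib.request.urlopen(build_export_request(...))
```

As with the curl command, this assumes the model lives in us-central1; adjust the location in the URL otherwise.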

Exported model: (screenshot)

Ricco D
  • Hello and thank you for your answer! Do you have any clue on how to use the model after it is exported in python? How can I import it with tensorflow and run a prediction? – John Apr 09 '21 at 18:01
  • @RadiCho you can check https://www.tensorflow.org/guide/saved_model for more information on using the saved model. – Ricco D Apr 12 '21 at 00:18
  • If I try to use the SavedModel in Python or Node I get "Error: Failed to load SavedModel: Op type not registered 'ParseExampleV2' in binary " – John Apr 13 '21 at 06:56
  • @RadiCho this might help you out with your current issue https://stackoverflow.com/a/59060526/14733669 . If you encounter another error, it is much better to create a new question so that the community can contribute. – Ricco D Apr 13 '21 at 07:32
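
The tf.saved_model guide linked in the comments boils down to tf.saved_model.load plus the serving_default signature. Below is a minimal, self-contained sketch: a toy model stands in for the real AutoML Tables export (whose directory is the one containing saved_model.pb), and the input/output names are specific to this toy model, so inspect structured_input_signature on your own export first.

```python
import os
import tempfile

import tensorflow as tf


# Toy model standing in for the AutoML Tables export; with a real export
# you would skip straight to tf.saved_model.load(export_dir).
class Doubler(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([None], tf.float32)])
    def __call__(self, x):
        return {"scores": x * 2.0}


export_dir = os.path.join(tempfile.mkdtemp(), "demo_saved_model")
module = Doubler()
tf.saved_model.save(module, export_dir,
                    signatures=module.__call__.get_concrete_function())

loaded = tf.saved_model.load(export_dir)
serving_fn = loaded.signatures["serving_default"]

# Inspect what the model expects before predicting; real AutoML Tables
# exports typically take serialized tf.Example protos as input.
print(serving_fn.structured_input_signature)
print(serving_fn(x=tf.constant([1.0, 2.0])))
```

Signature functions restored this way are called with keyword arguments named after the inputs reported by structured_input_signature.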