
I have a trained TensorFlow model on Google Cloud Datalab. I want to export it, import it into BigQuery, and predict using BigQuery. How do I export it with a path like gs://*?

gogasca

2 Answers


If you are using TensorFlow 1.14 or higher with Keras, then:

tf.saved_model.save(model, 'gs://bucket/dir')

See https://www.tensorflow.org/api_docs/python/tf/saved_model/save
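Once the SavedModel is in Cloud Storage, you can import it into BigQuery ML and predict entirely in SQL. A minimal sketch, assuming a BigQuery dataset and input table exist (the names mydataset.mymodel and mydataset.inputs here are placeholders, not from the question):

-- Import the TensorFlow SavedModel from Cloud Storage into BigQuery ML
CREATE OR REPLACE MODEL mydataset.mymodel
OPTIONS (model_type = 'tensorflow',
         model_path = 'gs://bucket/dir/*');

-- Predict with the imported model; input column names must match
-- the SavedModel's serving signature
SELECT *
FROM ML.PREDICT(MODEL mydataset.mymodel,
                (SELECT * FROM mydataset.inputs));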

Lak

If you have code written in an earlier version of TensorFlow, it probably uses the Estimator API. In that case, use:

estimator.export_savedmodel('gs://bucket/dir', serving_input_fn)

where the serving function has to be defined with placeholders, one for each input to your model:

import tensorflow as tf

def serving_input_fn():
    # One placeholder per model input; [None] allows a variable batch size.
    feature_placeholders = {
        'input1': tf.placeholder(tf.string, [None]),
        'input2': tf.placeholder(tf.float32, [None]),
    }
    # Reshape each [batch] tensor to [batch, 1], the shape the model expects.
    features = {
        key: tf.expand_dims(tensor, -1)
        for key, tensor in feature_placeholders.items()
    }
    return tf.estimator.export.ServingInputReceiver(features, feature_placeholders)
Lak