Is there a good way to deploy a model built using tflearn.DNN class to Google Cloud ML Engine? It seems like SavedModel requires input and output tensors to be defined in the prediction signature definition but unsure how to get that from tflearn.DNN.

Shaurav Garg

1 Answer

I figured this out later, at least for my specific case. The snippet below exports the DNN as a SavedModel, which can then be deployed to Google Cloud ML Engine.

The snippet takes the following arguments:

  • filename is the export directory
  • input_tensor is the input_data layer passed to tflearn.DNN
  • output_tensor is the full network passed to tflearn.DNN
  • session is the session attribute of the object returned by tflearn.DNN

    import os
    import pickle

    import tensorflow as tf

    # Create a builder that writes the SavedModel to the export directory.
    builder = tf.saved_model.builder.SavedModelBuilder(filename)

    # Wrap the model's input and output tensors in a prediction signature.
    signature = tf.saved_model.signature_def_utils.predict_signature_def(
        inputs={'in': input_tensor}, outputs={'out': output_tensor})

    # Attach the tflearn session and its graph under the SERVING tag.
    builder.add_meta_graph_and_variables(
        session,
        [tf.saved_model.tag_constants.SERVING],
        signature_def_map={'serving_default': signature})

    builder.save()

    # Optional: stash extra Python-side objects alongside the model.
    # Note that files under assets.extra are NOT loaded automatically at
    # serving time; you must read them back yourself.
    serving_vars = {
        'name': self.name  # self here is the model wrapper this method runs in
    }

    assets = os.path.join(filename, 'assets.extra')
    os.makedirs(assets)

    with open(os.path.join(assets, 'serve.pkl'), 'wb') as f:
        pickle.dump(serving_vars, f, pickle.HIGHEST_PROTOCOL)
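As the comments below point out, files under assets.extra are not consumed automatically by the SavedModel loader or ML Engine; anything stored there has to be read back explicitly by your own code. A minimal, standard-library-only sketch of that round trip (the export directory and the serving_vars contents here are illustrative, not taken from the model above):

```python
import os
import pickle
import tempfile

# Stand-ins for the real export directory and serving vars.
export_dir = tempfile.mkdtemp()
serving_vars = {'name': 'my_dnn'}

# Write the extra assets the same way the export snippet does.
assets = os.path.join(export_dir, 'assets.extra')
os.makedirs(assets)
with open(os.path.join(assets, 'serve.pkl'), 'wb') as f:
    pickle.dump(serving_vars, f, pickle.HIGHEST_PROTOCOL)

# At serving/preprocessing time, load the pickle back explicitly --
# nothing in the SavedModel machinery does this for you.
with open(os.path.join(assets, 'serve.pkl'), 'rb') as f:
    restored = pickle.load(f)

print(restored['name'])  # prints the value stored at export time
```

If the extra state is only a name or other metadata the graph never uses, you can skip assets.extra entirely, which is exactly what the comment thread concludes.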
    
  • Can you explain what you're using serving_vars for? – rhaertel80 Jul 17 '17 at 16:30
  • Before I answer, just know that I am fairly new to using TensorFlow :) So in this specific case, the serving_vars are likely not critical. But I have another model which has an embedded word2vec model in it. For that, I have the embeddings dictionary as a serving var because it is used in one of the layers. – Shaurav Garg Jul 17 '17 at 16:42
  • I just want to make sure that you're able to use what you export. Things in assets.extra aren't used automatically. When it comes to an embedding, I would presume that it would be stored as a TensorFlow checkpoint/variable(s). Is that the case? – rhaertel80 Jul 17 '17 at 20:28
  • Yeah, you are right. I tried removing the serving_vars and it didn't cause any changes. – Shaurav Garg Jul 20 '17 at 18:42