
I am trying to set up the decode functionality of textsum with TensorFlow Serving, but I haven't been able to fully work out what is necessary from the MNIST tutorial. Has anyone come across any other tutorials on setting up TensorFlow Serving models, or anything more closely aligned with textsum? Any help or direction would be great. Thanks!

In the end I am trying to export the decode functionality of a model trained via 'train' in seq2seq_attention.py, found here: https://github.com/tensorflow/models/blob/master/textsum/seq2seq_attention.py

When comparing the two files below to work out what I need to do to the textsum model above, I am having difficulty understanding what needs to be assigned to default_graph_signature, input_tensor, classes_tensor, etc. I realize these may not map directly onto the textsum model; however, that is exactly what I am trying to clear up, and I figured that seeing some other models being exported for TensorFlow Serving might make it a bit clearer.

Compared: https://github.com/tensorflow/tensorflow/blob/r0.11/tensorflow/examples/tutorials/mnist/mnist_softmax.py

and

https://github.com/tensorflow/serving/blob/master/tensorflow_serving/example/mnist_export.py
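For reference, the part of mnist_export.py I am trying to map onto textsum builds the signatures roughly like this (paraphrased from the linked file; this is the TF 0.11-era tensorflow.contrib.session_bundle exporter API, so the exact tensor names may differ slightly from the current file):

```python
# Paraphrased from the linked mnist_export.py; x/y are the model's
# input placeholder and softmax output from the training graph.
model_exporter = exporter.Exporter(saver)
signature = exporter.classification_signature(
    input_tensor=serialized_tf_example,   # serialized tf.Example input
    classes_tensor=prediction_classes,    # predicted class labels
    scores_tensor=values)                 # corresponding scores
model_exporter.init(
    sess.graph.as_graph_def(),
    init_op=init_op,
    default_graph_signature=signature,
    named_graph_signatures={
        'inputs': exporter.generic_signature({'images': x}),
        'outputs': exporter.generic_signature({'scores': y})})
model_exporter.export(export_path, tf.constant(FLAGS.export_version), sess)
```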

------------------ Edit -------------------

Below is what I have so far, but I am running into a few issues. I am trying to set up the textsum decode functionality for serving. First, I get an error stating "No variables to save" when tf.train.Saver(sharded=True) is constructed. That aside, I also don't understand what I am supposed to assign to the classification_signature and named_graph_signature variables in order to export the results of textsum decode.
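From what I can tell, the Saver error happens because tf.train.Saver collects the graph's variables at construction time, and at that point no model graph has been built yet. A minimal sketch of what I believe is the failure mode (TF 1.x-style API):

```python
import tensorflow as tf

# In a fresh graph with no variables, constructing a Saver raises
# "ValueError: No variables to save":
try:
    tf.train.Saver(sharded=True)
except ValueError as e:
    print(e)

# After the model graph (and its variables) has been built,
# the same call succeeds:
dummy = tf.Variable(0, name='dummy')
saver = tf.train.Saver(sharded=True)
```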

Any help on what I'm missing here (I'm sure it is a bit) would be appreciated.

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

import sys
import tensorflow as tf
from tensorflow.contrib.session_bundle import exporter

tf.app.flags.DEFINE_string("export_dir", "exports/textsum",
                           "Directory where to export textsum model.")

tf.app.flags.DEFINE_string('checkpoint_dir', 'log_root',
                           "Directory where to read training checkpoints.")
tf.app.flags.DEFINE_integer('export_version', 1, 'version number of the model.')
tf.app.flags.DEFINE_bool("use_checkpoint_v2", False,
                         "If true, write v2 checkpoint files.")
FLAGS = tf.app.flags.FLAGS

def Export():
    try:
        saver = tf.train.Saver(sharded=True)
        with tf.Session() as sess:
            # Restore variables from training checkpoints.
            ckpt = tf.train.get_checkpoint_state(FLAGS.checkpoint_dir)
            if ckpt and ckpt.model_checkpoint_path:
                saver.restore(sess, ckpt.model_checkpoint_path)
                global_step = ckpt.model_checkpoint_path.split('/')[-1].split('-')[-1]
                print('Successfully loaded model from %s at step=%s.' %
                    (ckpt.model_checkpoint_path, global_step))
            else:
                print('No checkpoint file found at %s' % FLAGS.checkpoint_dir)
                return

            # Export model
            print('Exporting trained model to %s' % FLAGS.export_dir)
            init_op = tf.group(tf.initialize_all_tables(), name='init_op')
            model_exporter = exporter.Exporter(saver)

            classification_signature = None  # <-- Unsure what should be assigned here

            named_graph_signature = None  # <-- Unsure what should be assigned here

            model_exporter.init(
                init_op=init_op,
                default_graph_signature=classification_signature,
                named_graph_signatures=named_graph_signature)

            model_exporter.export(FLAGS.export_dir, tf.constant(global_step), sess)
            print('Successfully exported model to %s' % FLAGS.export_dir)
    except:
        err = sys.exc_info()
        print('Unexpected error:', err[0], ' - ', err[1])


def main(_):
    Export()

if __name__ == "__main__":
    tf.app.run()
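If it helps clarify what I'm asking: my current guess, modeled on the generic_signature usage in mnist_export.py, would be something like the following, where the tensor names are entirely invented for illustration, which is exactly the part I'm unsure about:

```python
# Hypothetical sketch only - assumes the textsum decode graph exposes an
# input placeholder (here `article_inputs`) and a decoded-output tensor
# (here `decoded_outputs`); both names are made up.
named_graph_signature = {
    'inputs': exporter.generic_signature({'articles': article_inputs}),
    'outputs': exporter.generic_signature({'decoded': decoded_outputs})}
```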
  • Simply export the model after you are done with training: https://github.com/tensorflow/serving/blob/master/tensorflow_serving/example/mnist_export.py#L91. Could you add more details to your question? – yuefengz Nov 15 '16 at 02:23
  • So there is nothing special I need to do with regard to the "training" portion of mnist_export? I was trying to figure out what had been done in the training portion specifically for exporting to TensorFlow Serving. If I understand correctly, I should really just focus on the "Export model" portion of the file? – xtr33me Nov 15 '16 at 02:30
  • You don't have to change the training part. Both the mnist and textsum training portions populate a graph (or meta-graph) and variables, which are enough to be exported for serving. – yuefengz Nov 15 '16 at 05:20
  • Thanks @Fake! I have edited the original post above with some areas where I am having more specific issues. Any light you might be able to shed? Currently I am just looking at setting up the decode functionality in TF Serving. – xtr33me Nov 27 '16 at 04:10

0 Answers