
Based on converting-trained-tensorflow-model-to-protobuf, I am trying to save and restore a TF graph, without success.

Here is the saver:

with tf.Graph().as_default():
    variable_node = tf.Variable(1.0, name="variable_node")
    output_node = tf.mul(variable_node, 2.0, name="output_node")
    sess = tf.Session()
    init = tf.initialize_all_variables()
    sess.run(init)
    output = sess.run(output_node)
    tf.train.write_graph(sess.graph.as_graph_def(), summ_dir, 'model_00_g.pbtxt', as_text=True)
    #self.assertNear(2.0, output, 0.00001)
    saver = tf.train.Saver()
    saver.save(sess, saver_path)

which produces model_00_g.pbtxt with a text description of the graph. It is pretty much copy-pasted from freeze_graph_test.py.

Here is the reader:

with tf.Session() as sess:

    with tf.Graph().as_default():
        graph_def = tf.GraphDef()
        graph_path = '/mnt/code/test_00/log/2016-02-11.22-37-46/model_00_g.pbtxt'
        with open(graph_path, "rb") as f:
            proto_b = f.read()
            #print proto_b   # -> I can see it
            graph_def.ParseFromString(proto_b) # no luck..
            _ = tf.import_graph_def(graph_def, name="")

    print sess.graph_def

which fails at graph_def.ParseFromString() with DecodeError: Tag had invalid wire type.

I am on docker container b.gcr.io/tensorflow/tensorflow:latest-devel in case it makes any difference.

rgr

3 Answers


The GraphDef.ParseFromString() method (and, in general, the ParseFromString() method on any Python protobuf wrapper) expects a string in the binary protocol buffer format. If you pass as_text=False to tf.train.write_graph(), then the file will be in the appropriate format.
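For example, a minimal round trip might look like the following sketch (summ_dir is assumed to be the same log directory variable as in the question):

import tensorflow as tf

# A sketch based on the question's saver: writing with as_text=False
# produces the binary wire format that ParseFromString() expects.
with tf.Graph().as_default() as g:
    variable_node = tf.Variable(1.0, name="variable_node")
    output_node = tf.mul(variable_node, 2.0, name="output_node")
    tf.train.write_graph(g.as_graph_def(), summ_dir,
                         'model_00_g.pb', as_text=False)

# Reading the binary file back now succeeds.
graph_def = tf.GraphDef()
with open(summ_dir + '/model_00_g.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())
with tf.Graph().as_default():
    tf.import_graph_def(graph_def, name="")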

Otherwise you can do the following to read the text-based format:

from google.protobuf import text_format
# ...
graph_def = tf.GraphDef()
text_format.Merge(proto_b, graph_def) 
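
Putting these together, a minimal reader for the question's text-format file might look like this sketch (reusing the graph_path from the question):

import tensorflow as tf
from google.protobuf import text_format

graph_path = '/mnt/code/test_00/log/2016-02-11.22-37-46/model_00_g.pbtxt'
graph_def = tf.GraphDef()
# Open in text mode: the as_text=True output is human-readable, not binary.
with open(graph_path, 'r') as f:
    text_format.Merge(f.read(), graph_def)
with tf.Graph().as_default():
    tf.import_graph_def(graph_def, name="")
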
mrry
  • thanks - both of these solve the immediate problem. However, the main task for me is elsewhere: I want to store the graph plus variables and load them for eval in another place. This [`tensorflow/tensorflow/python/tools/freeze_graph.py`](https://github.com/tensorflow/tensorflow/tree/00440e99ffb1ed1cfe4b4ea650e0c560838a6edc/tensorflow/python/tools) looks very much like what I need, but it is not in my docker image yet, and the usage isn't very clear w.r.t. half of the parameters - I think I will wait for 0.7 for this and switch to other tasks for now. – rgr Feb 12 '16 at 20:26
  • I'm getting the same kind of problem and am not able to solve it. Can you look into this: https://stackoverflow.com/questions/71439124/google-protobuf-message-decodeerror-error-parsing-message-with-type-tensorflow – subbu Mar 14 '22 at 12:19

ParseFromString needs a binary serialized protocol buffer; for the human-readable representation you need to use text_format.Merge, as used here.

Yaroslav Bulatov

I tried to load a model via the Java API, which accepts only the binary format. But the Python side, where we use contrib.Estimator, produces the model file in text format. I found a model file converter online, and it seems to work fine. Using the binary model loader might solve the original issue too, if you have an existing text-format model file.
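
For reference, such a conversion can also be done in a few lines of Python instead of an online tool (a sketch; the file names model.pbtxt and model.pb are hypothetical):

import tensorflow as tf
from google.protobuf import text_format

# Parse the text-format GraphDef...
graph_def = tf.GraphDef()
with open('model.pbtxt', 'r') as f:
    text_format.Merge(f.read(), graph_def)

# ...and re-serialize it in the binary wire format that the Java API accepts.
with open('model.pb', 'wb') as f:
    f.write(graph_def.SerializeToString())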

kecso