I wrote a neural network using TensorFlow. Everything is working, and now I want to export the final weights of my neural network to build a single prediction method. How can I do this?
- https://nathanbrixius.wordpress.com/2016/05/24/checkpointing-and-reusing-tensorflow-models/ – martianwars Dec 08 '16 at 08:32
3 Answers
You will need to save your model at the end of training by using the `tf.train.Saver` class. While initializing the `Saver` object, you will need to pass a list of all the variables you wish to save. The best part is that you can use these saved variables in a different computation graph!

Create a `Saver` object by using,
# Assume you want to save 2 variables `v1` and `v2`
saver = tf.train.Saver([v1, v2])
Save your variables by using the `tf.Session` object,

saver.save(sess, 'filename')

Of course, you can add additional details like the `global_step`.

You can restore the variables in the future by using the `restore()` function. The restored variables will be initialized to the saved values automatically.
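A minimal end-to-end sketch of this save/restore cycle (written against the TF 1.x graph API of the time; under TensorFlow 2.x the same calls live under `tf.compat.v1`, and the checkpoint path here is illustrative):

```python
import os
import tempfile
import tensorflow.compat.v1 as tf  # TF 1.x-style graph API
tf.disable_eager_execution()

# Build a tiny graph with the two variables we want to save.
v1 = tf.get_variable("v1", initializer=tf.constant([1.0, 2.0]))
v2 = tf.get_variable("v2", initializer=tf.constant(3.0))
saver = tf.train.Saver([v1, v2])

ckpt = os.path.join(tempfile.mkdtemp(), "model.ckpt")
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    saver.save(sess, ckpt)

# Restore in a fresh session; restored variables need no initializer run.
with tf.Session() as sess:
    saver.restore(sess, ckpt)
    restored = sess.run([v1, v2])
```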

- Is it possible to obtain the raw data of the parameters? I want to run a TensorFlow-trained model on another platform; how can I do that? – Wesley Ranger Dec 13 '16 at 08:50
- You can get the final value of the weights using `sess.run(weights)` and export them to a numpy array, for instance – martianwars Dec 13 '16 at 08:55
- That's what I need. Another problem: I used `tf.nn.rnn_cell.LSTMCell` in the net; how can I access the weight/bias of an `LSTMCell` object? – Wesley Ranger Dec 13 '16 at 08:59
- If you try `tf.all_variables()`, you will see the names of the weights and biases of the `LSTMCell`. You can narrow this search by looking only at variables under the `LSTMCell` scope – martianwars Dec 13 '16 at 09:02
- http://stackoverflow.com/questions/36533723/tensorflow-get-all-variables-in-scope – martianwars Dec 13 '16 at 09:03
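Putting the comments above together, a sketch of pulling raw variable values out as numpy arrays and of restricting the search to a scope (the `"rnn"` scope and `kernel` variable here are illustrative, not the exact names an `LSTMCell` creates):

```python
import tensorflow.compat.v1 as tf  # TF 1.x-style graph API
tf.disable_eager_execution()

# A stand-in for variables created inside a cell's scope.
with tf.variable_scope("rnn"):
    w = tf.get_variable("kernel", initializer=tf.ones([2, 2]))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # tf.all_variables() in old TF; tf.global_variables() today.
    names = [v.name for v in tf.global_variables()]
    # Restrict the search to one scope, then fetch raw values.
    rnn_vars = tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES, scope="rnn")
    weights = sess.run(rnn_vars[0])  # a plain numpy array, exportable anywhere
```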
The answer above describes the standard way to save/restore a session snapshot. However, if you want to export your network as a single binary file for further use with other tensorflow tools, you'll need to perform a few more steps.
First, freeze the graph. TF provides the corresponding tool. I use it like this:
#!/bin/bash -x
# The script combines graph definition and trained weights into
# a single binary protobuf with constant holders for the weights.
# The resulting graph is suitable for the processing with other tools.
TF_HOME=~/tensorflow/
if [ $# -lt 4 ]; then
    echo "Usage: $0 graph_def snapshot output_nodes output.pb"
    exit 1
fi
proto=$1
snapshot=$2
out_nodes=$3
out=$4
$TF_HOME/bazel-bin/tensorflow/python/tools/freeze_graph --input_graph=$proto \
--input_checkpoint=$snapshot \
--output_graph=$out \
--output_node_names=$out_nodes
Having done that, you can optimize it for inference, or use any other tool.
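To see what freezing buys you, here is a programmatic sketch of the same idea using `convert_variables_to_constants`, which is what the `freeze_graph` tool does internally (TF 1.x API; the toy graph and node names `x`/`y` are illustrative):

```python
import tensorflow.compat.v1 as tf  # TF 1.x-style graph API
tf.disable_eager_execution()

# A toy graph: y = w * x, where w is a trained variable.
x = tf.placeholder(tf.float32, name="x")
w = tf.get_variable("w", initializer=tf.constant(2.0))
y = tf.multiply(w, x, name="y")

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Same effect as the freeze_graph tool: variables become constants
    # baked into a single GraphDef protobuf.
    frozen = tf.graph_util.convert_variables_to_constants(
        sess, sess.graph_def, output_node_names=["y"])

# Load the frozen GraphDef into a fresh graph and run a prediction;
# no checkpoint or variable initialization is needed anymore.
g = tf.Graph()
with g.as_default():
    tf.import_graph_def(frozen, name="")
    with tf.Session(graph=g) as sess:
        out = sess.run("y:0", feed_dict={"x:0": 3.0})
```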

If you simply need to access the weights and biases of your neural net, you can use the `get_weights()` method from `tf.keras.layers.Layer`:

# params vector includes weights and biases
params = your_model.get_weights()
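For instance, a small sketch with a one-layer Keras model (the layer sizes here are arbitrary): `get_weights()` returns a list of numpy arrays, kernel first, then bias, for each layer in order.

```python
import tensorflow as tf

# A minimal model: one Dense layer mapping 4 inputs to 3 outputs.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(3),
])

params = model.get_weights()  # [kernel of shape (4, 3), bias of shape (3,)]
```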
