Does it internally unroll the graph, i.e. does it create a static graph on the fly with copies of the loop-body subgraph, or does it only manage the forward activations for backprop at different memory locations for each loop iteration, without extending the graph explicitly?
1 Answer
Such loops are not explicitly unrolled. What I mean by that is that if the tf.while_loop needs to run 100 times, your graph will not have 100 copies of the body. For example, let's suppose we want to use TensorFlow to compute x^N (or x**N in Python parlance). One way of doing that would be this:
import tensorflow as tf

N = tf.constant(100)
i = tf.constant(0)
x = tf.constant(1.5)

# Each iteration multiplies the running product x by the original value x0.
def body(i, x, x0):
    return i + 1, x * x0, x0

# Loop while i < N - 1; the loop variables are (i, x, x0).
output = tf.while_loop(lambda i, x, x0: i < N - 1, body, [i, x, x])

with tf.Session() as sess:
    writer = tf.summary.FileWriter("while_loop_example")
    writer.add_graph(sess.graph)
    print(sess.run(output))
The body of this loop needs to run 99 times. However, if we look at the graph in TensorBoard (by using the command "tensorboard --logdir while_loop_example"), the loop shows up as a single while_loop subgraph rather than 99 repetitions of the body, and the graph does not change if I change the number of loop iterations.
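
You can check this directly by counting ops. Here is a minimal sketch (assuming TensorFlow 1.x, as in the example above; the helper name build_graph is just for illustration) that builds the same loop for two different iteration counts and compares the number of operations in each graph. If tf.while_loop unrolled the body, the larger count would produce a larger graph.

import tensorflow as tf

def build_graph(n_iterations):
    # Build the power-computing loop in its own graph so that op counts
    # for different iteration counts can be compared in isolation.
    graph = tf.Graph()
    with graph.as_default():
        N = tf.constant(n_iterations)
        i = tf.constant(0)
        x = tf.constant(1.5)

        def body(i, x, x0):
            return i + 1, x * x0, x0

        tf.while_loop(lambda i, x, x0: i < N - 1, body, [i, x, x])
    return len(graph.get_operations())

# Both counts come out the same: the body appears once, regardless of N.
print(build_graph(10), build_graph(1000))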

bremen_matt