If I understand variable_scope correctly, the following code should throw an error:
with tf.variable_scope('f', reuse=True):
    slim.conv2d(x, 128, 7)
since reuse is set to True. However, it does not. I also tried:
with tf.variable_scope('f', reuse=True):
    slim.conv2d(x, 128, 7, scope='Conv', reuse=True)
just to be sure, and it did not throw an error either.
Finally, I expected the following code to throw an error because reuse is set to False:
for i in range(2):
    with tf.variable_scope('f', reuse=False):
        slim.conv2d(x, 128, 7, reuse=False)
print([v.name for v in tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES)])
However, it did not throw an error either, and it only created a single set of weights and biases.
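To make my expectations concrete: as I understand it, reuse=True should only look up variables that already exist and fail otherwise, while reuse=False should only create new variables and fail if a second creation uses the same name. Here is a toy sketch of that mental model in plain Python (this is not TensorFlow's implementation; the store and variable names are made up for illustration):

```python
# Toy model of how I understand get_variable's reuse semantics.
# "store" stands in for the graph's variable store.
store = {}

def get_variable(name, reuse):
    if reuse:
        # reuse=True: the variable must already exist.
        if name not in store:
            raise ValueError("Variable %s does not exist" % name)
        return store[name]
    # reuse=False: creating the same variable twice is an error.
    if name in store:
        raise ValueError("Variable %s already exists" % name)
    store[name] = object()
    return store[name]

# Expectation 1: reuse=True before any creation should fail.
try:
    get_variable('f/Conv/weights', reuse=True)
except ValueError as e:
    print(e)  # Variable f/Conv/weights does not exist

# Expectation 2: a second creation with reuse=False should fail.
get_variable('f/Conv/weights', reuse=False)
try:
    get_variable('f/Conv/weights', reuse=False)
except ValueError as e:
    print(e)  # Variable f/Conv/weights already exists
```

Under this model, both of the snippets above should raise a ValueError, which is why the observed behaviour surprises me.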
Am I misunderstanding the expected behaviour of reuse?