
If I understand `variable_scope` correctly, I would expect the following code to throw an error:

import tensorflow as tf
import tensorflow.contrib.slim as slim

with tf.variable_scope('f', reuse=True):
    slim.conv2d(x, 128, 7)

since `reuse` is set to `True` and no variables under scope `f` exist yet. However, it does not. I also tried:

with tf.variable_scope('f', reuse=True):
    slim.conv2d(x, 128, 7, scope='Conv', reuse=True)

just to be sure and it did not throw an error either.
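My expectation comes from the documented contract of `tf.get_variable`: with `reuse=True`, requesting a variable that does not already exist raises a `ValueError`. A minimal sketch of that contract in plain Python (illustrative only; `toy_get_variable` and the dict store are hypothetical, not TensorFlow internals):

```python
# Toy model of the get_variable reuse contract
# (hypothetical names, not TensorFlow internals).
_variables = {}

def toy_get_variable(name, reuse=False):
    """Return the variable `name`, enforcing reuse semantics."""
    if reuse:
        # reuse=True: the variable must already exist.
        if name not in _variables:
            raise ValueError("Variable %s does not exist" % name)
        return _variables[name]
    # reuse=False: creating a variable that already exists is an error.
    if name in _variables:
        raise ValueError("Variable %s already exists" % name)
    _variables[name] = object()
    return _variables[name]
```

Under this contract, the first snippet should fail because no variables under scope `f` exist when `reuse=True` is requested.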

Finally, I expected the following code to throw an error on the second iteration, because `reuse` is set to `False` while the variables would already exist:

for i in range(2):
    with tf.variable_scope('f', reuse=False):
        slim.conv2d(x, 128, 7, reuse=False)
print([v.name for v in tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES)])

However, it did not throw an error either, and it only created a single set of weights and biases.
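In other words, I expected the loop to behave like this toy sketch (plain Python, illustrative only; `toy_create_variable` is a hypothetical stand-in, not TensorFlow code), failing on the second iteration:

```python
# Hypothetical model: under reuse=False, creating a variable whose
# name already exists should raise.
existing = set()

def toy_create_variable(name):
    if name in existing:
        raise ValueError("Variable %s already exists" % name)
    existing.add(name)

errors = []
for i in range(2):
    try:
        toy_create_variable("f/Conv/weights")  # same scope both times
    except ValueError as err:
        errors.append(str(err))
# Expected: the first iteration creates the variable, the second raises.
```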

Am I misunderstanding the expected behaviour of reuse?

Jack Valmadre
  • Yes, as far as I remember the layers in slim are not implemented using `tf.get_variable()`. That's why it is not working with `tf.variable_scope` as expected. – Temak Mar 16 '17 at 22:12
  • 1
    I think it does: `contrib.slim.conv2d` is equal to `contrib.layers.convolution` which creates an object of type `layers.convolutional.Conv2D` which is a sub-class of `_Conv` whose `build()` method calls `get_variable` here https://github.com/tensorflow/tensorflow/blob/e895d5ca395c2362df4f5c8f08b68501b41f8a98/tensorflow/python/layers/convolutional.py#L133 – Jack Valmadre Mar 17 '17 at 09:15
  • I think it might be due to a `variable_scope` with a `custom_getter` in `contrib/layers/python/layers/layers.py`, but I'm not sure – Jack Valmadre Mar 17 '17 at 09:18

0 Answers