2

I need to restore a DNN (VGG16Net) and use transfer learning to construct another network. To do that I need to convert some filter and bias variables from trainable TensorFlow variables to non-trainable ones (I'm using the native TensorFlow framework, not Keras or any higher-level packages).

For example, to get the weights of convolution layer 4_3 I used conv4_3_filter = sess.graph.get_tensor_by_name('conv4_3/filter:0'), but the variable conv4_3_filter is always trainable. So I'm trying to find a general way to convert any TensorFlow variable from trainable to non-trainable. How can I solve this?
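
For reference, this is roughly what I do now (the checkpoint file names below are just placeholders):

import tensorflow as tf

sess = tf.Session()
# restore the saved VGG16 graph and weights (placeholder paths)
saver = tf.train.import_meta_graph('vgg16.ckpt.meta')
saver.restore(sess, 'vgg16.ckpt')

# this returns the variable's output tensor; the underlying variable stays trainable
conv4_3_filter = sess.graph.get_tensor_by_name('conv4_3/filter:0')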

  • I don't know how to turn trainable variables into non-trainable variables, if that is even possible, but you could instead leave the variables trainable and only train the other variables; see e.g. https://stackoverflow.com/questions/54303730/retraining-a-cnn-without-a-high-level-api/54304541#54304541 – tomkot Apr 11 '19 at 07:28

1 Answer

1

I don't think it is possible to modify the trainable attribute of a tf.Variable. However, there are multiple workarounds.

Suppose you have two variables:

import tensorflow as tf

v1 = tf.Variable(tf.random_normal([2, 2]), name='v1')
v2 = tf.Variable(tf.random_normal([2, 2]), name='v2')

When you use the tf.train.Optimizer class and its subclasses to optimize, by default it takes variables from the tf.GraphKeys.TRAINABLE_VARIABLES collection. Every variable that you define with trainable=True is added to this collection by default. What you can do is clear this collection and append to it only those variables that you want to optimize. For example, if I want to optimize only v1 but not v2:

var_list = tf.trainable_variables()
print(var_list)
# [<tf.Variable 'v1:0' shape=(2, 2) dtype=float32_ref>,
#  <tf.Variable 'v2:0' shape=(2, 2) dtype=float32_ref>]

tf.get_default_graph().clear_collection(tf.GraphKeys.TRAINABLE_VARIABLES)

cleared_var_list = tf.trainable_variables()
print(cleared_var_list)
# []

tf.add_to_collection(tf.GraphKeys.TRAINABLE_VARIABLES, var_list[0])

updated_var_list = tf.trainable_variables()
print(updated_var_list)
# [<tf.Variable 'v1:0' shape=(2, 2) dtype=float32_ref>]
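
After that, any optimizer built without an explicit var_list will pick up only v1. A quick sketch with a made-up loss:

loss = tf.reduce_sum(tf.square(v1)) + tf.reduce_sum(tf.square(v2))  # made-up loss

optimizer = tf.train.GradientDescentOptimizer(0.01)
# no var_list given, so minimize() uses tf.trainable_variables(), which now contains only v1
train_op = optimizer.minimize(loss)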

Another way is to use the var_list keyword argument of the optimizer's minimize() method and pass only those variables you want to be updated during training (i.e., during execution of the train_op):

optimizer = tf.train.GradientDescentOptimizer(0.01)
train_op = optimizer.minimize(loss, var_list=[v1])
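
Either way, a quick session check (reusing the made-up loss above) confirms that v2 stays fixed while v1 is updated:

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    v1_before, v2_before = sess.run([v1, v2])
    sess.run(train_op)
    v1_after, v2_after = sess.run([v1, v2])
    print((v1_before != v1_after).any())  # True: v1 was updated
    print((v2_before == v2_after).all())  # True: v2 was not touched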
Vlad