I don't think it is possible to modify the trainable attribute of a tf.Variable after it has been created. However, there are multiple workarounds. Suppose you have two variables:
import tensorflow as tf
v1 = tf.Variable(tf.random_normal([2, 2]), name='v1')
v2 = tf.Variable(tf.random_normal([2, 2]), name='v2')
When you use the tf.train.Optimizer class and its subclasses to optimize, by default the optimizer picks up variables from the tf.GraphKeys.TRAINABLE_VARIABLES collection, and every variable you define with trainable=True (the default) is added to this collection. What you can do is clear this collection and add back only the variables you actually want to optimize. For example, to optimize only v1 but not v2:
var_list = tf.trainable_variables()
print(var_list)
# [<tf.Variable 'v1:0' shape=(2, 2) dtype=float32_ref>,
# <tf.Variable 'v2:0' shape=(2, 2) dtype=float32_ref>]
tf.get_default_graph().clear_collection(tf.GraphKeys.TRAINABLE_VARIABLES)
cleared_var_list = tf.trainable_variables()
print(cleared_var_list)
# []
tf.add_to_collection(tf.GraphKeys.TRAINABLE_VARIABLES, var_list[0])
updated_var_list = tf.trainable_variables()
print(updated_var_list)
# [<tf.Variable 'v1:0' shape=(2, 2) dtype=float32_ref>]
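To sanity-check that the optimizer now ignores v2, you can run one training step and confirm that v2 keeps its value. Here is a minimal, self-contained sketch; the toy loss tf.reduce_sum(tf.matmul(v1, v2)) is my own choice for illustration, not something from the question:

import tensorflow as tf

v1 = tf.Variable(tf.random_normal([2, 2]), name='v1')
v2 = tf.Variable(tf.random_normal([2, 2]), name='v2')

# Rebuild the trainable collection so that it contains only v1.
tf.get_default_graph().clear_collection(tf.GraphKeys.TRAINABLE_VARIABLES)
tf.add_to_collection(tf.GraphKeys.TRAINABLE_VARIABLES, v1)

loss = tf.reduce_sum(tf.matmul(v1, v2))  # toy loss depending on both variables
train_op = tf.train.GradientDescentOptimizer(0.01).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    v2_before = sess.run(v2)
    sess.run(train_op)  # only v1 receives a gradient update
    v2_after = sess.run(v2)
    print((v2_before == v2_after).all())  # True -- v2 was left untouched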
Another way is to use the var_list keyword argument of the optimizer's minimize method and pass only those variables you want to be updated during training (during execution of the train_op):
# assuming some scalar loss that depends on both variables, e.g.:
loss = tf.reduce_sum(tf.matmul(v1, v2))
optimizer = tf.train.GradientDescentOptimizer(0.01)
train_op = optimizer.minimize(loss, var_list=[v1])  # v2 stays frozen
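An advantage of this approach is that it does not mutate the global collection, so you can build several train ops over different variable subsets from the same loss. A hypothetical sketch (the alternating setup below is my own illustration, not part of the original answer):

# two train ops over disjoint variable sets, e.g. for alternating optimization
train_v1 = tf.train.GradientDescentOptimizer(0.01).minimize(loss, var_list=[v1])
train_v2 = tf.train.GradientDescentOptimizer(0.01).minimize(loss, var_list=[v2])

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(10):
        sess.run(train_v1)  # update v1 while v2 stays fixed
        sess.run(train_v2)  # update v2 while v1 stays fixed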