For transfer learning, one often uses a network as a feature extractor to create a dataset of features, on which another classifier is trained (e.g. a SVM).
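To make the pipeline concrete, here is a minimal pure-NumPy sketch of the two stages: a stand-in extract_features function takes the place of the pretrained CNN, and a nearest-centroid classifier stands in for the SVM (both are hypothetical placeholders, not the actual TensorFlow code below):

```python
import numpy as np

# Hypothetical stand-in for a frozen CNN feature extractor:
# maps each 8x8 "image" to a 4-dim feature vector (mean of each quadrant).
def extract_features(images):
    n = images.shape[0]
    q = images.reshape(n, 2, 4, 2, 4).mean(axis=(2, 4))  # (n, 2, 2)
    return q.reshape(n, 4)

rng = np.random.default_rng(0)
# Two toy classes: bright top-left quadrant vs bright bottom-right quadrant.
class0 = rng.normal(0.0, 0.1, (20, 8, 8)); class0[:, :4, :4] += 1.0
class1 = rng.normal(0.0, 0.1, (20, 8, 8)); class1[:, 4:, 4:] += 1.0
images = np.concatenate([class0, class1])
labels = np.array([0] * 20 + [1] * 20)

# Stage 1: run the frozen extractor once to build a dataset of features.
feats = extract_features(images)

# Stage 2: train a separate classifier on the features (nearest class
# centroid here; an SVM would typically be used instead).
centroids = np.stack([feats[labels == c].mean(axis=0) for c in (0, 1)])
preds = np.argmin(((feats[:, None, :] - centroids) ** 2).sum(-1), axis=1)
print((preds == labels).mean())  # training accuracy on this toy data
```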
I want to implement this using the Dataset API (tf.contrib.data) and dataset.map():
# feature_extractor will create a CNN on top of the given tensor
def features(feature_extractor, ...):
    dataset = inputs(...)  # This creates a dataset of (image, label) pairs

    def map_example(image, label):
        features = feature_extractor(image, trainable=False)
        # Leaving out initialization from a checkpoint here...
        return features, label

    dataset = dataset.map(map_example)
    return dataset
Doing this fails when creating an iterator for the dataset.
ValueError: Cannot capture a stateful node by value.
This is true: the kernels and biases of the network are variables and thus stateful. For this particular example, though, they don't need to be.
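By contrast, a purely functional map would raise no such error, since the mapped function carries no state. A plain-Python analogue of the pattern (with a hypothetical, stateless feature function in place of the CNN):

```python
# Plain-Python analogue of dataset.map(map_example): each (image, label)
# pair is lazily transformed into a (features, label) pair.
def map_dataset(dataset, map_fn):
    for image, label in dataset:
        yield map_fn(image, label)

# Hypothetical stateless feature function standing in for the CNN.
def map_example(image, label):
    features = [sum(image) / len(image)]  # e.g. a single pooled feature
    return features, label

pairs = [([1.0, 3.0], 0), ([5.0, 7.0], 1)]
mapped = list(map_dataset(pairs, map_example))
print(mapped)  # [([2.0], 0), ([6.0], 1)]
```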
Is there a way to make ops, and specifically tf.Variable objects, stateless? Since I'm using tf.layers I cannot simply create the weights as constants, and setting trainable=False won't create constants either; it merely keeps the variables out of the GraphKeys.TRAINABLE_VARIABLES collection.