
I am a beginner in TensorFlow and I have a question about saving and restoring checkpoints in convolutional neural networks. I am trying to create a CNN to classify faces. My question is:

When I add a new class to my dataset, is it possible to do partial training? That is, I want to train just the new class instead of retraining the whole network. Is it possible to restore the weights and biases from the previous training and train only the new class?

This is what I am using to save:

saver = tf.train.Saver(tf.all_variables())
save_path = saver.save(session, "/home/owner/tensorflownew_models.ckpt")

print("Model saved in file: %s" % save_path)
mido
2 Answers


Your question has multiple facets. Let's look at each in detail:

  • Is it possible to restore the weights and bias from previous training? Yes. You can create a tf.train.Saver in your new program and use it to load the values of variables from an old checkpoint. If you only want to load some of the variables from an old model, you can specify the subset of variables that you want to restore in the var_list argument to the tf.train.Saver constructor. If the variables in your new model have different names, you may need to specify a key-value mapping, as discussed in this answer.

  • Is it possible to add a class to the network? Yes, although it's a little tricky (and there might be other ways to do it). I assume you have a softmax classifier in your network, which includes a linear layer (matrix multiplication by a weight matrix of size m * C, followed by adding a vector of biases of length C). To add a class, you could create a matrix of size m * (C+1) and a vector of length C+1, then initialize the first C columns/elements of these from the existing weights using tf.Variable.scatter_assign(). This question deals with the same topic.

  • Is it possible to do partial training? Yes. I assume you mean "training only some of the layers, while holding other layers constant." You can do this as MMN suggests by passing an explicit list of variables to optimize when calling tf.train.Optimizer.minimize(). For example, if you were adding a class as above, you might retrain only the softmax weights and biases, and hold the convolution layers' filters constant. Have a look at the tutorial on transfer learning with the pre-trained Inception model for more ideas.
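The subset-restore idea in the first bullet can be sketched without TensorFlow itself. This is a minimal numpy/pickle analogue of passing var_list to the tf.train.Saver constructor; all variable names and shapes here are made up for illustration:

```python
import io
import pickle
import numpy as np

# Hypothetical "checkpoint": a dict of named arrays standing in for
# what tf.train.Saver writes to disk (names and shapes are invented).
old_vars = {
    "conv1/weights": np.random.randn(3, 3, 1, 8),
    "softmax/weights": np.random.randn(64, 5),
    "softmax/biases": np.zeros(5),
}
checkpoint = io.BytesIO()
pickle.dump(old_vars, checkpoint)

# Restore only a subset -- the analogue of the var_list argument:
# names not in the list keep their freshly initialized values.
restore_list = ["conv1/weights"]
new_vars = {
    "conv1/weights": np.zeros((3, 3, 1, 8)),    # freshly initialized
    "softmax/weights": np.random.randn(64, 6),  # new shape, so not restored
}
checkpoint.seek(0)
saved = pickle.load(checkpoint)
for name in restore_list:
    new_vars[name] = saved[name]
```

Only the variables named in restore_list take their values from the old checkpoint; everything else keeps its new initialization.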
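The class-adding step in the second bullet amounts to copying the trained softmax parameters into a larger matrix and vector. A sketch in plain numpy, with made-up sizes (m = 64 features, C = 5 old classes); in TensorFlow the copy would be done with an assign-style op on the new variables after restoring the old checkpoint:

```python
import numpy as np

m, C = 64, 5  # feature size and old number of classes (invented values)
old_W = np.random.randn(m, C)
old_b = np.random.randn(C)

# Enlarged softmax parameters: copy the first C columns/elements from
# the trained values, and give the new class fresh initial values.
new_W = np.empty((m, C + 1))
new_b = np.empty(C + 1)
new_W[:, :C] = old_W
new_b[:C] = old_b
new_W[:, C] = np.random.randn(m) * 0.01  # small random weights for the new class
new_b[C] = 0.0
```

The existing C classes keep their trained parameters, so only the new class's column starts from scratch.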
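The partial-training idea in the third bullet — optimizing only some variables while holding others fixed — can also be illustrated framework-agnostically. This numpy sketch mimics what Optimizer.minimize(var_list=...) does: one gradient step that touches only the listed variables (the parameter names and the placeholder gradients are invented):

```python
import numpy as np

# Toy parameters: conv filters (to hold fixed) and softmax weights (to train).
params = {
    "conv/filters": np.random.randn(3, 3),
    "softmax/weights": np.random.randn(4, 6),
}
var_list = ["softmax/weights"]  # analogue of var_list passed to minimize()

before = {k: v.copy() for k, v in params.items()}

# One mock SGD step: only variables in var_list are updated; all other
# variables are left untouched, mirroring minimize(loss, var_list=...).
learning_rate = 0.1
grads = {k: np.ones_like(v) for k, v in params.items()}  # placeholder gradients
for name in var_list:
    params[name] -= learning_rate * grads[name]
```

After the step, the convolution filters are unchanged and only the softmax weights have moved, which is exactly the behavior the answer describes.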

mrry

Sure. You can use the var_list parameter of tf.train.Optimizer.minimize() to control which weights you want to optimize. If you don't include the variables you restored (or have already trained), they shouldn't get changed.

MMN
  • @MMN can you specify more, please? Is there any tutorial or link about that? – mido Sep 12 '16 at 13:08
  • [link](https://www.tensorflow.org/versions/r0.10/api_docs/python/train.html) TF is a little tricky to get used to, but basically, training is just updating the weights in variables using a loss function and backprop. By default, train() will update all variables involved, but if you have pretrained or restored some weights, you can avoid retraining them by omitting them from the var_list parameter in your training step. – MMN Sep 12 '16 at 13:29