2

Using TensorFlow (tf.contrib.slim in particular), we need to tune a few parameters to produce the graphs we want in TensorBoard.

The summary-save interval is clearer to us: it saves the value (or an average of values?) of a particular node in the graph every time the interval elapses.

But why are checkpoints, which save the model itself, needed during the training process? Does the model change? I'm not sure how this works.

George Pligoropoulos
  • 2,919
  • 3
  • 33
  • 65

1 Answer

0

You save the model to checkpoints because the Variables in the model, including neural network weights and biases and the global_step counter, keep changing during the training process. The structure of the model doesn't change. The saved checkpoints allow you to load the trained model for serving and to resume training later.
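To make the distinction concrete, here is a minimal pure-Python sketch of the idea (this is not TensorFlow code; `train_step`, the dict-based state, and the step-based `save_interval` are hypothetical stand-ins for TensorFlow Variables, `tf.train.Saver.save`, and slim's time-based `save_interval_secs`):

```python
import copy

def train_step(state):
    """One hypothetical gradient step: nudge the weight, bump global_step."""
    state["weight"] -= 0.1 * state["weight"]  # stand-in for a weight update
    state["global_step"] += 1
    return state

def save_checkpoint(state, checkpoints):
    """Snapshot the current variable values (what a checkpoint captures)."""
    checkpoints.append(copy.deepcopy(state))

# The model "structure" (here, just the set of keys) is fixed;
# only the values keep changing as training runs.
state = {"weight": 1.0, "global_step": 0}
checkpoints = []
save_interval = 5  # save every 5 steps

for step in range(1, 16):
    state = train_step(state)
    if step % save_interval == 0:
        save_checkpoint(state, checkpoints)

# Later, resume training (or serve) from the latest snapshot
# instead of starting from scratch:
resumed = copy.deepcopy(checkpoints[-1])
```

Each checkpoint records a different `weight` and `global_step`, which is exactly why periodic saving during training is worthwhile: the structure never changes, but the values do.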

Shanqing Cai
  • 3,756
  • 3
  • 23
  • 36