
I want to log the learning rate of the Adam optimizer when training with a TensorFlow Estimator, like this:

def model_fn(features, labels, mode):
    ...
    optimizer = tf.train.AdamOptimizer(learning_rate=0.1)
    train_op = optimizer.minimize(loss, global_step=tf.train.get_global_step())
    log_hook = tf.train.LoggingTensorHook({"lr": optimizer._lr_t}, every_n_iter=10)
    return tf.estimator.EstimatorSpec(mode, loss=loss, train_op=train_op, training_hooks=[log_hook])
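
For reference, a sketch of an alternative (assuming TF 1.x; the summary line below is illustrative, not taken from the code above) would be to write the value to TensorBoard with a scalar summary inside model_fn instead of logging it to the console:

# Sketch, assuming TF 1.x inside model_fn: record the optimizer's learning-rate
# tensor as a scalar summary so it shows up in TensorBoard next to the loss.
tf.summary.scalar("learning_rate", optimizer._lr_t)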

As far as I know, tf.train.AdamOptimizer decays its learning rate on its own, but the logged value is always 0.1, like this:

INFO:tensorflow:lr = 0.1 (4.537 sec)
INFO:tensorflow:global_step/sec: 2.18827
INFO:tensorflow:loss = 8.285036e-07, step = 16180 (4.570 sec)
INFO:tensorflow:lr = 0.1 (4.570 sec)
INFO:tensorflow:global_step/sec: 2.21156
INFO:tensorflow:loss = 8.225431e-07, step = 16190 (4.521 sec)
INFO:tensorflow:lr = 0.1 (4.521 sec)

Am I logging the learning rate of AdamOptimizer the right way?

Update: I logged optimizer._lr instead, as suggested in this answer, but got this error:

ValueError: Passed 0.1 should have graph attribute that is equal to current graph <tensorflow.python.framework.ops.Graph object at 0x7f96a290a350>.
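
Presumably this happens because optimizer._lr is a plain Python float rather than a tensor in the graph, and tf.train.LoggingTensorHook only accepts tensors or tensor names. A minimal sketch of a workaround, assuming TF 1.x (the name lr_tensor is illustrative), would be to convert the value first:

# Sketch, assuming TF 1.x: LoggingTensorHook expects tensors, but optimizer._lr
# is a raw Python float, which triggers the "should have graph attribute" error.
# Converting it (or using optimizer._lr_t, which is already a tensor) avoids it.
lr_tensor = tf.convert_to_tensor(optimizer._lr, name="lr_tensor")
log_hook = tf.train.LoggingTensorHook({"lr": lr_tensor}, every_n_iter=10)
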
  • You might want to have a look to this answer: https://stackoverflow.com/a/46043209/4282745 – pfm Oct 16 '18 at 07:34
  • Possible duplicate of [Learning rate doesn't change for AdamOptimizer in TensorFlow](https://stackoverflow.com/questions/38882593/learning-rate-doesnt-change-for-adamoptimizer-in-tensorflow) – pfm Oct 16 '18 at 07:34

0 Answers