
Here is my example code:

import tensorflow as tf 

x = tf.Variable(5.0)
y = x**2-2.0*x+1.0
o = tf.train.AdamOptimizer(0.1)
t = o.minimize(y)
sess = tf.Session()
sess.run(tf.global_variables_initializer())
for i in range(50):
    t.run(session=sess)
    print(sess.run(o._lr_t))   # the optimizer's learning-rate tensor
print(sess.run(x))

But o._lr_t is always 0.1, which is not what I expected, since AdamOptimizer is supposed to adapt its learning rate. Can someone help me? Thanks.
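For context, what I expected to see changing is the bias-corrected step size from the AdamOptimizer docstring (written lr_t there, which is not the same thing as the _lr_t attribute): lr_t = learning_rate * sqrt(1 - beta2^t) / (1 - beta1^t). A minimal sketch of computing it by hand, assuming the default beta1 = 0.9 and beta2 = 0.999:

lr, beta1, beta2 = 0.1, 0.9, 0.999      # same base rate as above, default betas
for step in range(1, 51):
    lr_t = lr * (1.0 - beta2**step) ** 0.5 / (1.0 - beta1**step)
    print(step, lr_t)                   # varies with the step; the base rate 0.1 itself never changes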

  • Possible duplicate of [Getting the current learning rate from a tf.train.AdamOptimizer](https://stackoverflow.com/questions/36990476/getting-the-current-learning-rate-from-a-tf-train-adamoptimizer) – P-Gn Jul 18 '17 at 05:49

1 Answer


I added this line right after creating the AdamOptimizer (called opt in my code):

print("opt.get_name(): ",opt.get_name(),"opt._lr: ",opt._lr,"opt._lr_t: ",opt._lr_t)  #jll1

These are the attributes defined in tensorflow/python/training/adam.py,

and the same line again after calling compute_gradients and apply_gradients:

print("opt.get_name(): ",opt.get_name(),"opt._lr: ",opt._lr,"opt._lr_t: %f "% (sess.run(opt._lr_t)))  #jll1

The results are the same as yours. Is something being done wrong? Should the value vary?

('opt.get_name(): ', 'Adam', 'opt._lr: ', 0.0001, 'opt._lr_t: ', None) # 1st
('opt.get_name(): ', 'Adam', 'opt._lr: ', 0.0001, 'opt._lr_t: 0.000100 ') # 2nd

I also ran your code, changing the print line to:

#    print(sess.run(o._lr_t))
print("o.get_name(): ", o.get_name(), "o._lr: ", o._lr, "o._lr_t: %f " % (sess.run(o._lr_t)))

The final value printed for x is 1.22994.
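If I read tensorflow/python/training/adam.py correctly, o._lr_t is just the constant base rate converted to a tensor, so it should not vary; the per-step adaptation lives in the per-variable slots. A minimal sketch of inspecting them with your code, assuming the slot names "m" and "v" and that get_slot behaves as documented:

m = o.get_slot(x, "m")                  # first-moment estimate for x
v = o.get_slot(x, "v")                  # second-moment estimate for x
for i in range(50):
    t.run(session=sess)
    print(sess.run([o._lr_t, m, v]))    # _lr_t stays 0.1, while m and v change every step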

Joe Llerena