Here is my example code:
import tensorflow as tf
x = tf.Variable(5.0)
y = x**2-2.0*x+1.0
o = tf.train.AdamOptimizer(0.1)
t = o.minimize(y)
sess = tf.Session()
sess.run(tf.global_variables_initializer())
for i in range(50):
    t.run(session=sess)          # apply one Adam update step
    print(sess.run(o._lr_t))     # learning-rate tensor stored by the optimizer
print(sess.run(x))
But o._lr_t always prints 0.1, which is not what I expected, since AdamOptimizer is supposed to use an adaptive learning rate. Can someone help me? Thanks.
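To show what I mean by "adaptive", here is a minimal sketch of how I would try to inspect the step Adam actually applies, assuming TF 1.x, Adam's default beta1=0.9, beta2=0.999, epsilon=1e-8, and that get_slot exposes the "m" and "v" moment accumulators (the formula follows the Adam update rule, so it may not match TensorFlow's internals exactly):

# Sketch: read Adam's per-variable accumulators and compute the effective step.
# Assumes TF 1.x and Adam's default hyperparameters (an assumption, not from the code above).
import tensorflow as tf

x = tf.Variable(5.0)
y = x**2 - 2.0*x + 1.0
o = tf.train.AdamOptimizer(0.1)
t = o.minimize(y)

beta1, beta2, eps = 0.9, 0.999, 1e-8
sess = tf.Session()
sess.run(tf.global_variables_initializer())
for i in range(50):
    t.run(session=sess)
    # first and second moment estimates Adam keeps for x
    m, v = sess.run([o.get_slot(x, "m"), o.get_slot(x, "v")])
    # bias-corrected step size from the Adam paper
    lr_t = 0.1 * (1 - beta2**(i + 1))**0.5 / (1 - beta1**(i + 1))
    print(lr_t * m / (v**0.5 + eps))  # per-parameter effective step, which does change

Is printing something like this the right way to see the adaptation, or does the learning rate itself really never change?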