When I use this function to print the optimizer's iteration count as a metric during training, it gives me this output:
epochs 1/100
step adam iterations
0 1.000
1 1.500
2 2.000
3 2.500
....
99 50.500
epochs 2/100
step adam iterations
0 101.000
1 101.500
2 102.000
3 102.500
and so on.
def get_lr_metric(optimizer):
    def lr(y_true, y_pred):
        # report the optimizer's iteration counter as a "metric"
        return optimizer.iterations
    return lr

model.compile(..., metrics=[get_lr_metric(optimizer)])
I cannot understand how the Adam optimizer's iteration count can be a float that increases in steps of 0.5.
I expect the output to be like this:
epochs 1/100
step adam iterations
0 1
1 2
2 3
3 4
....
99 100
epochs 2/100
step adam iterations
0 101
1 102
2 103
3 104
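For reference, the fractional values are exactly what you would get if the displayed number were a running mean of the integer iteration count over the batches seen so far in the epoch (the Keras progress bar averages metric values within an epoch). A minimal sketch that reproduces the printed numbers, with `running_means` being a hypothetical helper rather than anything from Keras:

```python
def running_means(iterations):
    """Running mean of the values seen so far, like the Keras progress bar."""
    means, total = [], 0.0
    for n, it in enumerate(iterations, start=1):
        total += it
        means.append(total / n)
    return means

# Epoch 1: optimizer.iterations is 1, 2, ..., 100 after each batch.
epoch1 = running_means(range(1, 101))
print(epoch1[0], epoch1[1], epoch1[2], epoch1[99])  # 1.0 1.5 2.0 50.5

# Epoch 2: the counter continues at 101, 102, ...
epoch2 = running_means(range(101, 201))
print(epoch2[0], epoch2[1])  # 101.0 101.5
```

These match the observed output (1.000, 1.500, ..., 50.500, then 101.000, 101.500, ...), whereas the raw counter itself would print the expected 1, 2, 3, ...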