I’d like to print out the learning rate for each training step of my neural network.
I know that Adam has an adaptive learning rate, but is there a way I can see this (for visualization in TensorBoard)?
All the optimizers keep the learning rate in a private attribute; for Adam it is `_lr`.
So you just need to run
`sess.run(optimizer._lr)` to get the value. `sess.run` is needed because, if you passed a tensor as the learning rate, the attribute holds a tensor rather than a plain number.
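A minimal sketch of the lookup, assuming TF 1.x internals where the private attribute name differs per optimizer class (Adam uses `_lr`, while GradientDescent and Adagrad use `_learning_rate`; these are private names and may change between versions). The helper `find_lr` is a hypothetical name introduced here for illustration:

```python
def find_lr(optimizer):
    """Return the learning-rate attribute of a TF 1.x optimizer.

    Assumption based on TF 1.x internals: Adam stores it as `_lr`,
    GradientDescent and Adagrad as `_learning_rate`. The value may be
    a Python float or a tensor, depending on what was passed to the
    optimizer's constructor.
    """
    for name in ("_lr", "_learning_rate"):
        if hasattr(optimizer, name):
            return getattr(optimizer, name)
    raise AttributeError("no known learning-rate attribute found")


# Usage with TensorFlow 1.x (sketch, not run here):
#   optimizer = tf.train.AdamOptimizer(1e-3)
#   lr = find_lr(optimizer)
#   print(sess.run(lr))  # sess.run in case lr is a tensor
```

Note that `_lr` is only the base step size you passed in; Adam's effective per-parameter update also depends on its moment estimates, so this prints the configured rate, not a per-weight adaptive rate.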
Answered By – Salvador Dali