We have seen something similar to this, even in class
This is from DeepLearning-LecNotes2
Is this graph accurate?
I’m assuming we are talking about a constant learning rate, but in these graphs it looks like the learning rate is actually growing?
I’m more convinced by this one,
from here
But if the second graph is correct, then a high learning rate won’t cause the loss to diverge, right? It will just stabilize at a relatively higher error rate.
Or is there some math behind it, and the first graph is the real deal?
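There is some math behind it, and you can see both behaviors in a toy example (this is my own sketch on f(x) = x², not from the lecture notes). The update is x ← x − lr·2x = x·(1 − 2·lr), so a small lr converges, exactly the critical lr bounces forever at a constant (higher) error, and anything above it diverges. Real loss surfaces aren’t quadratic, but locally the same reasoning applies, which is why graph one’s “too high → diverge” picture is usually the real deal:

```python
# Toy example (my own, hypothetical): gradient descent on f(x) = x^2.
# The gradient is 2x, so the update is x <- x * (1 - 2*lr):
#   lr < 1.0  -> |1 - 2*lr| < 1, converges to the minimum at 0
#   lr = 1.0  -> factor is -1, oscillates at constant error forever
#   lr > 1.0  -> |1 - 2*lr| > 1, the iterates blow up (divergence)

def descend(lr, x0=1.0, steps=50):
    x = x0
    for _ in range(steps):
        x = x - lr * 2 * x  # one gradient step on f(x) = x^2
    return abs(x)

print(descend(0.1))  # small lr: shrinks toward 0
print(descend(1.0))  # critical lr: stuck bouncing at |x0|
print(descend(1.1))  # too-high lr: grows without bound
```

So both graphs capture part of the truth: a moderately-too-high learning rate can plateau at a higher error (the critical-lr case), but past a threshold the loss genuinely diverges.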