Thanks @jeremy for that feedback!
> The learning rate finder happens to be mentioned in the CLR paper, but has nothing to do with CLR otherwise
I mentioned the upper and lower learning-rate bounds as described in the paper; was that not its original contribution?
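For anyone following along: the LR range test from the CLR paper boils down to sweeping the learning rate exponentially between a lower and upper bound, training one mini-batch per rate, and watching where the loss starts to diverge. A minimal sketch of the sweep itself (the function name and defaults are my own illustration, not fastai's `lr_find` API):

```python
import math

def lr_range_schedule(lr_min=1e-7, lr_max=10.0, num_steps=100):
    """Exponentially spaced learning rates for the LR range test.

    In the actual test you would train one mini-batch at each rate,
    record the loss, and stop once the loss blows up.
    (Illustrative helper only, not fastai's implementation.)
    """
    ratio = (lr_max / lr_min) ** (1 / (num_steps - 1))
    return [lr_min * ratio ** i for i in range(num_steps)]

lrs = lr_range_schedule()  # 100 rates from 1e-7 up to 10.0
```

You would then pick a rate a little below the point of fastest loss descent, rather than the minimum of the loss curve itself.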
> SGDR is not an optimizer, it’s a different annealing schedule
Aah… correcting now.
> CLR is an annealing schedule that’s not used at all in the fastai library.
Yes, I mentioned that ‘The fastai library uses CLR to find an optimal LR and SGDR as the optimizer.’ Now that I see your post, you mentioned that fastai uses the idea from CLR; I will correct it. Would that be an appropriate statement?
> So I’d suggest covering SGDR, not CLR, since SGDR solves the same problem but better
Absolutely! I am going through that paper.
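For reference while reading it: SGDR's schedule is cosine annealing with warm restarts, i.e. the learning rate decays along a cosine from a maximum to a minimum over each cycle, then jumps back up. A minimal sketch, assuming a first cycle of `t0` steps whose length is multiplied by `t_mult` after each restart (names and defaults are illustrative, not the fastai API):

```python
import math

def sgdr_lr(step, eta_max=0.1, eta_min=0.0, t0=10, t_mult=2):
    """Learning rate at a given step under SGDR
    (cosine annealing with warm restarts)."""
    # Find which cycle this step falls in and the offset within it.
    t_i, t_cur = t0, step
    while t_cur >= t_i:
        t_cur -= t_i
        t_i *= t_mult
    # Cosine decay from eta_max down to eta_min across the cycle.
    return eta_min + 0.5 * (eta_max - eta_min) * (1 + math.cos(math.pi * t_cur / t_i))
```

So `sgdr_lr(0)` starts at the maximum rate, the rate decays toward the minimum over the first 10 steps, and at step 10 it "restarts" back at the maximum for a cycle twice as long.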