How do you find the learning rate using the learning rate finder?


I am having trouble understanding the explanation in the official fastai book (pp. 206-207) about how to find an appropriate learning rate using the learning rate finder.

When I run the learning rate finder using:
learn = cnn_learner(dls, resnet34, metrics=error_rate)
lr_min,lr_steep = learn.lr_find()
print(f"Minimum/10: {lr_min:.2e}, steepest point: {lr_steep:.2e}")

I get:
Minimum/10: 8.32e-03, steepest point: 6.31e-03

The author mentions that the best way to find the learning rate is:

Our advice is to pick either of these:

* One order of magnitude less than where the minimum loss was achieved (i.e., the minimum divided by 10)
* The last point where the loss was clearly decreasing

What I don’t understand is how the authors arrived at a learning rate of 3e-03 when Minimum/10 is 8.32e-03. Shouldn’t it be
learn.fine_tune(2, base_lr=8.32e-03)
as per the authors’ tips above on finding the learning rate?

I don’t have too much mathematical background so please forgive me if it’s supposed to be a no-brainer.

The learning rate finder is not an exact science; in fact, if you run it again, you will get different values.

What is important, however, is that it provides a good approximation of the optimal value. This is why the book gives “rules of thumb” rather than a precise way to get the perfect value.
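To see what those two rules of thumb actually compute, here is a minimal, self-contained sketch on a synthetic loss-vs-learning-rate curve. No fastai is needed; the curve shape and all numbers are invented purely for illustration and only mimic a typical lr_find plot (a plateau, a descent, then a blow-up):

```python
import numpy as np

# Synthetic loss-vs-lr sweep (made-up curve, not real fastai output)
lrs = np.logspace(-6, 0, 200)   # candidate learning rates, 1e-6 .. 1
x = np.log10(lrs)
losses = 2 / (1 + np.exp(5 * (x + 4))) + np.exp(3 * (x + 1.5))

# Rule 1: one order of magnitude below the minimum-loss point
min_idx = losses.argmin()
lr_min_over_10 = lrs[min_idx] / 10

# Rule 2: the point of steepest descent (most negative slope
# of the loss with respect to log10(lr))
slopes = np.gradient(losses, x)
lr_steep = lrs[slopes.argmin()]

print(f"Minimum/10: {lr_min_over_10:.2e}, steepest point: {lr_steep:.2e}")
```

On this toy curve both heuristics land in the same decade, which is the point of the advice: either suggestion gives a reasonable starting value, not a uniquely correct one.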

From my experience, as long as you stay in the same order of magnitude, you should be fine (e.g., you won’t find much difference between training your model with base_lr=8.32e-03 and base_lr=6.31e-03). The authors (and fastai) usually use the value 3e-03 because, in their experience, it is a good default.
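A quick way to check that all of these candidate values share the same order of magnitude (i.e., the same power-of-ten decade) is to look at the floor of their base-10 logarithm:

```python
import math

# All three learning rates discussed above fall in the 1e-3 decade,
# so in this sense they are "the same order of magnitude".
for lr in (8.32e-3, 6.31e-3, 3e-3):
    print(f"{lr:.2e} -> decade 10^{math.floor(math.log10(lr))}")
```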


I would like to know if there is any way to apply lr_find to NIfTI files?