# Chapter 10: Learning Rate Finder for Text Applications

I have a number of questions:

1. Why can’t you rerun the learning rate finder between each unfreeze?

2. Where do the learning-rate values in 10_nlp.ipynb come from, and how were they picked?

```python
learn.fit_one_cycle(1, 2e-2)

learn.freeze_to(-2)
learn.fit_one_cycle(1, slice(1e-2/(2.6**4), 1e-2))

learn.unfreeze()
learn.fit_one_cycle(2, slice(1e-3/(2.6**4), 1e-3))
```

3. How was the `(2.6**4)` in the slice decided?

Hi Gregory,

Not a really helpful answer, but Jeremy said in one of the videos (I can’t remember which course year) that he found 2.6 works well.

Regards Conwyn

Correct.

Generally we do not rerun the learning rate finder between each fit; you stick with the first finding and then adjust it by this 2.6 ** 4 factor.

You can rerun it now, and with the new LR suggesters it’s more reliable to do so; it just takes a bit longer. The 2e-2, 1e-2, 1e-3, etc. is just a rule of thumb he used that works.
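To make the rule of thumb concrete: `slice(lr/2.6**4, lr)` tells fastai to give the earliest layer group the low end, the last group the high end, and (as I understand fastai's behavior, which I'm sketching here rather than quoting its source) spread the intermediate groups geometrically in between. The helper below is a hypothetical illustration, not fastai's actual implementation:

```python
# Sketch of how a slice(lo, hi) of learning rates could be spread
# geometrically across parameter groups (fastai-style discriminative LRs).
def discriminative_lrs(lo, hi, n_groups):
    if n_groups == 1:
        return [hi]
    # common ratio so the first group gets lo and the last gets hi
    ratio = (hi / lo) ** (1 / (n_groups - 1))
    return [lo * ratio**i for i in range(n_groups)]

base_lr = 1e-2
lrs = discriminative_lrs(base_lr / 2.6**4, base_lr, 5)
# with 5 groups, each group's LR is 2.6x the previous one's,
# and the final group trains at exactly base_lr
```

With five layer groups the per-group ratio works out to exactly 2.6, which is why the `2.6**4` divisor pairs naturally with the number of groups in the ULMFiT-style text models.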

Thanks for the quick response. I was wondering if it was a rule of thumb or actually calculated.

I am running v2 and I am having trouble running it after the first fit. Are there examples anywhere?

When I used it on my example I found a learning rate once and then used it all the way through, so I think I used:

```python
learn.fit_one_cycle(1, 5e-3)
learn.freeze_to(-2)
learn.fit_one_cycle(1, slice(5e-3/(2.6**4), 5e-3))
learn.unfreeze()
learn.fit_one_cycle(2, slice(5e-3/(2.6**4), 5e-3))
```
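If you'd rather rerun the finder between each unfreeze instead of reusing one value, something like the sketch below should work with the fastai v2 API. It's hedged, not definitive: the suggester attribute name (`.valley` here) varies between fastai releases (older ones expose `lr_min`/`lr_steep` instead), and `dls` is assumed to be a `TextDataLoaders` you've already built for your task:

```python
# Hedged sketch (fastai v2): rerun lr_find after each unfreeze.
# Costs one extra forward/backward sweep per stage, but adapts the LR
# to the newly unfrozen layers. Assumes `dls` already exists.
from fastai.text.all import *

learn = text_classifier_learner(dls, AWD_LSTM, metrics=accuracy)

lr = learn.lr_find().valley                     # LR for the frozen head
learn.fit_one_cycle(1, lr)

learn.freeze_to(-2)
lr = learn.lr_find().valley                     # re-find after partial unfreeze
learn.fit_one_cycle(1, slice(lr/(2.6**4), lr))

learn.unfreeze()
lr = learn.lr_find().valley                     # re-find once fully unfrozen
learn.fit_one_cycle(2, slice(lr/(2.6**4), lr))
```

The trade-off is exactly what was said above: a few extra finder passes in exchange for not relying on the initial finding still being appropriate after more layers are trainable.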