# New LR Finder Output?!

I ran `lr_find` today, and the output looks different. Fancy! What’s the deal with this new functionality? I’m a bit out of the loop.


There is a nice write-up of this work by @muellerzr in this blog post.


Thanks @Danielvs!

Yes, this is the fruit of work I had been doing until about March or so.

It originated in this thread from a few years back, but I’ll give a TL;DR to catch folks up to speed.

Essentially, before this we had the two suggesters from the one-cycle paper: a global minimum and an estimated steepest slope (now called `minimum` and `steep`). We found that these tend not to give the best estimate of a good learning rate, so two other methods were created: the `valley` algorithm and the `slide` algorithm. They’re two other ways of getting a much better suggested learning rate to use OOTB, without needing any graphical interpretation.

Valley and Slide can be used independently (or together, as I’ll show in a moment!), but in general we found that Valley > Slide > Steep > Minimum, hence the new default is `valley`.
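To give a flavor of how `valley` works: it looks for the long stretch where the loss is still falling and suggests a rate partway down that slope, before the curve bottoms out and diverges. Here is a simplified, stdlib-only sketch of that intuition (not fastai’s actual implementation, which is more robust):

```python
def valley_sketch(lrs, losses):
    # Find the longest contiguous stretch of falling loss
    best_start, best_end = 0, 0
    start = 0
    for i in range(1, len(losses)):
        if losses[i] >= losses[i - 1]:
            start = i  # the decreasing run was broken here
        elif i - start > best_end - best_start:
            best_start, best_end = start, i
    # Suggest an LR about two-thirds of the way down the slope
    idx = best_start + 2 * (best_end - best_start) // 3
    return lrs[idx]

# Synthetic LR-finder curve: loss falls steadily, then diverges
lrs = [1e-5 * 10 ** (i / 10) for i in range(50)]
losses = [2.0 - 0.03 * i for i in range(35)] + [1.0 + 0.5 * i for i in range(15)]
suggested = valley_sketch(lrs, losses)
```

On this toy curve the sketch lands safely inside the descending region, well before the divergence at the end.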

Now let’s bring it back to the LR Finder. You can pass in any or all of those above methods to `lr_find`, and have all of them plotted on the same graph! So then you can interpret for yourself which LR you want to go with.

To do so, all you need to do is:

```python
learn = Learner(...)  # any cnn_, tabular_, etc. will work
lrs = learn.lr_find(suggest_funcs=(minimum, steep, valley, slide))
```

And this will then give you a pretty graph with those all plotted:

And the resulting return gives you a namespace object with all of those values stored. So, for example, you can do `lrs.valley` to get the valley result, `lrs.minimum` to get the minimum, or just do `lr_min, lr_steep, lr_valley, lr_slide = learn.lr_find(....)` to get them returned too!
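To make that return shape concrete, here is a stand-in built with a plain `namedtuple` (the real `SuggestedLRs` object behaves similarly, judging by the output shown later in this thread):

```python
from collections import namedtuple

# Illustrative stand-in for the returned namespace object
SuggestedLRs = namedtuple('SuggestedLRs', ['minimum', 'steep', 'valley', 'slide'])
lrs = SuggestedLRs(minimum=5.7e-3, steep=5.2e-3, valley=2.1e-3, slide=2.1e-3)

print(lrs.valley)                             # attribute access -> 0.0021
lr_min, lr_steep, lr_valley, lr_slide = lrs   # or tuple unpacking
```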

Along with this, it’s now super easy to write your own suggestion algorithm to experiment with. All you need to do is write your own function that accepts three params, as detailed in the documentation here: Hyperparam schedule | fastai
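As a sketch of what such a function might look like: per my reading of the linked docs, a suggester takes the recorded learning rates, the recorded losses, and `num_it`, and returns its suggestion plus an `(lr, loss)` point to mark on the plot (do verify the exact contract there). A hypothetical example, runnable here on toy data:

```python
# Hypothetical suggester: one-tenth of the LR at the lowest recorded loss.
# Signature and return shape are assumptions taken from the linked docs.
def tenth_of_min(lrs, losses, num_it):
    idx = min(range(len(losses)), key=losses.__getitem__)  # argmin
    return lrs[idx] / 10, (lrs[idx], losses[idx])

# Toy data standing in for the recorder's history
lrs = [1e-4, 1e-3, 1e-2, 1e-1]
losses = [0.9, 0.5, 0.3, 2.0]
suggestion, point = tenth_of_min(lrs, losses, num_it=4)
```

You would then pass it in like any built-in, e.g. `learn.lr_find(suggest_funcs=(tenth_of_min,))`.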

I hope that answers your questions! Very happy to finally see this in the library; I’ve been baking it since March.

Also, here is the table from our experiments.

A key:

• Leslie Smith = Steep
• ESRI = Valley
• Novetta = Slide


Wow, amazing! Thanks so much for all your hard work. So far, it does look like “Valley” is making good choices, and I’ve been able to just use them without giving it too much thought. Before, I would usually spend time squinting at the graph and stressing out about making the right choice. Definitely a big improvement!


This is very cool!

Usage question: if we do a `fit_one_cycle` after `lr_find`, do the valley LRs get passed on to the learner automatically, or do we have to do something like

`learner.fit_one_cycle(10, lr=lrs, wd=0.2)`

If I don’t pass an `lr` parameter to the function, which `lr` does it use? The default in the function, or the new one discovered by `lr_find`?

The default (look at `defaults.lr`). You still need to specify the LR yourself, as normal. It won’t know what to do if you pass in `lrs`.

Thank you, I won’t pass the `lrs` object then.

Great addition to the library. Thanks @muellerzr

I noticed that the 4 suggestion methods do not return the same type:

`SuggestedLRs(minimum=0.005754399299621582, steep=0.005248074419796467, slide=tensor(0.0021), valley=tensor(0.0021))`

• minimum and steep return a float
• slide and valley return a tensor

This means that the suggestions from minimum and steep can be passed directly into `fit_one_cycle`, while for slide and valley we need to extract the float value with `.item()`:

```python
lr_min, lr_steep, lr_slide, lr_valley = learn.lr_find(suggest_funcs=(minimum, steep, slide, valley))
learn.fit_one_cycle(n_epoch=3, lr_max=lr_valley)
# TypeError: len() of a 0-d tensor
```

To make it work, you need to pass `lr_valley.item()`:

```python
learn.fit_one_cycle(n_epoch=3, lr_max=lr_valley.item())
```

The problem does not occur with `learn.fine_tune`

Is this intended, or am I missing something?
Otherwise, the float could be extracted in the suggestion method itself.
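In the meantime, one defensive workaround is to coerce whatever comes back to a plain float; `float()` works on Python floats and on 0-d tensors alike. Sketched here with a stand-in class so it runs without torch:

```python
class FakeZeroDimTensor:
    # Stand-in for a 0-d torch tensor, for illustration only
    def __init__(self, value): self.value = value
    def item(self): return self.value
    def __float__(self): return self.value

def as_float(lr):
    # Works whether a suggester returned a float or a 0-d tensor
    return float(lr)

print(as_float(0.005754))                   # already a float -> 0.005754
print(as_float(FakeZeroDimTensor(0.0021)))  # tensor-like -> 0.0021
```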

It’s unintended; a PR to cast them all to floats would be welcome.

Will do

Amazing information, @muellerzr !!! Thanks for all your support in this community.

I have tried to utilize the suggest_funcs in lr_find(), but keep getting an error:
“Unexpected Keyword Argument suggest_funcs in lr_find()”.

Is this available for fastai v2?


Yup, and only fastai v2. You need the latest version, which is 2.4 IIRC.


Updated to 2.4 and using the suggest_funcs worked!!! Thanks again

It is showing the following error on Colab Pro.

```
NameError                                 Traceback (most recent call last)
<ipython-input-19-d81c6bd29d71> in <module>()
----> 1 learn.lr_find()

<ipython-input-11-856ffcf8a1e4> in lr_find(self, start_lr, end_lr, num_it, stop_div, show_plot, suggestions)
     36
     37     if suggestions:
---> 38         return SuggestedLRs(lr_min/10., lr_steep)

NameError: name 'SuggestedLRs' is not defined
```

I have imported all the modules and am not sure why it is showing this error.
I was running this notebook: Using the Learning Rate Finder (Beginner) | walkwithfastai

Got it, figured out how to solve it.

The PR is done and merged into master. Steep learning curve, as it was my first time, but I learned something and now I am ready for more.


Thanks @muellerzr

Hi!

This looks like great functionality, but I can’t get it to work. Using

`lr = learn.lr_find(suggest_funcs=(minimum))`

I get the error `NameError: name 'minimum' is not defined`. I assume I need to load the `minimum` function somehow, but I can’t figure out how from the documentation. How should I use this?

Regards,
Sten

Hi Sten,

To make a tuple with one item, you need to write
`(minimum,)`

Otherwise Python interprets the parentheses as operator groupings.
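A quick demonstration of the difference:

```python
a = ("x")    # parentheses are just grouping: still a str
b = ("x",)   # the trailing comma is what makes it a tuple
print(type(a).__name__, type(b).__name__)  # -> str tuple
```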

HTH,

What do your import statements look like?

Do you import “all”, as in for example, `from fastai.vision.all import *`? If not, then you need to import it: `from fastai.callback.schedule import minimum`.

Also, does it work if you don’t wrap it in a tuple (or must it be a tuple?), like `lr = learn.lr_find(suggest_funcs=minimum)`?
