I’ve been testing fastai against a research paper. It’s performing well, but as I was tweaking the learning rate, the LR finder curve started going backwards (curving back on itself). I’m not quite sure what to do.

I’m not sure, @charliec, but can you try passing start_lr and end_lr values to lr_find?
I’m not sure what the parameters are called in v1.
In v2 it is something like this: `learn.lr_find(start_lr=1e-3, end_lr=10)`
Just a guess, but I feel like the learning rate is wrapping around; I could be totally wrong though.
Is this shareable on Colab or somewhere?

Here you’re reading how the loss was affected by the learning rate during training. Swap the axes and the graph makes a bit more sense. To get what you’re actually after, call `learn.lr_find()` again.

Your learning rate should be strictly increasing while you’re searching for the optimal learning rate, but here it was increasing and then decreasing (hence the plot curving back on itself). Maybe you accidentally triggered a round of cyclical learning rates?
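To illustrate, here’s a rough pure-Python sketch of the exponential sweep an LR finder performs (parameter names borrowed from fastai’s `lr_find`; the exact defaults and internals may differ). The point is that the swept learning rates are strictly increasing, so if your plotted LRs go up and then back down, something else (e.g. a cyclical schedule) is driving the learning rate:

```python
def lr_sweep(start_lr=1e-7, end_lr=10.0, num_it=100):
    """Exponentially interpolate from start_lr to end_lr,
    mimicking the monotone sweep an LR finder performs.
    (Hypothetical sketch, not fastai's actual implementation.)"""
    ratio = end_lr / start_lr
    return [start_lr * ratio ** (i / (num_it - 1)) for i in range(num_it)]

lrs = lr_sweep()
# A healthy LR-finder sweep is strictly increasing end to end.
assert all(a < b for a, b in zip(lrs, lrs[1:]))
print(f"sweep goes from {lrs[0]:.1e} to {lrs[-1]:.1e}")
```

If you log the learning rate at each iteration during your own run and it fails that monotonicity check, that would confirm the wrap-around theory above.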