I’ve been testing fastai against a research paper. It’s performing well, but as I was tweaking the learning rate the curve started going backwards. I’m not quite sure what to do.
Here is where I started:
This is what I ended up with:
As you can see on line 20, the learning rate goes forwards and then back. I’ve not really run into that before. Any advice? Am I slicing this incorrectly?
Try plotting before unfreezing
I just ran it and it looks exactly the same.
I’m not sure, @charliec, but can you try passing start_lr and end_lr values to lr_find?
I’m not sure what the parameters are called in v1.
In v2 it is something like this:
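A minimal sketch of what I mean (the v2 call and its defaults here are from memory, so treat the exact signature as a guess): `learn.lr_find(start_lr=1e-7, end_lr=10)`. The finder sweeps the learning rate exponentially from start_lr to end_lr, so the recorded lrs should only ever climb. Plain Python, no fastai needed, just to show the shape of the sweep:

```python
# fastai v2 (assumed signature): learn.lr_find(start_lr=1e-7, end_lr=10)
# lr_find sweeps the learning rate exponentially from start_lr to
# end_lr over num_it batches; this is a plain-Python sketch of that
# schedule, not fastai's actual implementation.
def lr_sweep(start_lr=1e-7, end_lr=10, num_it=100):
    ratio = end_lr / start_lr
    return [start_lr * ratio ** (i / (num_it - 1)) for i in range(num_it)]

lrs = lr_sweep()
# the endpoints match start_lr/end_lr and the sweep never turns back
assert abs(lrs[-1] - 10) < 1e-6
assert all(b > a for a, b in zip(lrs, lrs[1:]))
```

If your plotted lr violates that (goes up, then down), the problem is in how the finder was driven, not in the model.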
Just a guess: I feel like it is wrapping around. I could be totally wrong, though.
Is this shareable on Colab or somewhere?
Here you’re reading how the loss was affected by the learning rate during training. Swap the axes and the graph makes a bit more sense. To get what you’re actually after, call learn.lr_find() again.
Your learning rate should be strictly increasing while you’re trying to find the optimal learning rate, but instead it was increasing and then decreasing (hence the curving back of the plot). Maybe you accidentally activated a round of cyclical learning rates?
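One quick way to check that from the numbers instead of the plot: take the recorded learning rates (something like learn.recorder.lrs, though treat that attribute name as a guess for your version) and find the first point where they stop climbing. A small helper in plain Python:

```python
def first_decrease(lrs):
    """Return the index where the lr schedule first turns back,
    or None if it is strictly increasing (the healthy case)."""
    for i in range(1, len(lrs)):
        if lrs[i] <= lrs[i - 1]:
            return i
    return None

# a healthy lr_find sweep keeps climbing:
assert first_decrease([1e-5, 1e-4, 1e-3, 1e-2]) is None
# a cyclical/wrapped schedule turns back partway through:
assert first_decrease([1e-5, 1e-4, 1e-3, 1e-4]) == 3
```

If it returns an index well before the end of the run, that index tells you which batch the schedule reversed on, which should help you track down where the cycle came from.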
I have a Colab here.
If you change anything, could you add some markdown info? I have a copy of this, so feel free to tweak it.
I’m guessing, but maybe try
lr = 0.01 #1e-2
lr = 0.01