I can’t find any explanation of that parameter — can anyone help?
Percentage of total number of epochs when learning rate rises during one cycle.
Sorry, I’m still confused — I thought one cycle in the new API only runs for one epoch. How does the percentage of the total number of epochs work? Can you give an example, say learn.fit_one_cycle(10, slice(1e-4,1e-3,1e-2), pct_start=0.05)?
OK, the strictly correct answer would be “percentage of iterations”, so the LR can both increase and decrease within the same epoch. In your example, say you have 100 iterations per epoch; then for half an epoch (0.05 * (10 * 100) = 50 iterations) the LR will rise, and then slowly decrease for the rest of training.
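The arithmetic above can be sketched in a few lines. This is just plain Python illustrating the split, with the 100 iterations per epoch being a hypothetical batch count as in the example, not anything fastai computes for you:

```python
# Sketch: how pct_start splits a one-cycle schedule into rising/falling phases.
epochs = 10
iters_per_epoch = 100          # hypothetical number of batches per epoch
total_iters = epochs * iters_per_epoch
pct_start = 0.05

rising_iters = int(pct_start * total_iters)   # LR increases during these
falling_iters = total_iters - rising_iters    # LR decreases during these

print(rising_iters, falling_iters)  # 50 rising, 950 falling
```

So with 100 iterations per epoch, the rising phase (50 iterations) is exactly half an epoch long.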
Thanks a lot~~~
Thanks for this explanation … so essentially, it is the percentage of overall iterations where the LR is increasing, correct?
So, given the default of 0.3, your LR goes up for the first 30% of your iterations and then decreases over the last 70%.
Is that a correct summary of what is happening?
Yes, I think that’s correct.
You can verify that by changing its value and checking the resulting LR plot:
For example, compare the LR curves you get with
pct_start = 0.2
versus
pct_start = 0.8
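If you want to see the effect without running a full training loop, here is a simplified, self-contained sketch of a one-cycle-style schedule (cosine warm-up then cosine decay). This is an illustration, not fastai’s exact implementation; the function name, the start_div divisor, and the iteration counts are all assumptions for the demo:

```python
import math

def one_cycle_lr(max_lr, total_iters, pct_start, start_div=25.0):
    """Simplified one-cycle-style LR schedule: cosine rise, then cosine decay.
    Not fastai's actual code -- just a sketch of the shape pct_start controls."""
    rise = int(total_iters * pct_start)
    lrs = []
    for i in range(total_iters):
        if i < rise:
            # warm up from max_lr/start_div to max_lr
            pct = i / max(rise, 1)
            lo = max_lr / start_div
            lr = lo + (max_lr - lo) * (1 - math.cos(math.pi * pct)) / 2
        else:
            # decay from max_lr toward ~0
            pct = (i - rise) / max(total_iters - rise, 1)
            lr = max_lr * (1 + math.cos(math.pi * pct)) / 2
        lrs.append(lr)
    return lrs

lrs_02 = one_cycle_lr(1e-2, 100, pct_start=0.2)
lrs_08 = one_cycle_lr(1e-2, 100, pct_start=0.8)
# The LR peak moves with pct_start: ~20% vs ~80% of the way through.
print(lrs_02.index(max(lrs_02)), lrs_08.index(max(lrs_08)))  # 20 80
```

Plotting lrs_02 and lrs_08 makes the difference obvious: with 0.2 the peak comes early and most of training is spent annealing down, while with 0.8 the LR is still climbing for most of the run.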