What does pct_start mean?


(Xin) #1

I can’t find any explanation of it anywhere.


(Ilia) #2

It’s the percentage of the total number of epochs during which the learning rate rises in one cycle.


(Xin) #3

Sorry, I’m still confused: doesn’t one cycle in the new API only run for one epoch? How does the percentage of the total number of epochs work then? Can you give an example, say learn.fit_one_cycle(10, slice(1e-4,1e-3,1e-2), pct_start=0.05)?


(Ilia) #4

OK, strictly speaking the correct answer is the percentage of iterations, so the LR can both increase and decrease within the same epoch. In your example, say you have 100 iterations per epoch; then for half an epoch (0.05 * (10 * 100) = 50 iterations) the LR will rise, and after that it will slowly decrease.
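
To make that arithmetic concrete, here is a tiny sketch of the same calculation (the 10 epochs and 100 iterations per epoch are just the made-up numbers from above; the real iteration count depends on your dataset and batch size):

# Hypothetical numbers from the example above: 10 epochs, 100 iterations per epoch.
epochs = 10
iters_per_epoch = 100
pct_start = 0.05

total_iters = epochs * iters_per_epoch       # 1000 iterations in the whole cycle
rising_iters = int(total_iters * pct_start)  # 50 iterations with an increasing LR
falling_iters = total_iters - rising_iters   # 950 iterations with a decreasing LR

print(rising_iters, falling_iters)  # -> 50 950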


(Xin) #5

Thanks a lot~~~


(WG) #6

Thanks for this explanation … so essentially, it is the percentage of overall iterations where the LR is increasing, correct?

So, with the default of 0.3, your LR goes up for the first 30% of your iterations and then decreases over the last 70%.

Is that a correct summation of what is happening?
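
For what it’s worth, here is a rough, hand-rolled sketch of what that 30/70 split looks like. It only approximates the one-cycle shape with two cosine segments and made-up LR values; it is not fastai’s actual scheduler code:

import numpy as np
import matplotlib.pyplot as plt

total_iters = 1000                   # made-up total number of iterations
pct_start = 0.3                      # the default
max_lr, start_lr = 1e-3, 1e-3 / 25   # hypothetical peak and starting LR

rise = int(total_iters * pct_start)  # first 30% of iterations: LR goes up
fall = total_iters - rise            # last 70%: LR comes back down

# Cosine interpolation from start_lr up to max_lr, then from max_lr down to ~0.
up = start_lr + (max_lr - start_lr) * (1 - np.cos(np.linspace(0, np.pi, rise))) / 2
down = max_lr * (1 + np.cos(np.linspace(0, np.pi, fall))) / 2

plt.plot(np.concatenate([up, down]))
plt.xlabel("iteration")
plt.ylabel("learning rate")
plt.show()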


(Haider Alwasiti) #7

Yes, I think that’s correct.

You can verify that by changing its value and checking the schedule with:
learn.recorder.plot_lr()

For example, with pct_start = 0.2 the plot shows the LR peaking after the first 20% of the iterations, and if you change it to pct_start = 0.8 the peak shifts to 80% of the way through.
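
Something along these lines should reproduce those two plots (a minimal fastai v1 sketch; the MNIST_SAMPLE data and resnet18 are just placeholders, since any learner will do when you only want to inspect the LR schedule):

from fastai.vision import *

# Any small dataset/model works here; we only care about the LR schedule.
path = untar_data(URLs.MNIST_SAMPLE)
data = ImageDataBunch.from_folder(path)
learn = cnn_learner(data, models.resnet18, metrics=accuracy)

# LR rises for the first 20% of all iterations, then anneals for the remaining 80%.
learn.fit_one_cycle(1, 1e-3, pct_start=0.2)
learn.recorder.plot_lr()

# Same run, but now the warm-up phase covers 80% of the iterations.
learn.fit_one_cycle(1, 1e-3, pct_start=0.8)
learn.recorder.plot_lr()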