Learning rates with cyc_len

Do the learning rates get reset to their original values after each cycle, or is the initial learning rate of the next cycle equal to the final learning rate from the previous cycle? In other words, are the following two snippets equivalent?

learn.fit_one_cycle(2, max_lr=slice(1e-2))

learn.fit_one_cycle(1, max_lr=slice(1e-2))
learn.fit_one_cycle(1, max_lr=slice(1e-2))

I believe these will be different: within a single `fit_one_cycle` call, the learning rate is scheduled continuously over all batches, so the first snippet stretches one cycle over twice as many batches, while the second restarts the schedule from scratch for the second call. You can see this with `learn.recorder.plot_lr()`.
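To make the difference concrete without running a full training loop, here is a minimal sketch of a one-cycle-style schedule (cosine warmup to `max_lr`, then cosine annealing down). The function name `one_cycle_lrs` and the default hyperparameters (`pct_start`, `div`, `final_div`) are illustrative assumptions, not fastai's exact internals:

```python
import math

def one_cycle_lrs(n_steps, max_lr=1e-2, pct_start=0.25, div=25.0, final_div=1e4):
    """Sketch of a one-cycle schedule: cosine warmup, then cosine annealing.

    Illustrative only; fastai's actual implementation differs in details.
    """
    def cos_anneal(start, end, pct):
        # Cosine interpolation from `start` (pct=0) to `end` (pct=1).
        return end + (start - end) / 2 * (math.cos(math.pi * pct) + 1)

    warm = int(n_steps * pct_start)
    lrs = []
    for i in range(n_steps):
        if i < warm:
            # Warmup: climb from max_lr/div up to max_lr.
            lrs.append(cos_anneal(max_lr / div, max_lr, i / max(warm, 1)))
        else:
            # Annealing: descend from max_lr toward max_lr/final_div.
            lrs.append(cos_anneal(max_lr, max_lr / final_div,
                                  (i - warm) / max(n_steps - warm, 1)))
    return lrs

# One 2-epoch cycle (say 200 batches) vs. two chained 1-epoch cycles:
one_long = one_cycle_lrs(200)
two_short = one_cycle_lrs(100) + one_cycle_lrs(100)
```

Plotting `one_long` gives a single warmup-and-anneal curve; `two_short` peaks twice because the second call restarts the schedule, which is what `learn.recorder.plot_lr()` shows after each snippet.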

Ok, thanks.