Learning Rate not showing restarts

I was trying to compare the decrease in the loss function with and without SGD with restarts.
I wanted to test these two code blocks:

With restarts:
learn.fit(lr, 3, cycle_len=1, cycle_mult=2)

Without restarts:
learn.fit(lr, 1, cycle_len=7, cycle_mult=1)

But the graph for the 1st code block isn’t showing SGD with restarts as shown in the lectures. Am I missing something?

Both approaches should give you restarts. You are plotting the loss; this is the command to plot the learning rate: learn.sched.plot_lr()

(I am guessing you are using restarts and they are just not apparent from the way the loss decreases; I would suggest plotting the lr to confirm. By the way, the “without restarts” method will not give you a single restart with those parameters.)
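To see why the first call restarts and the second doesn’t, it can help to compute the schedule by hand. Below is a minimal NumPy sketch of cosine annealing with warm restarts, the schedule fastai’s fit uses with cycle_len/cycle_mult; the function name and iters_per_epoch value are hypothetical, not fastai API:

```python
import numpy as np

def sgdr_schedule(lr_max, n_cycles, cycle_len, cycle_mult, iters_per_epoch=100):
    """Sketch of cosine-annealed LR with warm restarts.
    Each cycle anneals lr from lr_max down to ~0, then the next cycle
    resets lr to lr_max (the 'restart') and runs cycle_mult times longer."""
    lrs = []
    length = cycle_len
    for _ in range(n_cycles):
        n = length * iters_per_epoch
        t = np.arange(n) / n                               # 0 -> 1 within the cycle
        lrs.append(lr_max * (1 + np.cos(np.pi * t)) / 2)   # lr_max -> ~0
        length *= cycle_mult                               # next cycle is longer
    return np.concatenate(lrs)

# learn.fit(lr, 3, cycle_len=1, cycle_mult=2): cycles of 1, 2, 4 epochs -> 2 restarts
with_restarts = sgdr_schedule(0.01, n_cycles=3, cycle_len=1, cycle_mult=2)
# learn.fit(lr, 1, cycle_len=7, cycle_mult=1): one 7-epoch cycle -> no restart
without_restarts = sgdr_schedule(0.01, n_cycles=1, cycle_len=7, cycle_mult=1)
```

Both runs cover 7 epochs in total (1+2+4 vs. 7), but only the first schedule jumps back up to lr_max partway through, which is the sawtooth you see in plot_lr().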

Thanks for the help.
I hoped the loss would show a similar trend to the lr (as shown in the lectures), but it didn’t.

Both approaches should give you restarts.

learn.fit(lr, 1, cycle_len=7, cycle_mult=1)
This would also not have any restart, since it’s a single cycle of 7 epochs. Thanks for the “without restarts” method, though.