How to implement Stochastic Gradient Descent with Restarts in fastai v1?

How can I reproduce the following fastai v0.7 call in fastai 1.0.x? It implements stochastic gradient descent with restarts (SGDR):

lr = np.array([1e-4, 1e-3, 1e-2])  # differential learning rates per layer group
learn.fit(lr, n_cycle=3, cycle_len=1, cycle_mult=2)
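For context, that call runs SGDR-style cosine annealing with warm restarts: three cycles of 1, 2, and 4 epochs (each cycle cycle_mult=2 times longer than the last), with the learning rate decayed along a half-cosine inside each cycle and reset to its maximum at each restart. A minimal sketch of the schedule itself, in plain Python with no fastai dependency (the function name and steps_per_epoch value are just for illustration):

```python
import math

def sgdr_schedule(lr_max, n_cycles=3, cycle_len=1, cycle_mult=2, steps_per_epoch=100):
    """Return the per-step learning rates of an SGDR schedule:
    cosine-anneal from lr_max toward 0 within each cycle, then restart;
    each cycle is cycle_mult times longer than the previous one."""
    lrs = []
    for i in range(n_cycles):
        cycle_steps = cycle_len * (cycle_mult ** i) * steps_per_epoch
        for t in range(cycle_steps):
            lrs.append(0.5 * lr_max * (1 + math.cos(math.pi * t / cycle_steps)))
    return lrs

lrs = sgdr_schedule(1e-2)
# cycles of 1, 2 and 4 epochs -> 7 epochs total
```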

This doesn't work in v1, whose default training policy is the 1cycle schedule (learn.fit_one_cycle).

What about this module?

https://docs.fast.ai/callbacks.general_sched.html#TrainingPhase