Confusion about cycle_len

In many of the dl1 notebooks, I see the cycle_len parameter passed in learn.fit() calls.
For example:
learn.fit(lr, 3, cycle_len=1, cycle_mult=2, wds=wd)

while in fastai/basic_train.py, I don’t see it defined in
def fit(self, epochs: int, lr: Union[Floats, slice]=default_lr,
wd: Floats=None, callbacks: Collection[Callback]=None) -> None:

Even searching the whole of basic_train.py, I can’t find cycle_len anywhere.
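
A quick way to confirm which keyword arguments the fit() you are actually calling accepts is to inspect its signature from the notebook. This is just a sketch using the standard library, assuming you already have a Learner instance named learn:

import inspect

# Show the parameters accepted by the fit() bound to this learn object.
# For the fastai 1.0 Learner.fit quoted above this lists epochs, lr, wd
# and callbacks -- no cycle_len or cycle_mult.
print(inspect.signature(learn.fit))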

Very interesting! I hadn’t thought of digging into the source; I had only tried different combinations of the cycle, cycle_len and cycle_mult parameters to see how they relate. Now that you’ve inspired me, I did a grep and found this in fastai-master/old/fastai/learner.py:
def fit_gen(self, model, data, layer_opt, n_cycle, cycle_len=None, cycle_mult=1, cycle_save_name=None, best_save_name=None,
use_clr=None, use_clr_beta=None, metrics=None, callbacks=None, use_wd_sched=False, norm_wds=False,
wds_sched_mult=None, use_swa=False, swa_start=1, swa_eval_freq=5, **kwargs):

    """Method does some preparation before finally delegating to the 'fit' method for
    fitting the model. Namely, if cycle_len is defined, it adds a 'Cosine Annealing'
    scheduler for varying the learning rate across iterations.

    Method also computes the total number of epochs to fit based on provided 'cycle_len',
    'cycle_mult', and 'n_cycle' parameters.

    Args:
        model (Learner):  Any neural architecture for solving a supported problem.
            Eg. ResNet-34, RNN_Learner etc.

        data (ModelData): An instance of ModelData.

        layer_opt (LayerOptimizer): An instance of the LayerOptimizer class

        n_cycle (int): number of cycles

        cycle_len (int):  number of epochs before lr is reset to the initial value.
            E.g if cycle_len = 3, then the lr is varied between a maximum
            and minimum value over 3 epochs.

        cycle_mult (int): additional parameter for influencing how the lr resets over
            the cycles. For an intuitive explanation, please see
            https://github.com/fastai/fastai/blob/master/courses/dl1/lesson1.ipynb
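
Reading that docstring, the total number of epochs is just the sum of the cycle lengths, with each successive cycle stretched by a factor of cycle_mult. A small sketch of how I understand that computation (my own illustration, not code quoted from the library):

def total_epochs(n_cycle, cycle_len=1, cycle_mult=1):
    # Cycle i runs for cycle_len * cycle_mult**i epochs,
    # so the total is the sum over all n_cycle cycles.
    return sum(cycle_len * cycle_mult ** i for i in range(n_cycle))

# The call from the dl1 notebooks above:
# learn.fit(lr, 3, cycle_len=1, cycle_mult=2) -> cycles of 1, 2 and 4 epochs
print(total_epochs(3, cycle_len=1, cycle_mult=2))  # 7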

Since it sits under the old folder, that means this learner.py isn’t used anymore;
fit() now goes to fastai/basic_train.py.

cycle_len appears in

fastai-master/courses/dl[12]/*.ipynb

but it doesn’t appear in

course-v3-master/nbs/*.ipynb
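
For context, the course-v3 notebooks schedule the learning rate with the one-cycle policy instead of the old SGDR-style restarts. A minimal v1-style sketch of what replaces the old cycle_len call, assuming a fastai 1.0 Learner named learn and an illustrative max learning rate:

# fastai 1.0: the schedule lives in a callback, not in fit() arguments.
# fit_one_cycle(cyc_len, max_lr) runs one cycle of the 1cycle policy over
# cyc_len epochs, instead of the old cycle_len/cycle_mult restarts.
learn.fit_one_cycle(4, max_lr=1e-3)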

If learner.py isn’t used anymore, the easiest test would be to move it away and see if your notebook runs…

I believe old isn’t obsolete: fastai-master/old/fastai/*.py are still in use by Fast.ai (Clouderizer) and Fast.ai 0.7.x (Paperspace), as opposed to fastai-v3 (Clouderizer) and Fast.ai 1.0 / PyTorch 1.0 BETA (Paperspace).

Thank you very much, now I know: it is a symlink,
./courses/dl1/fastai -> ../../old/fastai
so old/fastai/learner.py is the one in use,
and the one not in use is basic_train.py.
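
If anyone else wants to double-check which copy of the library their notebook actually picked up, here is a small sketch (assuming it runs from a notebook inside courses/dl1, so the local fastai symlink shadows any installed package):

import os
import fastai

# __file__ points at the module that was actually imported; realpath
# follows the ./courses/dl1/fastai symlink back to the old/fastai sources.
print(fastai.__file__)
print(os.path.realpath(fastai.__file__))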