In class LR_Finder, what is init_lrs, is it different from layer_opt.lr?

class LR_Finder(LR_Updater):
        '''
        Helps you find an optimal learning rate for a model, as per the suggestion of the 2015 CLR paper.
        The learning rate is increased on a linear or log scale, depending on user input, and the result of the loss function is retained and can be plotted later.
        '''
        def __init__(self, layer_opt, nb, end_lr=10, linear=False, metrics = []):
            self.linear, self.stop_dv = linear, True
            ratio = end_lr/layer_opt.lr
            self.lr_mult = (ratio/nb) if linear else ratio**(1/nb)
            super().__init__(layer_opt,metrics=metrics)

        def on_train_begin(self):
            super().on_train_begin()
            self.best=1e9

        def calc_lr(self, init_lrs):
            mult = self.lr_mult*self.iteration if self.linear else self.lr_mult**self.iteration
            return init_lrs * mult

        def on_batch_end(self, metrics):
            loss = metrics[0] if isinstance(metrics,list) else metrics
            if self.stop_dv and (math.isnan(loss) or loss>self.best*4):
                return True
            if (loss<self.best and self.iteration>10): self.best=loss
            return super().on_batch_end(metrics)
  • init_lrs are the initial learning rates,
  • layer_opt.lr is the current learning rate of the model’s last layer,
  • layer_opt.lrs are the current learning rates of all the model’s layers.

Basically, you initialize the model with init_lrs, which change over time (that change is visible in the layer_opt.lr and layer_opt.lrs properties).
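A minimal standalone sketch of the relationship (this is not the fastai classes themselves, just the schedule math from calc_lr): init_lrs stays fixed for the whole sweep, while the value written into layer_opt at each batch is init_lrs scaled by lr_mult raised to the iteration count.

```python
import numpy as np

# Assumed example values, one learning rate per layer group.
init_lrs = np.array([1e-5, 1e-4, 1e-3])
end_lr, nb = 10, 100  # sweep up to end_lr over nb batches

# Log-scale schedule (the linear=False branch in LR_Finder.__init__):
# ratio is computed against the last layer's lr, as layer_opt.lr would be.
lr_mult = (end_lr / init_lrs[-1]) ** (1 / nb)

def calc_lr(init_lrs, iteration):
    # init_lrs never changes; only the multiplier grows per iteration.
    return init_lrs * lr_mult ** iteration

# At iteration 0 the current lrs equal init_lrs; mid-sweep they have grown.
print(calc_lr(init_lrs, 0))   # == init_lrs
print(calc_lr(init_lrs, 50))  # what layer_opt.lrs would hold at batch 50
```

After nb iterations the last layer's rate reaches end_lr, which is why the ratio and the per-step multiplier are derived from layer_opt.lr (the last layer) rather than from the whole lrs array.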
