lesson7-CAM: mismatch between # of lrs and # of layer groups

It seems that the library continues to change w/o updating the notebooks to reflect the changes.

courses/dl1/lesson7-CAM.ipynb: last section: Model

learn.unfreeze()
learn.bn_freeze(True)
lr=np.array([1e-6,1e-4,1e-2])
learn.fit(lr, 2, cycle_len=1)

/mnt/disc1/fast.ai/fastai/courses/dl1/fastai/layer_optimizer.py in opt_params(self)
     17 
     18     def opt_params(self):
---> 19         assert(len(self.layer_groups) == len(self.lrs))
     20         assert(len(self.layer_groups) == len(self.wds))
     21         params = list(zip(self.layer_groups,self.lrs,self.wds))

AssertionError: 

The assert carries no message, so I dug into it: lrs is of size 3, while layer_groups is of size 12.

core.py's listify is supposed to expand lrs to match the # of layer groups:

def listify(x, y):
    if not is_iter(x): x=[x]
    n = y if type(y)==int else len(y)
    if len(x)==1: x = x * n
    return x

but it only expands lrs to match the number of layer groups if lrs is of size 1. With 3 lrs against 12 layer groups it returns lrs unchanged, so how is this code supposed to work?
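A minimal reproduction of the listify behavior quoted above (is_iter is reimplemented here for self-containment; in fastai it lives in core.py):

```python
def is_iter(x):
    # simplified stand-in for fastai's core.is_iter
    return hasattr(x, '__iter__')

def listify(x, y):
    if not is_iter(x): x = [x]
    n = y if type(y) == int else len(y)
    if len(x) == 1: x = x * n
    return x

# a scalar lr is broadcast to all 12 layer groups:
print(listify(1e-2, 12))
# but a 3-element lrs list comes back unchanged, so the
# len(self.layer_groups) == len(self.lrs) assert fires later:
print(listify([1e-6, 1e-4, 1e-2], 12))
```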

The same pattern works in lesson2-image_models.ipynb, where the network has exactly 3 layer groups:

lrs = np.array([lr/9, lr/3, lr])
learn.unfreeze()
learn.fit(lrs, 3, cycle_len=1, cycle_mult=2)

So this looks like a bug in either the fastai library or the notebook. If the current behavior is by design, then the notebook needs to be changed to something like:

learn.unfreeze()
learn.bn_freeze(True)
lr=np.array([[1e-6]*4,[1e-4]*4,[1e-2]*4]).flatten()
learn.fit(lr, 2, cycle_len=1)

and then it works. But somehow I thought fastai was supposed to magically broadcast the smaller lr array onto the bigger set of layer groups.
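The `flatten()` workaround above can be generalized with `np.repeat` — a sketch, where `expand_lrs` is a hypothetical helper (not part of fastai), assuming the number of layer groups divides evenly by the number of lrs:

```python
import numpy as np

def expand_lrs(lrs, n_layer_groups):
    """Hypothetical helper: repeat each lr over an equal-sized chunk of
    layer groups, e.g. 3 lrs over 12 groups -> each lr repeated 4 times."""
    lrs = np.asarray(lrs)
    assert n_layer_groups % len(lrs) == 0, "layer groups must split evenly across lrs"
    return np.repeat(lrs, n_layer_groups // len(lrs))

# 3 lrs expanded to the 12 layer groups of the lesson7-CAM model
print(expand_lrs([1e-6, 1e-4, 1e-2], 12))
```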


Hi Sats:

I thought it was a CPU issue from not using the GPU. Now it works great. Thank you so much.

Xin


I am facing the exact same issue. Using 12 lrs as you described works, but it would be great to understand how this is supposed to be used correctly.