Arch vgg16 vs arch vgg16_bn

I noticed that when I try to use

arch=vgg16_bn

I get an error:

---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
<ipython-input-9-0e8771e4fad3> in <module>()
----> 1 learn = ConvLearner.pretrained(arch, data, precompute=True)

~/fastaip1v2/fastai/courses/dl1/fastai/conv_learner.py in pretrained(self, f, data, ps, xtra_fc, xtra_cut, **kwargs)
     89     @classmethod
     90     def pretrained(self, f, data, ps=None, xtra_fc=None, xtra_cut=0, **kwargs):
---> 91         models = ConvnetBuilder(f, data.c, data.is_multi, data.is_reg, ps=ps, xtra_fc=xtra_fc, xtra_cut=xtra_cut)
     92         return self(data, models, **kwargs)
     93 

~/fastaip1v2/fastai/courses/dl1/fastai/conv_learner.py in __init__(self, f, c, is_multi, is_reg, ps, xtra_fc, xtra_cut)
     31         self.xtra_fc = xtra_fc or [512]
     32 
---> 33         cut,self.lr_cut = model_meta[f]
     34         cut-=xtra_cut
     35         layers = cut_model(f(True), cut)

KeyError: <function vgg16_bn at 0x7fa52e8d3730>

But when I use arch = vgg16, it actually pulls in the vgg16_bn weights file. So my question is: is vgg16 actually vgg16_bn, and if so, what is vgg16_bn?

One theory I have is that vgg16 comes straight from PyTorch but actually uses VGG16 with batch normalization, and that vgg16_bn is perhaps not fully developed yet, something Jeremy is still working on?
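Digging into the fastai source seems to back the first half of that theory. In torch_imports.py, vgg16 looks like a thin wrapper that builds torchvision's vgg16_bn and keeps its body. A rough reconstruction (my sketch, not the verbatim source):

# Rough sketch of fastai's vgg16 in torch_imports.py (reconstruction,
# not verbatim): it builds torchvision's vgg16_bn under the hood, so
# asking for vgg16 already gives you the batchnorm variant.
from torchvision.models import vgg16_bn

def children(m): return list(m.children())

def vgg16(pre):
    # pre: whether to load the pretrained ImageNet weights
    return children(vgg16_bn(pre))[0]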

It looks like the issue is that vgg16_bn isn't in this list in conv_learner.py:

model_meta = {
    resnet18:[8,6], resnet34:[8,6], resnet50:[8,6], resnet101:[8,6], vgg16: [0,22],
    resnext50:[8,6], resnext101:[8,6], resnext101_64:[8,6],
    wrn:[8,6], inceptionresnet_2:[-2,9], inception_4:[-1,9],
    dn121:[0,6], dn161:[0,6], dn169:[0,6], dn201:[0,6],
}
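In the meantime, if you want to pass arch=vgg16_bn directly, a possible workaround (untested) is to register it yourself before building the learner. Note that the [0, 22] values below are just copied from the vgg16 entry; since the bn model interleaves extra BatchNorm layers, the lr_cut of 22 probably needs adjusting:

# Untested workaround sketch: add vgg16_bn to model_meta yourself.
# Assumes vgg16_bn is in scope via the star import; the [0, 22]
# values are copied from the vgg16 entry and may need tweaking,
# since vgg16_bn has extra BatchNorm layers in its features.
from fastai.conv_learner import *

model_meta[vgg16_bn] = [0, 22]
# data: the ImageClassifierData object from the notebook above
learn = ConvLearner.pretrained(vgg16_bn, data, precompute=True)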

I am still digging into what these values actually mean, but I'm guessing vgg16 is actually vgg16_bn, and that's why it isn't explicitly in the list.
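For anyone else digging: from conv_learner.py it looks like cut is the index at which the pretrained body is truncated before fastai attaches its custom head, and lr_cut is where the layers are split into groups for differential learning rates. cut_model itself is short; roughly (my paraphrase, may differ slightly from the source):

# Paraphrase of fastai's cut_model (may differ from the source):
# keep the first `cut` children of the model; a cut of 0 keeps the
# whole model as-is, which matches vgg16's [0, 22] entry above.
def cut_model(m, cut):
    return list(m.children())[:cut] if cut else [m]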

Yeah, vgg16_bn is much better, so I figured we'd just use that and ignore the non-bn version. It adds batchnorm to VGG.
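For anyone curious what that means concretely, you can see the difference in torchvision (assuming a standard torchvision install): plain vgg16 has no BatchNorm2d layers, while vgg16_bn inserts one after every conv:

# Quick check: compare the first few layers of each model.
from torchvision.models import vgg16, vgg16_bn

print(list(vgg16().features.children())[:4])
# [Conv2d, ReLU, Conv2d, ReLU]
print(list(vgg16_bn().features.children())[:4])
# [Conv2d, BatchNorm2d, ReLU, Conv2d]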


Thanks. I also might have found an issue with dn121; have you had any issues with that?

I submitted a pull request with a fix that I think will take care of it, but I don't really know if it's correct; I just know it worked.

Thanks - I must have moved the *2 at some point…