[Need Help] Some questions and errors I've run into

If resnet50's forward is defined like this:

def forward(self, x):
    x = self.conv1(x)
    x = self.bn1(x)
    x = self.relu(x)
    x = self.maxpool(x)

    x = self.layer1(x)
    x = self.layer2(x)
    x = self.layer3(x)
    x = self.layer4(x)

    x = self.avgpool(x)
    x = x.view(x.size(0), -1)
    x = self.fc(x)

    return x

I use:

learn = create_cnn(
    data,
    resnet50,
    ps=0.5,
    cut=-2,  # !!!
    path=path,
    metrics=[acc],
)

If senet154 is defined like this:

def features(self, x):
    x = self.layer0(x)
    x = self.layer1(x)
    x = self.layer2(x)
    x = self.layer3(x)
    x = self.layer4(x)
    return x

def logits(self, x):
    x = self.avg_pool(x)
    x = self.dropout(x)
    x = x.view(x.size(0), -1)
    x = self.last_linear(x)
    return x

def forward(self, x):
    x = self.features(x)
    x = self.logits(x)
    return x

I use:

learn = create_cnn(
    data,
    senet154,
    ps=0.5,
    cut=-3,  # !!! because of "x = self.dropout(x)"
    path=path,
    metrics=[acc],
)

Is my use of "cut" right? Do I need to find the self.avg_pool in every network and then set "cut" accordingly?


I use the senet154 from "fastai/old/fastai/models/senet.py" with fastai 1.0.22.

bs = 16
data = ImageDataBunch.create(
    train_ds, val_ds, test_ds=test_ds, path=path, bs=bs,
    tfms=(tfms, []), num_workers=8, size=512,
).normalize(kk)
learn = create_cnn(
    data,
    senet154,
    ps=0.5,
    cut=-2,
    path=path,
    metrics=[acc],
)
learn.model = nn.DataParallel(learn.model)
learn.callback_fns.append(partial(SaveModel, every='improvement', monitor='val_loss'))
learn.fit_one_cycle(5, lrs)
After the first epoch ends, it gives:
"Expected more than 1 value per channel when training"…

I found that if I set bs = bs - 1, the error goes away. But I still don't know how to use "cut"…

For your first error, make sure your dataloaders have drop_last=True: you had a last batch of size 1, which doesn't work with BatchNorm during training.
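A minimal sketch of why that stray batch of one breaks things: in training mode, BatchNorm computes statistics over the batch, so a single sample per channel isn't enough; in eval mode the running statistics are used instead (plain PyTorch, no fastai needed):

```python
import torch
import torch.nn as nn

bn = nn.BatchNorm1d(8)

bn.train()  # training mode: statistics come from the current batch
try:
    bn(torch.randn(1, 8))  # a batch containing a single sample
except ValueError as e:
    print(e)  # "Expected more than 1 value per channel when training, ..."

bn.eval()   # eval mode: running statistics are used, so one sample is fine
out = bn(torch.randn(1, 8))
print(out.shape)
```

This is why dropping the last undersized batch (drop_last=True) or changing bs so the last batch has more than one sample both make the error disappear.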
The cut should be made just before the average pool (that layer not included), so print the PyTorch summary of your model and count the layers up to that point.


Thanks!