Changing the image size makes the learner unrunnable

When I try to run the lesson 3 segmentation notebook on my own data (Salt Challenge), it works fine for:

```python
size = src_size//2  # do 1/2 image size
bs = 8
codes = array(['salt', 'sediment'], dtype='<U17')
```
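
For reference, these feed into a data block roughly like the lesson 3 one (a minimal sketch; `path_img` and `get_y_fn` are placeholders from my own setup, not the lesson's exact names):

```python
from fastai.vision import *

# path_img holds the images; get_y_fn maps an image file to its mask file
src = (SegmentationItemList.from_folder(path_img)
       .split_by_rand_pct(0.2)
       .label_from_func(get_y_fn, classes=codes))

data = (src.transform(get_transforms(), size=size, tfm_y=True)
        .databunch(bs=bs)
        .normalize(imagenet_stats))

learn = unet_learner(data, models.resnet34)
learn.lr_find()
```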

But when I change to `size = src_size//4` (or anything else), I get an error like this when I run `lr_find()`:

```
        return F.cross_entropy(input, target, weight=self.weight,
--> 867                                ignore_index=self.ignore_index, reduction=self.reduction)
    868 
    869 

~/anaconda3/envs/fastai/lib/python3.7/site-packages/torch/nn/functional.py in cross_entropy(input, target, weight, size_average, ignore_index, reduce, reduction)
   1665     if size_average is not None or reduce is not None:
   1666         reduction = _Reduction.legacy_get_string(size_average, reduce)
-> 1667     return nll_loss(log_softmax(input, 1), target, weight, None, ignore_index, None, reduction)
   1668 
   1669 

~/anaconda3/envs/fastai/lib/python3.7/site-packages/torch/nn/functional.py in nll_loss(input, target, weight, size_average, ignore_index, reduce, reduction)
   1531         if target.size()[1:] != input.size()[2:]:
   1532             raise ValueError('Expected target size {}, got {}'.format(
-> 1533                 out_size, target.size()))
   1534         input = input.contiguous().view(n, c, 1, -1)
   1535         target = target.contiguous().view(n, 1, -1)

ValueError: Expected target size (8, 10404), got torch.Size([8, 10201])
```

When I pass the original size I get:
    `ValueError: Expected target size (8, 10404), got torch.Size([8, 10201])`

and the same error with different numbers at the end for other sizes. I think some images are being left out or something, so the numbers of images and masks don't match. But it happens even at the original size somehow…
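
One thing I did notice: both numbers in the error are perfect squares, so the model output and the target masks seem to differ by one pixel per side, which makes me wonder if it's actually a padding/rounding issue rather than dropped images:

```python
# Both flattened sizes from the error are perfect squares:
# 10404 = 102 * 102 (model output), 10201 = 101 * 101 (target mask)
print(int(10404 ** 0.5), int(10201 ** 0.5))  # 102 101
```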
How to fix this?

Edit: I remember this worked just fine when I ran it a day after the lecture. It is definitely something in the upgrade causing this.


I just had the same issue today.

Same here.

I am also facing the same error.

How do you get `codes = array(['salt', 'sediment'], dtype='<U17')` to work? My masks have values of 0 or 255, so I have to use a 256-element array with salt at index 0 and sediment at index 255 just to get the learning rate finder to run, and then my training ends up with NaNs :frowning:

You have to pass in `div=True` when the masks are opened so the 255s get divided down to 1. There is another thread about this: https://forums.fast.ai/t/unet-binary-segmentation/29833/31
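
In fastai v1 the usual way (roughly what that thread does) is to subclass the label list so every mask is opened with `div=True`; the class names here are just what I picked, and `path_img`/`get_y_fn` are the same placeholders as in the data block above:

```python
from fastai.vision import *

class SegLabelListCustom(SegmentationLabelList):
    # open masks with div=True so pixel value 255 becomes class index 1
    def open(self, fn): return open_mask(fn, div=True)

class SegItemListCustom(SegmentationItemList):
    _label_cls = SegLabelListCustom

# then build the data block from the custom item list
src = (SegItemListCustom.from_folder(path_img)
       .split_by_rand_pct(0.2)
       .label_from_func(get_y_fn, classes=codes))
```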
