Lesson 3 In-Class Discussion ✅

Don't you define it as MSELossFlat(input, target)?

Sorry for the imprecision here. I mean the point of steepest decline.

I’m just following what was done in the lesson. There were no arguments given.

I am getting this error when I try to run “Go Big” in the camvid notebook, after training stage-1, restarting the kernel, and running the datasets part. I am able to load stage-1 for the stage-2 “Go Big”, and lr_find(learn) runs as well. But when I run learn.fit_one_cycle(10, slice(lr)) I get the error below. Has anyone got the same error, or can anyone help?

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-26-56c1c61458d6> in <module>()
----> 1 learn.fit_one_cycle(10, slice(lr))

~/anaconda3/envs/pytorch/lib/python3.7/site-packages/fastai/train.py in fit_one_cycle(learn, cyc_len, max_lr, moms, div_factor, pct_start, wd, callbacks, **kwargs)
    20     callbacks.append(OneCycleScheduler(learn, max_lr, moms=moms, div_factor=div_factor,
    21                                         pct_start=pct_start, **kwargs))
---> 22     learn.fit(cyc_len, max_lr, wd=wd, callbacks=callbacks)
    23 
    24 def lr_find(learn:Learner, start_lr:Floats=1e-7, end_lr:Floats=10, num_it:int=100, stop_div:bool=True, **kwargs:Any):

~/anaconda3/envs/pytorch/lib/python3.7/site-packages/fastai/basic_train.py in fit(self, epochs, lr, wd, callbacks)
    160         callbacks = [cb(self) for cb in self.callback_fns] + listify(callbacks)
    161         fit(epochs, self.model, self.loss_func, opt=self.opt, data=self.data, metrics=self.metrics,
--> 162             callbacks=self.callbacks+callbacks)
    163 
    164     def create_opt(self, lr:Floats, wd:Floats=0.)->None:

~/anaconda3/envs/pytorch/lib/python3.7/site-packages/fastai/basic_train.py in fit(epochs, model, loss_func, opt, data, callbacks, metrics)
    92     except Exception as e:
    93         exception = e
---> 94         raise e
    95     finally: cb_handler.on_train_end(exception)
    96 

~/anaconda3/envs/pytorch/lib/python3.7/site-packages/fastai/basic_train.py in fit(epochs, model, loss_func, opt, data, callbacks, metrics)
    87             if hasattr(data,'valid_dl') and data.valid_dl is not None:
    88                 val_loss = validate(model, data.valid_dl, loss_func=loss_func,
---> 89                                        cb_handler=cb_handler, pbar=pbar)
    90             else: val_loss=None
    91             if cb_handler.on_epoch_end(val_loss): break

~/anaconda3/envs/pytorch/lib/python3.7/site-packages/fastai/basic_train.py in validate(model, dl, loss_func, cb_handler, pbar, average, n_batch)
    52             if not is_listy(yb): yb = [yb]
    53             nums.append(yb[0].shape[0])
---> 54             if cb_handler and cb_handler.on_batch_end(val_losses[-1]): break
    55             if n_batch and (len(nums)>=n_batch): break
    56         nums = np.array(nums, dtype=np.float32)

~/anaconda3/envs/pytorch/lib/python3.7/site-packages/fastai/callback.py in on_batch_end(self, loss)
    236         "Handle end of processing one batch with `loss`."
    237         self.state_dict['last_loss'] = loss
--> 238         stop = np.any(self('batch_end', not self.state_dict['train']))
    239         if self.state_dict['train']:
    240             self.state_dict['iteration'] += 1

~/anaconda3/envs/pytorch/lib/python3.7/site-packages/fastai/callback.py in __call__(self, cb_name, call_mets, **kwargs)
    184     def __call__(self, cb_name, call_mets=True, **kwargs)->None:
    185         "Call through to all of the `CallbakHandler` functions."
--> 186         if call_mets: [getattr(met, f'on_{cb_name}')(**self.state_dict, **kwargs) for met in self.metrics]
    187         return [getattr(cb, f'on_{cb_name}')(**self.state_dict, **kwargs) for cb in self.callbacks]
    188 

~/anaconda3/envs/pytorch/lib/python3.7/site-packages/fastai/callback.py in <listcomp>(.0)
    184     def __call__(self, cb_name, call_mets=True, **kwargs)->None:
    185         "Call through to all of the `CallbakHandler` functions."
--> 186         if call_mets: [getattr(met, f'on_{cb_name}')(**self.state_dict, **kwargs) for met in self.metrics]
    187         return [getattr(cb, f'on_{cb_name}')(**self.state_dict, **kwargs) for cb in self.callbacks]
    188 

~/anaconda3/envs/pytorch/lib/python3.7/site-packages/fastai/callback.py in on_batch_end(self, last_output, last_target, train, **kwargs)
    269         if not is_listy(last_target): last_target=[last_target]
    270         self.count += last_target[0].size(0)
--> 271         self.val += last_target[0].size(0) * self.func(last_output, *last_target).detach().cpu()
    272 
    273     def on_epoch_end(self, **kwargs):

TypeError: 'module' object is not callable

I am also getting the same error.

Change learn to:
learn = Learner.create_unet(data, models.resnet34, metrics=metrics).to_fp16()

and try running it again. It will work.
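
For what it's worth, the TypeError: 'module' object is not callable in the trace above is raised at the point where the metric (self.func) is called, which suggests that after the kernel restart the name metrics ended up pointing at the fastai.metrics module rather than at a metric function. A minimal sketch of re-defining the metric before creating the learner, assuming the names from the lesson3-camvid notebook (codes, acc_camvid):

# Re-run this after restarting the kernel; otherwise `metrics` may still
# refer to the fastai.metrics module, which is not callable.
name2id = {v: k for k, v in enumerate(codes)}
void_code = name2id['Void']

def acc_camvid(input, target):
    # pixel accuracy that ignores the 'Void' class
    target = target.squeeze(1)
    mask = target != void_code
    return (input.argmax(dim=1)[mask] == target[mask]).float().mean()

metrics = acc_camvid
learn = Learner.create_unet(data, models.resnet34, metrics=metrics).to_fp16()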

But filters are a fixed number of pixels. So, for example, a filter detecting a black edge may just look like solid black when the image is 4x bigger. Granted, there is a range of filter sizes, and in photos things have to be learned at different zoom levels. But this won’t be the case for satellite images.

Only the image size changes, not the filter size. When the filter is convolved over the image, it detects a particular feature wherever that feature exists in the image.
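
To make this concrete, here is a small PyTorch sketch (not from the lesson) showing that a 3x3 filter has fixed weights and is simply slid over whatever image it is given; only the output size changes with the image size:

import torch
import torch.nn as nn

# a single 3x3 "vertical edge" filter, hard-coded for illustration
conv = nn.Conv2d(1, 1, kernel_size=3, padding=1, bias=False)
with torch.no_grad():
    conv.weight[:] = torch.tensor([[-1., 0., 1.],
                                   [-1., 0., 1.],
                                   [-1., 0., 1.]])

small = torch.randn(1, 1, 64, 64)    # a 64x64 image
large = torch.randn(1, 1, 256, 256)  # a 4x bigger image

# the same filter works on both; only the output resolution differs
print(conv(small).shape)  # torch.Size([1, 1, 64, 64])
print(conv(large).shape)  # torch.Size([1, 1, 256, 256])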

I still get the error after appending to_fp16(). Do you remember doing something other than this as well?

OK, what I did was close the notebook at that point, and after restarting it, it worked for me. But when I tried to view the results, it gave an error. Now GCP is not working and is showing that the zone's resources are not available.

Hello all,

I am trying to get a handle on the data_block API, and I don't know what I am doing wrong. I am working with an established dataset, the Kaggle whale-categorization-playground:
the train and test folders contain images, and
train.csv contains the Image/Id pairs belonging to the train folder.

I reached

data = (ImageFileList.from_folder(path)                      # works
        .label_from_csv(path/'train.csv', folder='train')    # works
        .random_split_by_pct(0.2)                            # works
        .datasets()              # errors: KeyError on the first Id in train.csv
        .transform(tfms, size=128)
        .databunch()
        .normalize(imagenet_stats))

Can any of you gurus help?

Here is the full trace:

KeyError                                  Traceback (most recent call last)
<ipython-input-...> in <module>()
----> 1 d=c.datasets()

~/Documents/fastai/courses/v3/nbs/dl1/fastai/data_block.py in datasets(self, dataset_cls, **kwargs)
    234         train = dataset_cls(*self.train.items.T, **kwargs)
    235         dss = [train]
--> 236         dss += [train.new(*o.items.T, **kwargs) for o in self.lists[1:]]
    237         cls = getattr(train, '__splits_class__', self._pipe)
    238         return cls(self.path, *dss)

~/Documents/fastai/courses/v3/nbs/dl1/fastai/data_block.py in <listcomp>(.0)
    234         train = dataset_cls(*self.train.items.T, **kwargs)
    235         dss = [train]
--> 236         dss += [train.new(*o.items.T, **kwargs) for o in self.lists[1:]]
    237         cls = getattr(train, '__splits_class__', self._pipe)
    238         return cls(self.path, *dss)

~/Documents/fastai/courses/v3/nbs/dl1/fastai/vision/data.py in new(self, classes, *args, **kwargs)
     80     def new(self, *args, classes:Optional[Collection[Any]]=None, **kwargs):
     81         if classes is None: classes = self.classes
---> 82         return self.__class__(*args, classes=classes, **kwargs)
     83 
     84 class ImageClassificationDataset(ImageClassificationBase):

~/Documents/fastai/courses/v3/nbs/dl1/fastai/vision/data.py in __init__(self, x, y, classes, **kwargs)
     75 class ImageClassificationBase(ImageDatasetBase):
     76     def __init__(self, x:Collection, y:Collection, classes:Collection=None, **kwargs):
---> 77         super().__init__(x=x, y=y, classes=classes, **kwargs)
     78         self.learner_type = ClassificationLearner
     79 

~/Documents/fastai/courses/v3/nbs/dl1/fastai/vision/data.py in __init__(self, **kwargs)
     67 class ImageDatasetBase(DatasetBase):
     68     def __init__(self, **kwargs):
---> 69         super().__init__(**kwargs)
     70         self.image_opener = open_image
     71         self.learner_type = ImageLearner

~/Documents/fastai/courses/v3/nbs/dl1/fastai/basic_data.py in __init__(self, x, y, classes, c, task_type, class2idx, as_array, do_encode_y)
     23         else: self.c = len(self.classes)
     24         if class2idx is None: self.class2idx = {v:k for k,v in enumerate(self.classes)}
---> 25         if y is not None and do_encode_y: self.encode_y()
     26         if self.task_type==TaskType.Regression: self.loss_func = MSELossFlat()
     27         elif self.task_type==TaskType.Single: self.loss_func = F.cross_entropy

~/Documents/fastai/courses/v3/nbs/dl1/fastai/basic_data.py in encode_y(self)
     30     def encode_y(self):
     31         if self.task_type==TaskType.Single:
---> 32             self.y = np.array([self.class2idx[o] for o in self.y], dtype=np.int64)
     33         elif self.task_type==TaskType.Multi:
     34             self.y = [np.array([self.class2idx[o] for o in l], dtype=np.int64) for l in self.y]

~/Documents/fastai/courses/v3/nbs/dl1/fastai/basic_data.py in <listcomp>(.0)
     30     def encode_y(self):
     31         if self.task_type==TaskType.Single:
---> 32             self.y = np.array([self.class2idx[o] for o in self.y], dtype=np.int64)
     33         elif self.task_type==TaskType.Multi:
     34             self.y = [np.array([self.class2idx[o] for o in l], dtype=np.int64) for l in self.y]

KeyError: 'w_e15442c'

I think you need to pass something in the datasets() call. See the dataset options here: https://docs.fast.ai/vision.data.html
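
For example (just a sketch, based on the signature shown in the trace above, where datasets(self, dataset_cls, **kwargs) takes the dataset class as its first argument), you could try passing ImageClassificationDataset explicitly:

data = (ImageFileList.from_folder(path)
        .label_from_csv(path/'train.csv', folder='train')
        .random_split_by_pct(0.2)
        .datasets(ImageClassificationDataset)  # pass the dataset class explicitly
        .transform(tfms, size=128)
        .databunch()
        .normalize(imagenet_stats))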

@alvisanovari, Bilal, I reached out only after having tried a bunch of things.

How do I install 7zip in Google Colab?
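
In case it helps: this is not fastai-specific; on Colab's Ubuntu image 7zip can usually be installed from a notebook cell with apt and then used via the 7z command (archive.7z below is just a placeholder filename):

# install the 7zip command-line tools, then extract an archive
!apt-get install -y p7zip-full
!7z x archive.7z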

I have written a brief blog post based on my notes for lesson 3. I felt this lesson was comparatively denser than the previous two lessons, and more material was covered, so there could be unintended mistakes. Do have a look, and let me know if there are any errors or missing points.

Did you try an argument inside datasets(), such as datasets(ImageDatasetBase)?

@alvisanovari, I wish things were so simple.

Haha, it never is. I can only imagine it's something to do with your data, because this works fine for me:

data = (ImageFileList.from_folder(path_img)
        .label_from_func(get_float_labels)
        .random_split_by_pct(valid_pct=0.2)
        .datasets()
        .transform(get_transforms(), size=224)
        .databunch(bs=bs).normalize(imagenet_stats)
       )

While running the imdb notebook, I am getting this error.

`NameError                                 Traceback (most recent call last)
<ipython-input-...> in <module>()
----> 1 path = untar_data(URLs.IMDB_SAMPLE)
      2 path.ls()

NameError: name 'untar_data' is not defined
`

@sgugger @karan
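
untar_data comes from the fastai imports at the top of the notebook, so this NameError usually just means the import cell was not run (for example after a kernel restart). A minimal sketch, assuming the lesson3-imdb setup:

# run the notebook's import cell first; `untar_data` is exported by fastai
from fastai.text import *

path = untar_data(URLs.IMDB_SAMPLE)
path.ls()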

Look at the dataset here:
! kaggle competitions download -c whale-categorization-playground -p {path}