Lesson 1 In-Class Discussion ✅

I was just wondering the same thing. That would be very useful for building out the model without having to create a whole new folder structure for it.

I also encountered the same issue, FWIW.

I tried to run a duplicate copy of the lesson 1 Jupyter notebook and initially it ran with no problem. But when I tried to rerun it, path.ls() gave me only ['images'] as output. I restarted the server and tried the original notebook as well, but I'm getting the same output. Since I downloaded the dataset twice, maybe that is what's causing this. But I don't know how to access '/storage/oxford-iiit-pet' on the cloud instance and delete the duplicate downloads. Any idea on this?

I seem to be getting this error when using TTA: ImageClassificationDataset has no attribute 'tfms'.

I am trying to get the total number of training images, as well as the number of images in each class.
How do I do that?

Maybe you didn’t include any transforms in your DataBunch?
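For reference, here's a minimal sketch of how the transforms are normally passed when building the DataBunch, assuming fastai v1 and the lesson 1 pets setup (the path, regex and size are just the lesson defaults):

from fastai.vision import *

# ds_tfms is what TTA relies on; if it is left out, the dataset has no
# transforms attached and TTA-related calls can fail.
path = untar_data(URLs.PETS)
fnames = get_image_files(path/'images')
pat = r'/([^/]+)_\d+.jpg$'
data = ImageDataBunch.from_name_re(path/'images', fnames, pat,
                                   ds_tfms=get_transforms(), size=224)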

Can you provide a link? Thanks!

You are right, I made a typo. It didn’t raise any error.

So if I’m understanding the one-cycle policy correctly, when using discriminative learning rates in conjunction with it, what we are doing is specifying the max LR for each layer group. Is that correct?
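For concreteness, I think that would look roughly like this in fastai v1 (a sketch; learn is assumed to be the unfrozen Learner from the lesson 1 notebook, and the values are placeholders):

learn.unfreeze()

# max_lr=slice(low, high) sets a different maximum learning rate per layer
# group: the earliest group gets low, the head gets high, and the groups in
# between are spread out. One-cycle then anneals each group's LR up to its
# own max and back down again.
learn.fit_one_cycle(4, max_lr=slice(1e-5, 1e-3))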

Where are the models from learn.save() and the dataset from untar_data saved by default? I tried searching with ls -h, but they aren't visible.

@prajjwal you can check the path defined in the config file of the .fastai folder. The .fastai folder will be in your home directory.
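For example (the file below is fastai v1's default location, so adjust if you've changed it):

from pathlib import Path

# fastai keeps its settings in ~/.fastai/config.yml; entries such as
# data_path show where untar_data downloads to.
config_file = Path.home() / '.fastai' / 'config.yml'
print(config_file.read_text())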

Hey, did you get it working? Did you try reinstalling the library?

They get saved in your local dir, within the {path}/models folder, with the extension '.pth'. These are kind of hidden. Do
ls *.pth
inside that models folder.
Models are saved using PyTorch's default serialization method.
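To check from Python instead of the shell, something like this works (a sketch; path is assumed to be the dataset path returned by untar_data, as in the lesson 1 notebook):

from pathlib import Path

# learn.save('stage-1') writes {path}/models/stage-1.pth via torch.save;
# listing the folder shows every checkpoint saved so far.
models_dir = Path(path)/'models'
print(sorted(models_dir.glob('*.pth')))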

The data is in the ~/.fastai/data folder.


@lesscomfortable, can you help with this?


I literally learnt Python today, so please suggest a better way. This is how I did it:

from collections import Counter

# Collect the label of every item in the training set, then count how many
# times each label appears.
num_train = len(data.train_ds)
train_classes = []
for i in range(num_train):
    train_classes.append(data.train_ds.ds[i][1])  # [1] is the label

Counter(train_classes)

Well done @kofi! A somewhat easier way to get the number of classes:

num_classes = len(data.classes) (courtesy of @marcmuc)

I got the training set length in the same way:

num_train=len(data.train_ds)
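For the per-class counts, a slightly more compact version of the loop above might be (a sketch; it assumes that indexing train_ds yields (image, label) pairs as in fastai v1, so it does open each image):

from collections import Counter

# Count how many training images fall into each class; str(y) turns the
# Category label into its readable class name.
label_counts = Counter(str(y) for _, y in data.train_ds)
print(label_counts)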

Thanks @lesscomfortable, I appreciate it.


Drop into a terminal (New > Terminal), or just use ! ls ... to run bash commands from the notebook.
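For example, to have a look at the downloaded datasets straight from a notebook cell (the paths assume fastai's default data location mentioned above):

# A leading "!" hands the rest of the line to the shell, so you can inspect
# (or clean up) the downloads without opening a terminal.
!ls ~/.fastai/data
!du -sh ~/.fastai/data/oxford-iiit-pet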

That is strange indeed. AFAIK, the model does not grow during training, so there is no reason for it to be a memory issue. Also, you report that it stalls but there are no errors. I haven't come across this issue before.

I suggest you either create a separate forum post or a repo issue to ask about this; in this thread it will get lost in the noise. If you can share your notebook and dataset, it will help others reproduce the problem.