Platform: Kaggle Kernels

Hey all!

I'm getting errors with the lesson 3 planets kernel. Could someone help me with the updated commands? I think ImageItemList got removed.

Hi. While doing the second lesson I wanted to try running the bear recognition model on my own machine. As they say in the video, it's better to train the neural network on a GPU, but once the model is ready I can use a CPU for image recognition. I saved the model to a file in a Kaggle kernel, but I can't find a way to download it. Is there a good way to get files out of a kernel, or would it be easier to do this in a different environment?

I am sorry I did not get back to you. I did not see this response until now. Did you get this to work?

If you export the file to the main directory, it should show up in the Output tab of the kernel, where you can download it.

The only directory I see in that tab (Workspace section) is the input directory located at /kaggle/input. That directory is read-only, so I can't move a file there. I tried placing the file in /, /kaggle, etc., but can't make it appear in the Output tab.

try placing the file in ../

hi, you might wanna read this thread: ImageItemList not defined

Hey @yappo, first of all welcome to the community. I usually save my models using learn.save("/kaggle/working/NameOfModel")
If you want to download them without committing, you can refer to this
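A minimal sketch of that flow (the learn.save call is the assumed fastai v1 API, which appends a .pth extension; the FileLinks trick from later in this thread renders clickable download links without committing):

```python
from pathlib import Path

# On Kaggle, save into the writable output directory (assumed fastai v1
# call; learn.save appends the .pth extension):
#   learn.save('/kaggle/working/NameOfModel')
# Then, in a notebook cell, render download links without committing:
#   from IPython.display import FileLinks
#   FileLinks('/kaggle/working/')

# Simulated locally: a stand-in for /kaggle/working shows what
# FileLinks would list.
work = Path('working_demo')
work.mkdir(exist_ok=True)
(work / 'NameOfModel.pth').touch()             # placeholder for saved weights
print(sorted(p.name for p in work.iterdir()))  # ['NameOfModel.pth']
```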


Hi Dipam

I used this code:

learn.save("/kaggle/working/devanagari")
OR
learn.save("/kaggle/working/devanagari.pkl")

And when I use the code below to list and download the files, it throws a 404 error:

from IPython.display import FileLinks
FileLinks('/kaggle/working/')  # the argument is the folder to list

Can you please help?

Hey, sorry for the late response. Did you get your answer? Can you try FileLinks('.'), i.e. the current working directory? I think that works. Otherwise I'll check it once I get time. The path argument to learn.save() is for when you want to download models after committing; they appear in the Output tab.

Hi again, thanks everyone for the quick answers. I'm still trying to figure out how to load a pretrained model. What I did: I trained a model on Kaggle, saved it, and downloaded the .pth file. Then I ran fastai on my commodity machine and wanted to load the model. But I found out that to load a saved model I have to create a Learner object first, and to do that I have to specify a DataBunch. There is no problem doing this, but the question is: why do I need the data I trained the model on to build a learner from a saved model? Does the model use this data in some way to evaluate samples? Does the data for loading have to be exactly the same as for training?

Something is off with lesson fast-ai-v3-lesson-3-planet.
There is a snippet

np.random.seed(42)
src = (ImageItemList.from_csv(path, 'train_v2.csv', folder='train-jpg', suffix='.jpg')
       .random_split_by_pct(0.2)
       .label_from_df(sep=' '))

I found out ImageItemList was changed to ImageList, so it became

np.random.seed(42)
src = (ImageList.from_csv(path=path, csv_name='/kaggle/input/train_v2.csv', folder='train-jpg', suffix='.jpg')
       .random_split_by_pct(0.2))

But I encountered a problem on the next line. When I try to run

data = (src.transform(tfms, size=128)
        .databunch(num_workers=0).normalize(imagenet_stats))

I get an error

/opt/conda/lib/python3.6/site-packages/fastai/data_block.py in transform(self, tfms, **kwargs)
    491         if not tfms: tfms=(None,None)
    492         assert is_listy(tfms) and len(tfms) == 2, "Please pass a list of two lists of transforms (train and valid)."
--> 493         self.train.transform(tfms[0], **kwargs)
    494         self.valid.transform(tfms[1], **kwargs)
    495         if self.test: self.test.transform(tfms[1], **kwargs)

AttributeError: 'ImageList' object has no attribute 'transform'

So the internal attributes of my src object are not what's expected. I don't have a clue how it should be done then. Has anybody figured it out?

It's not the first time I've had problems with outdated notebooks. I wonder what the situation is on other platforms. AFAIK Gradient is an official platform for fast.ai. Has anyone tried it, and what can you say about it? Is it worth switching to Gradient?


Hey, I'm not sure why you are getting this error, but it might be because you are not following all the steps of creating a DataBunch, in the order they have to be in. More specifically, after your random split method, you should be telling it how to label your data.
The order goes as follows:

  1. Where to find the data?
  2. How to split it?
  3. How to label it?
  4. Optionally add a test folder
  5. Data augmentation and create a data bunch.

Let me know if you are able to solve the problem with this or in any other way.
Cheers


I have a problem trying to save a file. When I run:

data.save("/kaggle/working/jigsaw.pkl")

The Kernel dies every time.

Try doing it without the extension, just data.save("/kaggle/working/jigsaw").

Then I get this error:

FileNotFoundError: [Errno 2] No such file or directory: '../input/../kaggle/working/jigsaw'

Another question: is it possible to do transfer learning with Kaggle and fastai? When I try to run the learner:

learn = text_classifier_learner(data_cl, arch=AWD_LSTM)

I get an error:

OSError: [Errno 30] Read-only file system: '../input/models'

I'm connected to the internet, but I'm assuming it won't download the pre-trained model since it can't save to the input folder.

You need to pass another parameter to your learner: after arch=AWD_LSTM, just add model_dir="/tmp/model/" and things will work fine.

Can you share the full code for this? You can also check my Kaggle. I've been running all my code on Kaggle without any errors. Some kernels won't be up to date though.

Sorry folks, my thread was set to "Tracking" instead of "Watching", so I missed the issues.

I've fixed that. For further issues, please tag me directly (@init_27) and share whatever you're facing. (I'm maintaining the kernels.)

I'll update all of the kernels that are having issues this weekend. Apologies for not keeping an eye out.
