Platform: Kaggle Kernels

Hi. While doing the second lesson I wanted to try running the bear recognition model on my machine. As they say in the video, it’s better to train the neural network on a GPU, but once the model is ready I can use a CPU for image recognition. I saved the model to a file in a Kaggle kernel, but I can’t find a way of downloading it. Is there a good way to get files out of a kernel, or would it be easier to do this in a different environment?

I am sorry I did not get back to you. I did not see this response until now. Did you get this to work?

If you export the file to the main directory, it should show up in the Output tab of the kernel, where you can download it.

The only directory I see in the tab (Workspace section) is the input directory, located at /kaggle/input. That directory is read-only, so I can’t move a file there. I tried placing the file in /, /kaggle, etc., but I can’t make it appear in the Output tab.

Try placing the file in ../

Hi, you might want to read this thread: ImageItemList not defined

Hey @yappo, first of all, welcome to the community. I usually save my models with learn.save("/kaggle/working/NameOfModel")
If you want to download them without committing you can refer to this

1 Like

Hi Dipam

I used these codes -

learn.save("/kaggle/working/devanagari")
OR
learn.save("/kaggle/working/devanagari.pkl")

And when I use the code below to see and download the files, it throws a 404 error:

from IPython.display import FileLinks
FileLinks('/kaggle/working/')  # the argument is the folder to list

Can you please help?

Hey, sorry for the late response. Did you get your answer? Can you try FileLinks('.'), i.e. the current working directory? I think that works. Otherwise I’ll check it once I get time. The input argument for learn.save() matters when you want to download models after committing; they appear in the Output tab.
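If FileLinks keeps returning 404, a plain-Python fallback (just a sketch using the standard library, with '.' standing in for /kaggle/working) is to list the directory yourself and confirm the saved files are actually there:

```python
from pathlib import Path

# List the files in the kernel's working directory to confirm the
# saved model files exist before trying to download them.
for f in sorted(Path('.').glob('*')):
    if f.is_file():
        print(f.name, f.stat().st_size, 'bytes')
```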

Hi again, thanks everyone for the quick answers. I’m still trying to figure out how to load a pretrained model. What I did: I trained a model on Kaggle, saved it, and downloaded the .pth file. Then I ran fastai on my commodity machine and wanted to load the model. But I found out that to load a saved model I have to create a Learner object first, and to do that I have to specify a DataBunch object. That’s not a problem, but why do I need the data I trained the model on to build a learner from a saved model? Does the model use this data in some way to evaluate samples? Does the data for loading have to be exactly the same as the data used for training?

Something is off with the fast-ai-v3-lesson-3-planet lesson.
There is a snippet:

np.random.seed(42)
src = (ImageItemList.from_csv(path, 'train_v2.csv', folder='train-jpg', suffix='.jpg')
       .random_split_by_pct(0.2)
       .label_from_df(sep=' '))

I found out that ImageItemList was renamed to ImageList, so it became

np.random.seed(42)
src = (ImageList.from_csv(path=path, csv_name='/kaggle/input/train_v2.csv', folder='train-jpg', suffix='.jpg')
       .random_split_by_pct(0.2))

But I encountered a problem on the next line. When I try to run

data = (src.transform(tfms, size=128)
        .databunch(num_workers=0).normalize(imagenet_stats))

I get an error

/opt/conda/lib/python3.6/site-packages/fastai/data_block.py in transform(self, tfms, **kwargs)
    491         if not tfms: tfms=(None,None)
    492         assert is_listy(tfms) and len(tfms) == 2, "Please pass a list of two lists of transforms (train and valid)."
--> 493         self.train.transform(tfms[0], **kwargs)
    494         self.valid.transform(tfms[1], **kwargs)
    495         if self.test: self.test.transform(tfms[1], **kwargs)

AttributeError: 'ImageList' object has no attribute 'transform'

So the internal attributes of my src object are not what is expected. I don’t have a clue how it should be done then. Has anybody figured it out?

It’s not the first time I’ve had problems with outdated notebooks. I wonder what the situation is on other platforms. AFAIK Gradient is an official platform for fast.ai. Has anyone tried it, and what can you say about it? Is it worth switching to Gradient?

2 Likes

Hey, I’m not sure why you are getting this error, but it might be because you are not following all the steps of creating a DataBunch in the order they have to be in. More specifically, after your random split method, you should be telling it how to label your data.
The order goes as follows:

  1. Where to find the data?
  2. How to split it?
  3. How to label it?
  4. Optionally add a test folder
  5. Apply data augmentation and create the DataBunch.
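Following the steps above, a sketch of the corrected pipeline for the planet notebook might look like this. This is an assumption-laden sketch, not a tested fix: it assumes the renamed fastai v1 APIs (ImageList, split_by_rand_pct, and label_from_df with label_delim, which was sep=' ' in older releases), and it needs the planet dataset plus the tfms defined earlier in the notebook, so it won’t run standalone:

```python
np.random.seed(42)
src = (ImageList.from_csv(path, 'train_v2.csv',   # 1. where to find the data
                          folder='train-jpg', suffix='.jpg')
       .split_by_rand_pct(0.2)                    # 2. how to split it
       .label_from_df(label_delim=' '))           # 3. how to label it (multi-label)

data = (src.transform(tfms, size=128)             # 5. augmentation...
        .databunch(num_workers=0)                 #    ...create the DataBunch...
        .normalize(imagenet_stats))               #    ...and normalize
```

The key difference from the snippet that failed is the label_from_df call: without it, src is still an ImageList rather than a labeled LabelLists, which is why .transform() raises the AttributeError.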

Let me know if you are able to solve the problem with this or in any other way.
Cheers

1 Like

I have a problem trying to save a file. When I run:

data.save("/kaggle/working/jigsaw.pkl")

The Kernel dies every time.

Try doing it without the extension. Just data.save("/kaggle/working/jigsaw").

Then I get this error:

FileNotFoundError: [Errno 2] No such file or directory: '../input/../kaggle/working/jigsaw'
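For context on that odd-looking path: fastai’s save() resolves a relative filename against the object’s path attribute, which on Kaggle often points at the read-only ../input directory (the exact internals here are an assumption). A minimal standard-library sketch of how the path in the error comes about:

```python
from pathlib import Path

# On Kaggle, a databunch built from the dataset directory typically has
# path == Path('../input'), which is mounted read-only.
base = Path('../input')

# A relative save name gets joined onto that base...
print(base / '../kaggle/working/jigsaw')   # ../input/../kaggle/working/jigsaw

# ...so pass an absolute, writable location instead:
print(Path('/kaggle/working/jigsaw'))      # /kaggle/working/jigsaw
```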

Another question: is it possible to do transfer learning with Kaggle and fastai? When I try to create the learner:

learn = text_classifier_learner(data_cl, arch=AWD_LSTM)

I get an error:

OSError: [Errno 30] Read-only file system: '../input/models'

I’m connected to the internet, but I’m assuming it can’t download the pretrained model since it can’t save to the input folder.

You need to pass another parameter to your learner. After arch=AWD_LSTM, just add model_dir="/tmp/model/" and things will work fine.
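A quick standard-library check of why that works (the read-only-mount behaviour is as reported in this thread: /tmp is writable inside a kernel while ../input is not; the commented fastai line repeats the suggestion above):

```python
import os
import tempfile

# Create a model directory under /tmp (via tempfile for portability)
# and confirm it is writable, so fastai can download weights into it.
model_dir = os.path.join(tempfile.gettempdir(), 'model')
os.makedirs(model_dir, exist_ok=True)
print(os.access(model_dir, os.W_OK))  # True

# Then, as in the reply above (fastai):
# learn = text_classifier_learner(data_cl, arch=AWD_LSTM, model_dir=model_dir)
```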

Can you share the full code for this? You can also check my Kaggle profile; I’ve been running all the code on Kaggle without any errors. Some kernels won’t be up to date, though.

Sorry folks, my thread was set to “Tracking” instead of “Watching”, so I missed out on the issues.

I’ve fixed that. For further issues, please tag me directly (@init_27) and share whatever you’re facing. (I’m maintaining the kernels.)

I’ll update all of the kernels that are having issues this weekend, apologies for not keeping an eye out.

1 Like

Hi @init_27, I just uploaded the IMDB-WIKI face dataset (from https://data.vision.ee.ethz.ch/cvl/rrothe/imdb-wiki/) to Kaggle. I can see the folders and files in the “Your Datasets” section on Kaggle.

However, when I use this data in one of my kernels, I get errors.

Here are a few screenshots:

  1. When I view this data in my “Your Datasets” folder:
     [screenshot]

  2. In the Kaggle kernel:
     [screenshot]

  3. It is visible in the kernel as well:
     [screenshot]

  4. But when I extract it with this code:
     [screenshot]

I get the error below:
[screenshot]

I’ve tried all the tricks, but nothing seems to work. Could you please help me with this?

Best Regards
Abhik