Platform: Colab ✅

Cheers muellerzr, I will learn checkpoints tomorrow, it's 4:30am here. Zzz
Many Thanks mrfabulous1 :smiley::smiley:


Problem: the Colab session times out after 12 hours, but the model requires 20 hours of training. What is the solution?

To solve this problem I have created a command using callbacks (https://docs.fast.ai/callbacks.html). To test that it works, I have used the multilabel example from https://docs.fast.ai/tutorial.data.html#A-multilabel-problem.

learn = cnn_learner(data, models.resnet18, callback_fns=[CSVLogger])
learn.fit_one_cycle(30, 1e-2, callbacks=[ShowGraph(learn), SaveModelCallback(learn, monitor='train_loss', mode='min', name='mini_train_30_best_model')])

The output of the command above is shown below; the command saves the epoch results to a CSV file, and a model is also saved to a file after every epoch.
[screenshots of the per-epoch training output, the loss graph, and the saved model files]

How can I change the command above to stop at epoch 28, when the training loss becomes less than the validation loss?

I have tried using other values such as error_rate but I get the following error, and am not sure how to change the command to achieve the result I require.

/anaconda/envs/fastai_uvicorn_0_7_1/lib/python3.6/site-packages/fastai/callbacks/tracker.py:50: UserWarning: <class 'fastai.callbacks.tracker.SaveModelCallback'> conditioned on metric error_rate which is not available. Available metrics are: train_loss, valid_loss
  warn(f'{self.__class__} conditioned on metric {self.monitor} which is not available. Available metrics are: {", ".join(map(str, self.learn.recorder.names[1:-1]))}')

Thanks in advance mrfabulous1 :smiley::smiley:

Typically it is better to save the model with the best valid_loss. Then, with early stopping, it seems like you would be done much earlier.
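
A minimal sketch of what that could look like (fastai v1 API), assuming the same data object as in the command above; the checkpoint name, min_delta and patience values are placeholders to tune:

from fastai.vision import *
from fastai.callbacks import CSVLogger, SaveModelCallback, EarlyStoppingCallback

learn = cnn_learner(data, models.resnet18, callback_fns=[CSVLogger])
learn.fit_one_cycle(30, 1e-2, callbacks=[
    # keep the checkpoint with the lowest validation loss
    SaveModelCallback(learn, monitor='valid_loss', name='best_model'),
    # stop once valid_loss has not improved by at least 0.01 for 3 epochs
    EarlyStoppingCallback(learn, monitor='valid_loss', min_delta=0.01, patience=3),
])
learn.load('best_model')  # reload the best checkpoint after training stops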


Cheers ilovescience, I changed the command to monitor valid_loss and removed mode='min', and it now stops at epoch 13.

Thank you very much!

mrfabulous1 :smiley::smiley:


Hi,
Need some help with the lr_find function on Colab. I am trying to do lesson 2 of the fast.ai course - https://course.fast.ai/videos/?lesson=2
This involves collecting URLs of images from the internet and running a CNN to categorize them.

I picked 3 categories - forks, ladles and spoons (attached). I ran the image download, then stored and verified the images. This is all fine.

Ran fit_one_cycle on the CNN, which gave some numbers.

The next steps ask you to unfreeze the model and run the learning rate finder on it. Here is where I get stuck: instead of giving me numbers and a graph, it gives me #na#.

I've tried the call below with start and end learning rates.

I've also removed those parameters and tried just lr_find().
[screenshot of the lr_find call showing the #na# output]

Please help :slight_smile:


It’s working fine. Answer here:


You should do learn.recorder.plot() as well to see the graph
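
For reference, a minimal sketch of that step (fastai v1), assuming learn is the CNN learner already trained with fit_one_cycle; the learning-rate range at the end is only an illustrative placeholder you would read off the plot:

learn.unfreeze()
learn.lr_find()        # the #na# shown for valid_loss here is expected
learn.recorder.plot()  # loss vs. learning rate; pick a range before the loss blows up
learn.fit_one_cycle(2, max_lr=slice(3e-5, 3e-4))  # placeholder range read off the plot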


Thanks a bunch @ilovescience! I’ll try again.

So I finished the setup for Colab successfully, but the Practical Deep Learning for Coders, v3 course says that after setting up I should see a Jupyter notebook like this:

But I don’t. Am I missing something?

Hi maverick891, hope you're having lots of fun today!

Your notebook will look like this:

You then have to do File > Open notebook and you will get the following screen.

As far as I am aware, I only see the screen you have shown when I run fastai on my desktop.

This is the link I have used for all the lessons in fastai part 1 V3.

https://course.fast.ai/start_colab.html#step-4-saving-your-data-files

Cheers
mrfabulous1 :smiley::smiley:

Hi everyone

I am a Data Science intern from a non-Computer-Science background, so apologies for the noobish questions.

Does anyone have a detailed tutorial on how to follow the download.ipynb notebook using Google Colab??

I'm trying to follow the notebook, and after I've entered the given JavaScript code snippet in the first step, the console returns "null". I'm not sure if that's supposed to happen?

And even if I am on the right track, I'm not sure where the images are stored with regard to using/accessing them in Colab, since it's a different server.

Any help would be greatly appreciated.

Thanks

There is an easier way to download Google images: using the google-images-download Python module in your Jupyter notebook.
Here are the steps to download the different bear images.

First install the Python module. In a cell, enter the following command:
!pip install google_images_download
Then in the subsequent cells, start downloading the grizzly bears (the -k grizzly argument below):
!googleimagesdownload -k grizzly -l 100
A new folder called downloads/grizzly will be created with 100 images in it (that's the -l 100 argument in the command above).
Do the same thing for the other categories:
!googleimagesdownload -k 'black bear' -l 100
!googleimagesdownload -k 'teddy bear' -l 100

Run the ls command
!ls downloads/
and you should have the following 3 folders:
'black bear'  grizzly  'teddy bear'

If you are using the lesson2-download notebook, you can update your path like this (assuming you are using Colab):
classes = ['grizzly', 'black bear', 'teddy bear']
path = Path('/content/downloads')
path.ls()

The cell output should look like this:
[PosixPath('/content/downloads/grizzly'),
 PosixPath('/content/downloads/black bear'),
 PosixPath('/content/downloads/teddy bear')]

From there, you can just continue executing the rest of the cells.
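
For example, a hedged sketch of what the next lesson2-download cell could look like with this path (fastai v1), assuming path is the /content/downloads Path defined above:

from fastai.vision import *

np.random.seed(42)
data = ImageDataBunch.from_folder(path, train=".", valid_pct=0.2,
        ds_tfms=get_transforms(), size=224, num_workers=4).normalize(imagenet_stats)
data.classes  # should list the three bear folders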

If you want to keep the original folder names used in lesson2-download.ipynb, you can rename the folders listed above to match those of the notebook.
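
A hedged sketch of that rename, assuming the notebook's folder names are grizzly, black and teddys (check the folder variables in your copy of lesson2-download.ipynb before running this):

from pathlib import Path

path = Path('/content/downloads')
renames = {'black bear': 'black', 'teddy bear': 'teddys'}  # 'grizzly' already matches
for old, new in renames.items():
    if (path/old).exists():
        (path/old).rename(path/new)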

Cheers


Dude Thank You so much! The help is much appreciated man :pray:t4::pray:t4::pray:t4:

My pleasure :slightly_smiling_face:. I'm glad it was helpful.

Good luck!

Hey guys

Has anybody found an up-to-date (2019) tutorial for Lesson 3 on using the Kaggle API in Google Colab? I tried the one listed under the lecture resources but it's supposedly outdated.

Thanks

You can follow these steps:

First, get your kaggle.json API token (downloaded from your Kaggle account page) into Colab.
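
One way to do that is Colab's file upload helper; a minimal sketch (the file lands in /content/):

from google.colab import files
files.upload()  # choose kaggle.json in the dialog; it is saved as /content/kaggle.json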

Then run these commands in your Colab notebook:

! pip install kaggle --upgrade
! mkdir -p ~/.kaggle/
! cp /content/kaggle.json ~/.kaggle/
! chmod 600 /root/.kaggle/kaggle.json
path = Config.data_path()/'planet'
path.mkdir(parents=True, exist_ok=True)
! kaggle competitions download -c planet-understanding-the-amazon-from-space -f train-jpg.tar.7z -p {path}

Then run the rest of the notebook commands
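
If you want a head start on those cells, here is a hedged sketch of one way to unpack the archive on Colab, assuming the download landed as train-jpg.tar.7z inside {path} and that 7z is not installed yet:

! apt-get install -y p7zip-full           # provides the 7z binary
! 7z x {path}/train-jpg.tar.7z -o{path}   # extracts train-jpg.tar
! tar -xf {path}/train-jpg.tar -C {path}  # unpacks the train-jpg folder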

Hey Farid

As usual, thanks for the help man. The following block:

path = Config.data_path()/'planet'

gives me a:
NameError: name 'Config' is not defined

You have to run this first:

from fastai.vision import *

I’m assuming that you are following along with the lesson3-planet notebook

oohh my bad. thanks my Canadian brotha :pray:t4:

No problem!