Platform: Colab ✅

Hey @Mantis, I was also facing the same issue, and it will persist until the next release. However, it seems the issue is fixed in fastai-1.0.61.dev0, and you can install that to work around the problem for now. To do so:

  1. (optionally) !curl -s https://course.fast.ai/setup/colab | bash
    (This script does some environment setup that you would otherwise need to do yourself.)
  2. !pip uninstall fastai fastprogress
  3. !pip install git+https://github.com/fastai/fastai.git
  4. Restart the runtime once (not reset).

and you’re good to go. You can check current fastai version by executing

from fastai.vision import *
print(__version__)
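If you'd rather not star-import just to check the version, a small sketch using the standard library's importlib.metadata (Python 3.8+) works for any installed package; the helper name here is my own, not a fastai API:

```python
from importlib.metadata import version, PackageNotFoundError

def installed_version(pkg):
    """Return the installed version string for `pkg`, or None if not installed."""
    try:
        return version(pkg)
    except PackageNotFoundError:
        return None

print(installed_version("fastai"))
```

If this prints None, the package isn't visible to the current interpreter, which usually means the runtime wasn't restarted after installing.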

In the last days I’ve been experiencing a lot of issues with Colab - not related to fastai or any particular library, but in general cells take a lot of time to run, often get hung up, I need to restart or reconnect the kernels many times to get something done. Is it only my experience, or have other people noted this as well?

I’ve noticed the hanging a few times, but maybe only once. Do you have Pro or the free version @darek.kleczek?

I’m using free version - is it better with Pro?

I got Pro and I can’t complain. I’ve had maybe one instance where I had a local hang up/crash of some form, but otherwise smooth as butter.

I wanted to ask one thing: I am new to Colab and was getting an error importing text_classification:
import torch
import torchtext
from torchtext.datasets import text_classification
import os

NGRAMS = 2

if not os.path.isdir('./.data'):
    os.mkdir('./.data')
print(os.path)
train_dataset, test_dataset = text_classification.DATASETS['AG_NEWS'](
    root='./.data', ngrams=NGRAMS, vocab=None)

BATCH_SIZE = 16

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

I was getting an error like ImportError: cannot import name 'text_classification'

Can anybody help on this ?
Thanks

I think the Colab setup documentation (located at https://course.fast.ai/start_colab.html) might be missing a step. I don’t have a fastai-v3 folder in my Google drive.

How should I proceed?

Hi, you can use my notebook to run lesson3.
https://colab.research.google.com/drive/1VgLDzrIV3aR4GA7fk-tN2KOuF_ll2Ei2

I think your problem is that you are missing the first code cell in my notebook.

After some deliberation we’ve determined that this method won’t break the TOS, so the directions are back up

FYI folks, @deeplearner and @imrandude both came up with a methodology of turning your colab instances into full Jupyter Environments. This is experimental and Colab may patch this at any moment. Please use at your own discretion. The directions are in the top post. If there are any install issues etc ping me and I’ll work on it.

This means that any widgets will work here!


Hi @muellerzr, thanks for posting the method to run Jupyter instances out of Colab. I am encountering some issues: the tunneling with ngrok and everything goes smoothly, but when I open the notebook and try to run something, no output appears (and I assume nothing is happening either).
The kernel is correctly running and everything is fine.

The only thing is that I am opening colab notebooks that I stored in gdrive, since it’s available in the root of jupyter file explorer.

Hi,

I am using Colab for image classification. However, after mounting my Google Drive and untarring the data, I am unable to find the data in my drive. What could I be doing wrong here?

Regards
Sekar

Could you be more specific? :slightly_smiling_face:

Are you trying to extract a tar file from your gdrive to colab?

Yes, when I run untar_data on URLs.PETS into the path variable.

And when I print path I get the following output: <bound method of PosixPath('/root/.fastai/data/oxford-iiit-pet')>

Not sure how to proceed further
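As an aside, output like `<bound method ...>` usually means a method was referenced without actually calling it (missing parentheses). A minimal pathlib sketch of the same symptom, using an illustrative path:

```python
from pathlib import Path

p = Path("/tmp")
print(p.exists)    # prints a bound-method repr, not a result
print(p.exists())  # calling it with () returns an actual boolean
```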

Oh I got it. There’s nothing wrong, the data is being extracted to desired location, you just need to make a soft link to that location to be able to browse it. Execute the following line and you’ll be able to see the directory :slightly_smiling_face:

!ln -s /root/.fastai/data /content
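The same soft link can be made from Python with os.symlink if you prefer not to shell out; the temp directories below are illustrative stand-ins for /root/.fastai/data and /content:

```python
import os
import tempfile

# Illustrative stand-ins for /root/.fastai/data and /content
data_dir = tempfile.mkdtemp()
content_dir = tempfile.mkdtemp()

link = os.path.join(content_dir, "data")
os.symlink(data_dir, link)  # same effect as `ln -s <data_dir> <link>`
print(os.path.islink(link))
```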

Hi,
I am new to colab and facing an issue. Shouldn’t there be a folder called ‘fastai-v3’ created in my google drive after the authentication? But there is no such folder present.

Also, as I understand it, this should save the progress of my notebook, and I should be able to pick up from the cell where I left off after starting a new session. But I am not able to do so; the previous instance is wiped completely. Is this correct or not?

Hi yashbansal6 hope you are having a beautiful day.

When you run the notebook, nothing is saved permanently. Anything you explicitly save to your drive is kept; everything else exists only for the duration of the session. When that session ends, you are not able to start where you left off.

You must save the notebook as you go. Google Colab normally times out after 90 minutes of no activity, and the session itself is killed every 12 hours. As you stated, the previous session is wiped out completely.

However, when using a notebook I normally rename it and save it intermittently to my Google Drive as I am working on it. I normally find that if I leave the browser running and my account logged in, I can just reconnect and all the cells are there; I just need to re-run them.

This process can be a bit tedious, but it's free. I believe they offer a paid version in the States; not sure if that works the same.

Cheers mrfabulous1 :smiley: :smiley:

If training is taking a long time on Google Colab, check that you are connected to a Hosted Runtime rather than a Local Runtime (on the top right).

To be able to use the "/" operator, it needs to be an instance of Path. So simply wrap your path like so:

path = Path("/content/drive/My Drive/Umes")

This will make all the subsequent variables of type Path if you use the "/" operator. So update path_img and path_lbl to:

path_img = path/'Images'
path_lbl = path/'labels'

The main issue was path_lbl not being an instance of Path.
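Put together, a self-contained sketch of the fix (the base directory is the one from the post; any path works the same way):

```python
from pathlib import Path

# Wrapping the base directory in Path enables the "/" operator
path = Path("/content/drive/My Drive/Umes")
path_img = path / "Images"
path_lbl = path / "labels"

# Each joined result is itself a Path (PosixPath or WindowsPath)
print(type(path_lbl).__name__)
print(path_lbl.name)
```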


Hello!
I was able to take the datasets I created for image segmentation and run training using the Unet model. I was also able to load the model and get good predictions on the input images. My question is that I am still unfamiliar with the in-depth concepts of the loaded model. I ran learn.load and learn.summary() to see the architecture. How can I determine the number of neurons, weights, biases, and layers used in the Unet model that the fastai library offers?


I’m not sure if you’re talking about fastai v1 or v2 but in general:

  • Try to understand the DynamicUnet module in fastai
  • Learn how PyTorch hooks are used to build the decoder part of the Unet
  • A simple exercise would be to modify the Unet into an AutoEncoder (no skip connections)
  • You can customize the backbone of the network, so try different flavors of ResNet
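For the original question about counting weights and biases: since any fastai model is ultimately an nn.Module, you can sum over model.parameters(). A sketch with a tiny stand-in model (not the actual fastai Unet, which you would access via learn.model):

```python
import torch.nn as nn

# Tiny stand-in model; with a fastai Learner you would use learn.model instead
model = nn.Sequential(
    nn.Conv2d(3, 16, 3),   # 3*16*3*3 weights + 16 biases = 448 parameters
    nn.ReLU(),
    nn.Conv2d(16, 1, 3),   # 16*1*3*3 weights + 1 bias = 145 parameters
)

n_params = sum(p.numel() for p in model.parameters())
n_trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(n_params, n_trainable)
```

The number of layers can be inspected with `for name, module in model.named_modules(): print(name, module)`, which is roughly what learn.summary() formats for you.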

The docs would be an ideal place to learn more about them:
fastai v1: https://docs.fast.ai/vision.models.unet.html
fastai v2: https://dev.fast.ai/vision.models.unet