Hey @Mantis, I was facing the same issue, and it will persist until the next release. However, it seems to be fixed in fastai-1.0.61.dev0, which you can install to work around the problem for now. To do so,
(optionally) run:
!curl -s https://course.fast.ai/setup/colab | bash
(This script does some environment setup for you that you might otherwise need to do yourself.)
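The post names the dev build but not the install command itself; a typical way to pin that pre-release on Colab is sketched below. The exact pin is taken from the post above, but whether PyPI hosts the `.dev0` build is an assumption; it may instead need to come from the fastai GitHub repo.

```shell
# Assumed command: pin the pre-release named in the post. If PyPI does
# not host the .dev0 build, installing from source is the fallback:
#   pip install git+https://github.com/fastai/fastai.git
pip install "fastai==1.0.61.dev0"
```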
Over the last few days I've been experiencing a lot of issues with Colab. It's not related to fastai or any particular library, but in general cells take a long time to run and often hang, and I need to restart or reconnect the kernel many times to get anything done. Is it only my experience, or have other people noticed this as well?
I wanted to ask one thing: I am new to Colab and was getting an error when importing text_classification.
import torch
import torchtext
from torchtext.datasets import text_classification
import os

NGRAMS = 2
if not os.path.isdir('./.data'):
    os.mkdir('./.data')
print(os.path)
train_dataset, test_dataset = text_classification.DATASETS['AG_NEWS'](
    root='./.data', ngrams=NGRAMS, vocab=None)
BATCH_SIZE = 16
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
I was getting an error like ImportError: cannot import name 'text_classification'.
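For what it's worth, this ImportError usually means the installed torchtext predates the `text_classification` module, which was added around torchtext 0.4.0 (an assumption worth verifying against the torchtext release notes). A small hedged helper for checking a version string against that cutoff, without importing torchtext itself:

```python
# Hedged sketch: the 0.4.0 cutoff is an assumption; check the torchtext
# release notes to confirm when text_classification was introduced.
def version_tuple(version):
    # "0.5.0" -> (0, 5, 0); trailing tags like ".dev0" are ignored
    parts = []
    for p in version.split("."):
        if not p.isdigit():
            break
        parts.append(int(p))
    return tuple(parts)

def supports_text_classification(torchtext_version):
    return version_tuple(torchtext_version) >= (0, 4, 0)

print(supports_text_classification("0.3.1"))  # False: upgrade needed
print(supports_text_classification("0.5.0"))  # True
```

In a notebook you would print `torchtext.__version__` and, if it falls below the cutoff, upgrade with `pip install -U torchtext` and restart the runtime.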
I think the Colab setup documentation (located at https://course.fast.ai/start_colab.html) might be missing a step. I don’t have a fastai-v3 folder in my Google drive.
After some deliberation we've determined that this method won't break the TOS, so the directions are back up.
FYI folks, @deeplearner and @imrandude both came up with a methodology of turning your colab instances into full Jupyter Environments. This is experimental and Colab may patch this at any moment. Please use at your own discretion. The directions are in the top post. If there are any install issues etc ping me and I’ll work on it.
Hi @muellerzr, thanks for posting the method to run Jupyter instances from Colab. I am encountering an issue: the tunneling with ngrok goes smoothly, but when I open a notebook and try to run something, no output appears (and I assume nothing is actually running either).
The kernel is running correctly and everything else seems fine.
The only unusual thing is that I am opening Colab notebooks stored in Google Drive, since Drive is available at the root of the Jupyter file explorer.
I am using Colab for image classification; however, after mounting my Google Drive and untarring the data, I am unable to find the data in my Drive. What could I be doing wrong here?
Oh, I got it. There's nothing wrong: the data is being extracted to the desired location; you just need to make a soft link to that location to be able to browse it. Execute the following line and you'll be able to see the directory.
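The command itself didn't make it into the post; a generic version of the soft-link trick looks like this. The paths here are placeholders, not the poster's actual locations; substitute the real untar destination and whatever link name you want to browse under.

```shell
# Placeholder paths: replace /tmp/demo/extracted_data with the real
# untar destination (on Colab, often somewhere under /root/.fastai/data).
mkdir -p /tmp/demo/extracted_data            # stand-in for the untar target
ln -sfn /tmp/demo/extracted_data data_link   # soft link you can browse
ls -ld data_link
```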
Hi,
I am new to Colab and facing an issue. Shouldn't a folder called 'fastai-v3' be created in my Google Drive after authentication? There is no such folder present.
Also, as I understand it, this should save the progress of my notebook so that I can pick up from the cell where I left off after starting a new session, but I am not able to do so; the previous instance is wiped completely. Is this correct or not?
Hi yashbansal6, hope you are having a beautiful day.
When you run a notebook, nothing is saved permanently: anything you explicitly save to your Drive persists, but everything else exists only for the duration of the session. When that session ends, you are not able to start where you left off.
You must save the notebook as you go. Google Colab normally times out after 90 minutes of no mouse activity, and the session itself is killed every 12 hours. As you stated, the previous session is wiped out completely.
When working in a notebook, I normally rename it and save it intermittently to my Google Drive as I go. I find that if I leave the browser running and my account logged in, I can just reconnect and all the cells are still there; I only need to re-run them.
This process can be a bit tedious, but it's free. I believe they offer a paid version in the States; I'm not sure whether it works the same way.
If training takes a long time on Google Colab, check whether you are connected to a Hosted Runtime rather than a Local Runtime (on the top right).
Hello!
I was able to take the datasets I created for image segmentation and run training using the U-Net model. I was also able to load the model and get good predictions on the input images. My question is that I am still unfamiliar with the internals of the loaded model. I ran learn.load() and learn.summary() to see the architecture. How can I determine the number of neurons, weights, biases, and layers in the U-Net model that the fastai library offers?
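learn.summary() already prints per-layer output shapes and parameter counts; to total them yourself, the underlying nn.Module is reachable as learn.model, and plain PyTorch can count layers and parameters. A hedged sketch on a stand-in model (the actual U-Net numbers depend on the encoder fastai builds for you):

```python
import torch.nn as nn

def describe(model):
    # modules() walks every submodule, including container modules
    n_layers = sum(1 for _ in model.modules())
    n_params = sum(p.numel() for p in model.parameters())
    n_trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
    return n_layers, n_params, n_trainable

# Stand-in model; for the fastai unet, pass learn.model instead.
model = nn.Sequential(nn.Linear(10, 5), nn.ReLU(), nn.Linear(5, 2))
print(describe(model))  # (4, 67, 67)
```

Weights and biases are both counted in n_params; if you want them separately, weight tensors typically have more than one dimension (`p.ndim > 1`) while bias tensors have one.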