Colaboratory and Fastai



The notebook you provided on this website will not run for me in Colaboratory because it is read-only.

While in Colaboratory, I tried the method below to create a new notebook by copying the notebook you provided:

I select all cells in command mode (Ctrl+Shift+A).

I use Ctrl+C to copy all cells.

But when I try to open a new notebook (New Python 3 notebook), it fails with this error message:

Notebook loading error
There was an error loading this notebook. Ensure that the file is accessible and try again.

[object Object]
Error: [object Object]
at d (
at (
at b (

I cannot open a new notebook because it fails with the error above.

What else can I try?





Each time I tried to open a new notebook in the Chrome browser it failed.

Then I tried to open a new notebook in the Firefox browser and it worked. I was then able to open a copy of the notebook you provided (I modified the permissions of the copy), run lesson1 and save it.




Now for the Chrome browser: I clicked the upper-right menu icon, selected New incognito window, and was able to open a new notebook in Chrome; I then selected Open Drive notebook.


I can run Lesson #1 on Crestle with no problem, but when I run the same notebook on Google Colab, after installing the relevant libraries and downloading the relevant data, I get this error message. As a temporary workaround I made the following revision, and now the script runs to the end:

log_preds, y = learn.TTA()
probs = np.mean(np.exp(log_preds), 0)
# begin extra code
probs = log_preds
def accuracy_np(preds, targs):
    preds = np.argmax(preds, 1)
    return (preds == targs).mean()
# end extra code
accuracy_np(probs, y)
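For anyone comparing workarounds, the accuracy_np helper above can be sanity-checked on toy data with plain NumPy (a minimal sketch, independent of fastai; the sample arrays are made up):

```python
import numpy as np

def accuracy_np(preds, targs):
    # Pick the highest-scoring class per row, then compare to the labels.
    preds = np.argmax(preds, 1)
    return (preds == targs).mean()

# Three samples, two classes: rows 0 and 1 are classified correctly,
# row 2 is not, so the accuracy is 2/3.
preds = np.array([[0.9, 0.1], [0.2, 0.8], [0.6, 0.4]])
targs = np.array([0, 1, 1])
print(accuracy_np(preds, targs))  # 0.666...
```

Note that argmax works the same on log-probabilities as on probabilities, which is why skipping the exp/mean step still produces a number here, even if it ignores the TTA averaging.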

My question is: does anybody here happen to have a version of Lesson #1 that runs in full on Google Colab at the time of this posting? Did you have to make a similar modification to the code? If so, I would appreciate it if you could share your entire .ipynb file so that I can compare my solution (see the GitHub link) to yours.


Lesson 1: name 'accuracy_np' is not defined
(ecdrid) #65

I had done so, but it isn’t the latest version per se (it’s about five months old); still, it works seamlessly, just as Jeremy has shown.

And it’s not at all slow…

Actually it’s faster than AWS (I might just be lucky)…

Except for memory issues…


Thank you for sharing your .ipynb file @ecdrid. However, I get a different error with your notebook than with mine. Maybe there are compatibility issues between fastai and Google Colab? The unmodified Lesson #1 notebook runs fine on Crestle, thankfully.

(ecdrid) #67

Can you share what your errors were? (I will search for fixes, try them out, and re-run so that it can help others.)


@ecdrid, in your file there was an error after the very first, and later on there are PIL library errors and CUDA memory errors too. Are you able to run the script without any errors? Does anyone else have a .ipynb file that can run Lesson #1 on Google Colab without the temporary workaround I mentioned in my initial post?

(ecdrid) #69

This has a fix below, where I set a CUDA variable after consulting Stack Overflow.

After that it goes away…
That’s why I said you need to re-run some cells twice or thrice to see the change…

Yep, the CUDA memory errors are always there…

I don’t know why, but they shouldn’t be.
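For reference, the kind of CUDA variable mentioned is usually set like this (a hedged sketch; the exact variable the poster used isn’t shown in the thread):

```python
import os

# CUDA_VISIBLE_DEVICES is a standard CUDA environment variable;
# "0" exposes only the first GPU. Set it *before* importing
# torch/fastai so those libraries pick it up.
os.environ["CUDA_VISIBLE_DEVICES"] = "0"
print(os.environ["CUDA_VISIBLE_DEVICES"])  # prints 0
```

Re-running the cell after setting this (as described above) gives the libraries a chance to re-initialize against the chosen device.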


I have faced this error while running the first model in the quick start section:

Failed to display Jupyter Widget of type HBox.

If you’re reading this message in the Jupyter Notebook or JupyterLab Notebook, it may mean that the widgets JavaScript is still loading. If this message persists, it likely means that the widgets JavaScript library is either not installed or not enabled. See the Jupyter Widgets Documentation for setup instructions.

If you’re reading this message in another frontend (for example, a static rendering on GitHub or NBViewer), it may mean that your frontend doesn’t currently support widgets.

What is the problem?
Thanks a lot.
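One thing worth checking (my suggestion, not confirmed in the thread) is whether the ipywidgets package is importable at all, since the HBox progress bars depend on it:

```python
def widgets_available():
    # The Jupyter widgets (HBox etc.) are provided by ipywidgets;
    # if this import fails, the widgets JavaScript cannot load either.
    try:
        import ipywidgets  # noqa: F401
        return True
    except ImportError:
        return False

if not widgets_available():
    print("ipywidgets is missing: install it and restart the runtime")
```

If the package is present but the message persists, the widgets extension may simply not be enabled in that frontend, as the error text itself suggests.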

(Leonardo José Silvestre) #71

I have the following problem:

OSError: [Errno 5] Input/output error: “data/dogscats/tmp/x_act_resnet34_0_224.bc/data/__5.blp”

Every time I try to run it, the number X in __X.blp is different. If I open the directory, there are other __X.blp files.

Any suggestions?

Thanks a lot.


I want to upload my data into my current notebook. I tried putting it on file-hosting sites and then downloading it via wget && unzip -d data.
This doesn’t work. How can I get the data into my current environment?
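As one possible approach (a sketch with a placeholder URL, not a confirmed fix), the same download-and-unzip step can be done from Python inside the notebook:

```python
import os
import urllib.request
import zipfile

def fetch_and_unzip(url, dest="data"):
    """Download an archive from `url` and unpack it under `dest/`."""
    os.makedirs(dest, exist_ok=True)
    archive, _ = urllib.request.urlretrieve(url)  # saves to a temp file
    with zipfile.ZipFile(archive) as zf:
        zf.extractall(dest)
```

Calling fetch_and_unzip("https://example.com/dataset.zip") (placeholder URL) should leave the files under data/, mirroring what wget && unzip -d data would do.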

(Medhat Omr) #73

It’s quite the opposite in my case.

(Ben Hutchison) #74

Can anyone share a way to use kaggle-cli [] to load data directly from within a Colaboratory notebook?

I’m doing the Dog Breeds assignment.

Ideally I’d like to transfer data directly from Kaggle to Colab, without going through my machine, as the uplink on my home internet is very slow and the files are taking a long time to get to Colab.

(Ben Hutchison) #75

Actually I found a solution to Kaggle -> Colab direct transfer.

I used the CurlWget extension Jeremy describes here [] and then invoked !wget ... from Colab to download into my Google Drive. Extremely fast transfer, all within Google Cloud.

(Atul Krishna Singh) #76

For getting started with "Colaboratory and fastai" you can follow this blog, as I have successfully done so.

GPU not needed most of the time, how to turn it off?
(Ronaldo da Silva Alves Batista) #77

Hello Marcus.

Did you remember to activate the GPU backend, i.e., change the runtime from CPU to GPU?

Runtime -> Change Runtime Type

The Hardware accelerator must be GPU

To check it, run this code:

import tensorflow as tf
tf.test.gpu_device_name()

It should return something like '/device:GPU:0' when the GPU backend is active, and an empty string otherwise.

I know it sounds silly but it’s easy to forget.



(Ronaldo da Silva Alves Batista) #78

Hello Ben,

There is a Kaggle API key which makes access straightforward, as if you were on Kaggle Kernels:

Install the Kaggle API: !pip install kaggle

API Credentials

To use the Kaggle API, go to the ‘Account’ tab of your user profile and select ‘Create API Token’. This will trigger the download of kaggle.json, a file containing your API credentials.

Place this file anywhere on your Google Drive.

With the next snippet you download your credentials to Colab, and then you can start using the Kaggle API:

from googleapiclient.discovery import build
import io, os
from googleapiclient.http import MediaIoBaseDownload
from google.colab import auth

auth.authenticate_user()  # authorize this notebook to read your Drive

drive_service = build('drive', 'v3')
results = drive_service.files().list(
        q="name = 'kaggle.json'", fields="files(id)").execute()
kaggle_api_key = results.get('files', [])

filename = "/content/.kaggle/kaggle.json"
os.makedirs(os.path.dirname(filename), exist_ok=True)

request = drive_service.files().get_media(fileId=kaggle_api_key[0]['id'])
fh = io.FileIO(filename, 'wb')
downloader = MediaIoBaseDownload(fh, request)
done = False
while not done:
    status, done = downloader.next_chunk()
    print("Download %d%%." % int(status.progress() * 100))
os.chmod(filename, 0o600)  # mode must be octal; plain 600 sets the wrong bits

Then you can use commands such as !kaggle competitions list

The Kaggle API docs have the full list of commands to submit, download data, etc.

I hope I’ve helped


(Marcus) #79

Thanks for the tip, but I’m using PyTorch and not Tensorflow.

But I am using the cuda options of PyTorch and the runtime is set to GPU.

for batch, (x, y) in enumerate(train_loader):
    x = torch.autograd.Variable(x).cuda()
    y = torch.autograd.Variable(y).cuda() 

I’m assuming it’s something specific to my dataset. It only has around 50 columns.


Hi everybody, I’m new to this forum.

I have already cloned the repo into Google Colab, but I don’t know how to open a notebook from the repo. I hope you guys can help me.