Colaboratory and Fastai

@manikanta_s

The notebook you provided on this website will not run for me in Colaboratory because it is read-only.

In Colaboratory I tried the method below to create a new notebook by copying the notebook you provided:

Select all cells in command mode (Ctrl+Shift+A)

Ctrl+C to copy all cells.

Open new notebook.

Ctrl+V to paste the copied cells.

But when I try to Open new notebook (New Python 3 notebook), it fails with this error message:

Notebook loading error
There was an error loading this notebook. Ensure that the file is accessible and try again.


[object Object]
Error: [object Object]
at d (https://colab.research.google.com/v2/external/external_polymer_binary.js?vrz=colab_20180222_085323-RC01_186629092:1135:347)
at Object.next (https://colab.research.google.com/v2/external/external_polymer_binary.js?vrz=colab_20180222_085323-RC01_186629092:1135:493)
at b (https://colab.research.google.com/v2/external/external_polymer_binary.js?vrz=colab_20180222_085323-RC01_186629092:522:42)
at

I saved a copy of the notebook you provided to my Google Drive.
When I try to Open Drive notebook and select the notebook copy, it crashes with the same error message as above.

When I change the permissions of the notebook (you provided) on my Linux PC using chmod -R 777 and then upload the file, it crashes with the same error message as above.

What else can I try?

Thanks

Hi,

I hope you need not do that. A better way is to open the notebook with Colaboratory and create a new copy of it within Colaboratory itself. It should work, and you need not copy all the cells manually.

Regards,
Manikanta

@manikanta_s

The notebook you provided on this website will not run for me in Colaboratory because it is read-only.

While in Colaboratory, I tried the method below to create a new notebook by copying the notebook you provided:

I select all cells in command mode (Ctrl+Shift+A).

I use Ctrl+C to copy all cells.

But when I try to Open new notebook (New Python 3 notebook), it fails with the same Notebook loading error and stack trace shown above, so I cannot open a new notebook at all.

What else can I try?

Thanks

@manikanta_s

Hi,

Each time I tried to open a new notebook in the Chrome browser it failed.

Then I tried to open a new notebook in the Firefox browser and it worked. I was then able to open a copy of the notebook you provided (I modified the permissions of the copy), run Lesson 1, and save it.

Thanks

@manikanta_s

Now, to use the Chrome browser: I clicked the upper-right menu icon, selected New incognito window, and was able to open a new notebook in Chrome. I then selected Open Drive notebook.

I can run Lesson #1 on Crestle with no problem, but when I run the same notebook on Google Colab, after installing the relevant libraries and downloading the relevant data, I get this error message. As a temporary workaround I made the following revision, and now the script runs to the end:

log_preds, y = learn.TTA()
probs = np.mean(np.exp(log_preds), 0)
# begin extra code
probs = log_preds
def accuracy_np(preds, targs):
    preds = np.argmax(preds, 1)
    return (preds == targs).mean()
# end extra code
accuracy_np(probs, y)

My question is whether anybody here has a version of Lesson #1 that is able to run in full on Google Colab at the time of this posting. Did you have to make a similar modification to the code? If so, I would appreciate it if you could share your entire .ipynb file so that I can compare my solution (see github link) to yours.

Thanks,
Paul


I had done so, but it isn’t with the latest fast.ai version per se (it’s equivalent to a five-month-old fast.ai, but it works seamlessly, just as Jeremy has shown).

And it’s not at all slow…

Actually it’s faster than AWS (might be lucky)…

Except for memory issues…

https://colab.research.google.com/drive/1VsNWkErumsfPjqxquy9v9Z7U-EdrrU-q


Thank you for sharing your .ipynb file @ecdrid. I get a different error with your notebook as compared to my notebook, however. Maybe there are compatibility issues between fastai and Google Colab? The unmodified Lesson #1 notebook runs fine on Crestle, thankfully.

Can you share what your errors were? (I will search for a fix and re-run, so that it can help others.)

@ecdrid, in your file there was an error after the very first learn.fit, and later on there were PIL library errors and CUDA memory errors too. Are you able to run the script without any errors? Does anyone else have a .ipynb file that can run Lesson #1 on Google Colab without using the temporary workaround that I mentioned in my initial post?

This had a fix below, where I set a CUDA variable after referring to Stack Overflow.

After that it goes away…
That’s why I said you need to re-run some cells twice or thrice to see the change…

Yep, CUDA memory errors are always there…

Don’t know why, but they shouldn’t be there.

Hi,
I have faced this error while running the first model in the quick start section:

Failed to display Jupyter Widget of type HBox.

If you’re reading this message in the Jupyter Notebook or JupyterLab Notebook, it may mean that the widgets JavaScript is still loading. If this message persists, it likely means that the widgets JavaScript library is either not installed or not enabled. See the Jupyter Widgets Documentation for setup instructions.

If you’re reading this message in another frontend (for example, a static rendering on GitHub or NBViewer), it may mean that your frontend doesn’t currently support widgets.
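For reference, the setup the message points to amounts to roughly the following for a local Jupyter install (commands taken from the ipywidgets documentation; I am on Colab, so I am not sure they apply there):

!pip install ipywidgets
!jupyter nbextension enable --py widgetsnbextension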

What is the problem?
Thanks a lot.

I have the following problem:

OSError: [Errno 5] Input/output error: “data/dogscats/tmp/x_act_resnet34_0_224.bc/data/__5.blp”

Every time I try to run it, the number X in __X.blp is different. If I open the directory, there are other __X.blp files.

Any suggestions?

Thanks a lot.

I want to upload my data into my current notebook. I tried putting it on file-hosting sites and then tried downloading it via

wget https://dl.dropboxusercontent.com/content_link/GjAxYko7UsfBU2Iio68DZ15qAOsWzVb20sKDyUh1yFfGFvQ1tR3EJeXfb2zio7Ia/file?dl=1 && unzip blackswhites.zip -d data

This doesn’t work. How can I get the data into my current environment?
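One thing worth checking (just a sketch, not a confirmed fix): the Dropbox URL contains a ?, so it needs to be quoted to stop the shell from mangling it, and giving the download an explicit filename makes the unzip step predictable:

# quote the URL and name the output file before unzipping it into data/
!wget -O blackswhites.zip "https://dl.dropboxusercontent.com/content_link/GjAxYko7UsfBU2Iio68DZ15qAOsWzVb20sKDyUh1yFfGFvQ1tR3EJeXfb2zio7Ia/file?dl=1"
!unzip -q blackswhites.zip -d data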

It’s quite the opposite in my case.


Can anyone share a way to use kaggle-cli [https://github.com/floydwch/kaggle-cli] to load data directly from within a Colaboratory notebook?

I’m doing the Dog Breeds assignment.

Ideally I’d like to transfer data directly from Kaggle to Colab, without going through my machine, as the uplink on my home internet is very slow and the files are taking a long time to get to Colab.

Actually I found a solution to Kaggle -> Colab direct transfer.

I used the CurlWget extension Jeremy describes here [https://youtu.be/9C06ZPF8Uuc?t=744] and then invoked !wget ... from Colab to download into my Google Drive. Extremely fast transfer, all within Google Cloud.


For getting started with “Colaboratory and fastai”, you can follow this blog, as I have successfully done this:
[http://theailearner.com/2018/03/10/free-gpu-for-fast-ai-on-google-colab/]


Hello Marcus.

Did you remember to activate the GPU backend, i.e., change the runtime from CPU to GPU?

Runtime -> Change Runtime Type

The Hardware accelerator must be GPU

To check it, run the code:

import tensorflow as tf
tf.test.gpu_device_name()
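If you prefer checking with PyTorch (which fastai uses), something like this should also work:

import torch
torch.cuda.is_available()  # returns True once the GPU hardware accelerator is active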

I know it sounds silly but it’s easy to forget.

Cheers,

Ronaldo


Hello Ben,

There is the Kaggle API key, which makes access straightforward, as if you were on Kaggle Kernels:

Install the Kaggle API: !pip install kaggle

API Credentials

To use the Kaggle API, go to the ‘Account’ tab of your user profile (https://www.kaggle.com//account) and select ‘Create API Token’. This will trigger the download of kaggle.json, a file containing your API credentials.

Place this file anywhere on your Google Drive.

With the next snippet you download your credentials from Google Drive to Colab, and you can start using the Kaggle API:

from googleapiclient.discovery import build
import io, os
from googleapiclient.http import MediaIoBaseDownload
from google.colab import auth

auth.authenticate_user()

drive_service = build('drive', 'v3')
results = drive_service.files().list(
        q="name = 'kaggle.json'", fields="files(id)").execute()
kaggle_api_key = results.get('files', [])

filename = "/content/.kaggle/kaggle.json"
os.makedirs(os.path.dirname(filename), exist_ok=True)

request = drive_service.files().get_media(fileId=kaggle_api_key[0]['id'])
fh = io.FileIO(filename, 'wb')
downloader = MediaIoBaseDownload(fh, request)
done = False
while done is False:
    status, done = downloader.next_chunk()
    print("Download %d%%." % int(status.progress() * 100))
os.chmod(filename, 0o600)  # octal file mode: owner read/write only

Then you can use commands such as !kaggle competitions list

The Kaggle API docs have the full list of commands to submit, download data, etc.
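For example, for the Dog Breeds data mentioned earlier in the thread, a download would look something like this (the competition slug here is my guess from the Kaggle competition URL, so double-check it):

# download the competition files into a local data directory
!kaggle competitions download -c dog-breed-identification -p data/dogbreeds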

I hope I’ve helped

Ronaldo
