Platform: GCP ✅

Any questions related to Google Cloud Platform (GCP) can be posted here.

Note that this is a forum wiki thread, so you all can edit this post to add/change/organize info to help make it better! To edit, click on the little pencil icon at the bottom of this post.

Thanks for sharing.
I tried to create an instance with the recommended specs, but the pricing is nowhere near $0.53/h; it’s more like $1.36/h.

Are you sure this is an n1-highmem-8 (8 vCPUs + 52GB) with 1 NVIDIA Tesla P100 GPU? What region/zone did you use to get this pricing?


Also, automatic CPU platform selection usually picks older CPU models, while selecting Skylake increases costs. Does it matter?

EDIT: Reading the tutorial further, I can see a shell command that creates the instance using an “n1-standard-8”, so I conclude the pricing paragraph has a typo. But this doesn’t lower the price much; it’s now around $1.29/h.

You need to use the --preemptible flag (shown in the tutorial) to get the cheap pricing.

We’ll add some details on the page about that.
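In the meantime, here is a sketch of the create command with that flag added (the other flags are copied from the command quoted later in this thread; collecting the arguments in an array first is just so you can inspect the full command before running it):

```shell
# Sketch: the tutorial's create command with --preemptible appended.
export IMAGE_FAMILY="pytorch-1-0-cu92-experimental"
export ZONE="us-west1-b"
export INSTANCE_NAME="my-fastai-instance"
export INSTANCE_TYPE="n1-standard-8"

# Build the argument list so the full command can be reviewed in one place.
ARGS=(compute instances create "$INSTANCE_NAME"
      --zone="$ZONE"
      --image-family="$IMAGE_FAMILY"
      --image-project=deeplearning-platform-release
      --maintenance-policy=TERMINATE
      --accelerator="type=nvidia-tesla-p100,count=1"
      --machine-type="$INSTANCE_TYPE"
      --boot-disk-size=120GB
      --metadata="install-nvidia-driver=True"
      --preemptible)

echo "gcloud ${ARGS[*]}"   # inspect first; run with: gcloud "${ARGS[@]}"
```

Keep in mind that preemptible instances are discounted precisely because Google can shut them down at any time, so save your work often.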


After following the instructions above and trying the first lesson, I see that
‘untar_data’ is not defined.

There is a forum thread that suggests reinstalling pytorch and fastai, but I am not sure if this is the recommended approach for a GCP box.

Any ideas?

That’s because your fastai library isn’t up to date. I’m not sure why, if you followed the steps, but look at the update guide, specifically the “update the library” part.
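For example, a minimal version of the update step could look like this (the conda path is an assumption, taken from the command another poster used later in this thread; adjust it for your image):

```shell
# Update fastai using the image's conda; the path is assumed from this thread.
CONDA=/opt/anaconda3/bin/conda
if [ -x "$CONDA" ]; then
  "$CONDA" update fastai
else
  echo "conda not found at $CONDA - adjust the path for your image"
fi
```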

Scary how little setup I needed to do. All my usual tools like tmux and htop are already set up, and Jupyter auto-started. This image pytorch-1-0-cu92-experimental is gold :slight_smile:


works fine, thank you!

That is all thanks to @b0noi, who created this terrific image! :slight_smile:


Also check this guide on preemptible instances:

Basically, you can run some commands in the Google Cloud Shell to save time.

It’s cheap and you can keep data and tools even after you terminate the instance.

And it uses a systemd service to start Jupyter, so no SSH is needed to access notebooks. Just create the server and use it.
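For reference, such a service typically looks something like this (the unit name, user, and paths below are my assumptions, not necessarily what that guide installs):

```ini
# /etc/systemd/system/jupyter.service (hypothetical example)
[Unit]
Description=Jupyter notebook server
After=network.target

[Service]
User=jupyter
WorkingDirectory=/home/jupyter
ExecStart=/opt/anaconda3/bin/jupyter notebook --ip=0.0.0.0 --no-browser
Restart=always

[Install]
WantedBy=multi-user.target
```

After creating such a file you would run “sudo systemctl daemon-reload && sudo systemctl enable --now jupyter”, and the notebook server then starts on every boot without any SSH session.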

 purnima@purnima:~$ export CLOUD_SDK_REPO="cloud-sdk-$(lsb_release -c -s)"
purnima@purnima:~$ echo "deb $CLOUD_SDK_REPO main" | sudo tee -a /etc/apt/sources.list.d/google-cloud-sdk.list
[sudo] password for purnima: 
deb cloud-sdk-bionic
purnima@purnima:~$ curl | sudo apt-key add -
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  1326  100  1326    0     0   2135      0 --:--:-- --:--:-- --:--:--  2131
purnima@purnima:~$ sudo apt-get update && sudo apt-get install google-cloud-sdk
E: Malformed entry 1 in list file /etc/apt/sources.list.d/google-cloud-sdk.list (Component)
E: The list of sources could not be read.

I am getting the above error after adding the Google Cloud keys.
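The likely cause: the URLs were stripped out of the pasted commands, so the deb line written to google-cloud-sdk.list has no repository URL (you can see that in the “deb cloud-sdk-bionic” output), and the curl command lost its URL the same way. A sketch of the corrected entry, using the standard Cloud SDK apt repository URL:

```shell
# Rebuild the apt source entry with the repository URL included.
# Falls back to "bionic" if lsb_release is unavailable.
CLOUD_SDK_REPO="cloud-sdk-$(lsb_release -c -s 2>/dev/null || echo bionic)"
DEB_LINE="deb http://packages.cloud.google.com/apt $CLOUD_SDK_REPO main"
echo "$DEB_LINE"

# Then, run manually (needs sudo; tee WITHOUT -a, to overwrite the malformed entry):
#   echo "$DEB_LINE" | sudo tee /etc/apt/sources.list.d/google-cloud-sdk.list
#   curl -s https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo apt-key add -
#   sudo apt-get update && sudo apt-get install google-cloud-sdk
```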


One thing: anyone who built their image on GCP before the course will have to clone the course content.

One way to do that inside the server is:

sudo su jupyter
cd /home/jupyter
git clone

You can see them:

$ ls -l
total 12
drwxr-xr-x 5 jupyter jupyter 4096 Oct 23 02:18 course-v3
drwxr-xr-x 5 jupyter jupyter 4096 Oct 21 17:23 tutorials


Hi, I followed the steps for using GCP provided by this link:

It works, I was able to run the lesson 1 jupyter notebook.

However, when I want to perform the update (following the instructions linked above), the “sudo /opt/anaconda3/bin/conda update fastai” command asks for a sudo password I did not set previously … any idea what the password is, or how I could set it?

Run “sudo su jupyter” first! It doesn’t ask for a password then, does it?

Well, it seems that the same command without sudo works… I’ll leave it as it is for now, since sudo isn’t needed :slight_smile:

I am on Windows 7, so there is no Windows Subsystem for Linux.
How can I run this setup code?

export IMAGE_FAMILY="pytorch-1-0-cu92-experimental" # or "pytorch-1-0-cpu-experimental" for non-GPU instances
export ZONE="us-west1-b"
export INSTANCE_NAME="my-fastai-instance"
export INSTANCE_TYPE="n1-standard-8"
gcloud compute instances create $INSTANCE_NAME \
        --zone=$ZONE \
        --image-family=$IMAGE_FAMILY \
        --image-project=deeplearning-platform-release \
        --maintenance-policy=TERMINATE \
        --accelerator='type=nvidia-tesla-p100,count=1' \
        --machine-type=$INSTANCE_TYPE \
        --boot-disk-size=120GB \
        --metadata='install-nvidia-driver=True'

Use Google Cloud Shell:

No need to set up anything locally.


In terms of zone, us-west1 usually has the cheapest pricing.


That is usually because you were running the Jupyter notebook installed by default by the GCP image. When you start the server, make sure it is starting the one from your environment.

Do you run into any problems when executing untar_data?
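A quick way to check which Jupyter/Python your shell would launch (these commands are generic, not specific to the GCP image):

```shell
# Show which jupyter and python the shell resolves, and which env python lives in.
command -v jupyter || echo "no jupyter on PATH"
PYBIN="$(command -v python3 || command -v python)"
echo "python: $PYBIN"
"$PYBIN" -c "import sys; print(sys.prefix)"
```

If sys.prefix points at the system Python rather than your conda environment, you are starting the wrong server.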

Is this a thread where I can post issues with billing on GCP? Jeremy mentioned that there are representatives from AWS and Google who can help with setting up. I have already set up, but have some issues with billing. If I can ask here, @sgugger, please ‘like’ this reply so that I can go ahead and ask my billing questions.