Google Cloud Platform

Hello,

I am following the guidance here: https://course.fast.ai/start_gcp.html

Is anyone else having quota problems with GCP when trying to set up an instance?

export IMAGE_FAMILY="pytorch-latest-gpu"
export ZONE="us-west2-b"
export INSTANCE_NAME="my-fastai-instance"
export INSTANCE_TYPE="n1-highmem-8"
gcloud compute instances create $INSTANCE_NAME \
  --zone=$ZONE \
  --image-family=$IMAGE_FAMILY \
  --image-project=deeplearning-platform-release \
  --maintenance-policy=TERMINATE \
  --accelerator="type=nvidia-tesla-p4,count=1" \
  --machine-type=$INSTANCE_TYPE \
  --boot-disk-size=200GB \
  --metadata="install-nvidia-driver=True" \
  --preemptible

I receive this error in return:

ERROR: (gcloud.compute.instances.create) Could not fetch resource: Quota 'GPUS_ALL_REGIONS' exceeded. Limit: 0.0 globally.

What quota do I need to change on the free account under 'IAM & admin' > 'Quotas'? I would like to stick with standard Compute + Storage.

I have linked my billing account to the project.

Thank you for any advice in advance.

Rachel

Hey, I have a similar problem… I thought it wasn't working because I am in Europe and need a different zone.

Following the instructions, it says:
ERROR: (gcloud.compute.instances.create) Could not fetch resource:

  • Invalid value '“europe-west4-a“'. Values must match the following regular expression: '[a-z](?:[-a-z0-9]{0,61}[a-z0-9])?'
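Judging by the error, the curly quotes that show up inside the reported value look like they became part of the zone string, which usually happens when the export command is pasted with smart quotes. Re-exporting the zone with plain ASCII quotes should get past this particular error; a quick sketch, using the zone from the post above:

export ZONE="europe-west4-a"   # plain double quotes, not curly ones
echo $ZONE                     # should print exactly: europe-west4-a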

Please use the appropriate topic:

1 Like

Hey, @datalass1.

I’m just getting started with Part 1 2019 using GCP myself, and I ran into the same error message about “Quota ‘GPUS_ALL_REGIONS’ exceeded. Limit: 0.0 globally.” while following the setup directions you posted.

A little Googling turned up this suggestion. I followed the instructions in the only answer posted: filter the quota list to the "GPUs (all regions)" metric, then click "EDIT QUOTAS". That let me submit a request to increase my GPU count from 0 to 1.

My request hasn’t been approved yet, but it seems like it has potential :crossed_fingers:

1 Like

Just a quick update: I received a couple of emails from Google Cloud Support between yesterday and today, giving me periodic updates about their review of my request. About 24 hours after my request was submitted, I received an email saying that my request had been approved.

When I reran the code in the [setup instructions](https://course.fast.ai/start_gcp.html), my instance was created without any issues or error messages.

I also notice that this exact workaround is provided in the Google Cloud Platform instructions. (Looks like they were added yesterday.) Awesome :+1:
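If you want to confirm the new limit from the command line instead of the console, the project-wide quota list includes the GPUS_ALL_REGIONS metric. A quick check might look like this (a sketch; the project ID is a placeholder for your own):

# show the limit, metric, and usage lines for the global GPU quota
gcloud compute project-info describe --project my-fastai-project | grep -B 1 -A 1 GPUS_ALL_REGIONS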

Thank you @hoxie this was exactly what I needed to do and I can now ssh to GCP. I’m enjoying my free credits and hope you do too!

Hi, how much time did it take for your request to get approved?

In my case, a few hours.

A trick that most of you probably know, but maybe some Windows users don't: you can create an alias for the long connect command so that you don't have to type or copy-paste it each time:

gcloud compute ssh --zone=$ZONE jupyter@$INSTANCE_NAME -- -L 8080:localhost:8080

In Ubuntu, go to your home folder (cd ~), where you should have a .bashrc file. It's hidden, so ls won't show it, but you can see it with ls -a. Then open it in an editor; I prefer nano:

nano .bashrc

Scroll to the bottom and add the line

alias gc='gcloud compute ssh --zone=$ZONE jupyter@$INSTANCE_NAME -- -L 8080:localhost:8080'

filling in your own details.

Save, exit back to the terminal, and run source .bashrc to load the changes. Then all you need to do to connect to your instance is type gc!

Steps without text

  1. cd ~
  2. nano .bashrc
  3. Add alias gc='gcloud compute ssh --zone=$ZONE jupyter@$INSTANCE_NAME -- -L 8080:localhost:8080' with your details at the bottom of file
  4. Save and exit
  5. source .bashrc
  6. Type gc to start instance

Hope this helps
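One note on step 3: in a fresh terminal, $ZONE and $INSTANCE_NAME won't be set, so either substitute the literal values into the alias or export them in .bashrc as well. A sketch of the relevant lines, assuming the zone and instance name used earlier in this thread:

# at the bottom of ~/.bashrc
export ZONE="us-west2-b"                   # your instance's zone
export INSTANCE_NAME="my-fastai-instance"  # your instance's name
alias gc='gcloud compute ssh --zone=$ZONE jupyter@$INSTANCE_NAME -- -L 8080:localhost:8080'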

7 Likes

First, thanks for posting this. I followed your instructions; however it gives me this:

I saved it by pressing Ctrl + X.
Then it asked me whether to save it as .bashrc and I just hit Enter.
But when I go back, it shows me what is in the picture and stays there. When I type the whole command, it works, but just typing 'gc' doesn't.

Hey Christian, sorry it isn’t working. Can you post the exact line you put in your .bashrc file? To save make sure you type ‘y’ then enter. Did you make sure to refresh it by running source .bashrc? Here’s my exact line, looks like we chose the same instance name haha.

alias gc='gcloud compute ssh --zone=us-west2-b jupyter@my-fastai-instance -- -L 8080:localhost:8080'

Post more details and we’ll get it figured out. It’s a huge time saver so worth any trouble imo.

1 Like

Thanks for the willingness to help out. I figured it out after a few tries. Your documentation is well done. The only thing I was doing wrong was putting spaces around the '=' sign, like this:

alias gc = 'gcloud…

I fixed it by having no space in between, as in your post:

alias gc='gcloud…

Now it’s working. Thanks for this time saver!

I have my data stored in Google Drive; how can I access those files from GCP?

[EDIT] I tried wget

This downloads the data to the VM; however, I didn't want to DOWNLOAD the data at all, just access Google Drive directly from the VM.

So I have an even more basic question.
I am running Windows 7.
I access Google Compute Engine and JupyterHub from a virtual machine running Ubuntu.

When I download images in the first lesson, let's say this image:
/home/jupyter/.fastai/data/oxford-iiit-pet/images/wheaten_terrier_36.jpg


#1
Where is it downloaded to?
Where can I find it?

Is it on my local virtual machine (Ubuntu)?
Is it on my Google Compute Engine instance? If so, how do I get access to it there?

When I go to JupyterHub I start in tutorials/fastai/nbs/dl1…
I can't find the images there, or in any other folder there, for that matter…


#2
Another question: when I try to play with my own dataset, where do I put the files so that I can access them? In my JupyterHub?

1 Like

I’m new here as well, but let me take a shot at answering your questions.

#1
Those images are downloaded to the Google Compute Engine instance. When you ran gcloud compute ssh --zone=$ZONE jupyter@$INSTANCE_NAME -- -L 8080:localhost:8080 you were logged into that GCP instance. Whatever you do on the Jupyter notebooks will only affect that instance and not your local system.

The images reside in the /home/jupyter/.fastai/data folder of your GCP instance. You won't see these folders in Jupyter because they're hidden, but you can use the terminal window you used to run the above command to explore these files with the usual Linux commands:

cd /home/jupyter/.fastai/data
ls -la
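If you'd rather check from inside a notebook, the lesson's download helper reports the path it used. A minimal sketch, assuming fastai v1 as used in the 2019 course:

from fastai.vision import *     # course-style import; brings in untar_data and URLs

path = untar_data(URLs.PETS)    # downloads/extracts under ~/.fastai/data on the instance
print(path)                     # e.g. /home/jupyter/.fastai/data/oxford-iiit-pet
print((path/'images').ls()[:5]) # a few of the downloaded image files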

#2
You can find a way to build your own dataset in the lesson2-download notebook. If you already have the dataset on your local machine, you can just upload it to JupyterHub as well.

Thank you very much for your prompt and straightforward reply.

Follow-up to #2:
So I have my dataset on my virtual machine; it's over 6,000 images. Dragging and dropping into JupyterHub prompts me to click "upload" on every file, and clicking 6,000+ times doesn't look efficient, so there has to be a better way.

I Googled around and found other people having the same issue (https://stackoverflow.com/questions/34734714/ipython-jupyter-uploading-folder), with the resolution in the official github.com/jupyter issue being:

"Convert it into a single Zip file and upload that. to unzip the folder use the code down bellow

import zipfile as zf
files = zf.ZipFile("ZippedFolder.zip", 'r')
files.extractall('directory to extract')
files.close()

"

That’s great, but typing this in my GCE Jupyter Hub:

import os
import zipfile as zf

os.getcwd()
os.chdir('/')

ls

reveals the following GCE folders…

  • bin/
  • home/
  • lib64/
  • opt/
  • sbin/
  • usr/
  • boot/
  • initrd.img@
  • lost+found/
  • proc/
  • srv/
  • var/
  • dev/
  • initrd.img.old@
  • media/
  • root/
  • sys/
  • vmlinuz@
  • etc/
  • lib/
  • mnt/
  • run/
  • tmp/
  • vmlinuz.old@

And I am not really sure there is a way to navigate to my VM's desktop.

Are there any solutions other than what I described? Is there a way to navigate from GCE to the VM desktop in JupyterHub, or better yet, is there a way to just store these images in Google Cloud Storage and then pull them from there?

Okay so here is one way:

Under the Google Cloud Platform main menu, go to Storage.
Create a storage bucket.
Put a test CSV file in it.

Then go to your Compute Engine instances.
Find your fastai instance and, under the "Connect" column, click SSH.

In the resulting window, follow (most of) the instructions starting at 1:20 in the video here (but read the additional points below before that):

You will have to install GCSfuse to connect your bucket to your virtual machine:

After the install script above, authorize access (so that you don't get a security / bucket-access error) by running the following:

gcloud auth application-default login
(from https://esc.sh/blog/mount-gcs-bucket-linux/)

Then follow the instructions to authorize yourself: open the link, copy the verification code back into the terminal, and so on.

When the video instructions get to this command:
gcsfuse YOURBUCKETNAMEHERE mnt/gcs-bucket

Change it to:
/usr/bin/gcsfuse BUCKETNAMEHERE /mnt/gcs-bucket

Why?
Follow this thread.

If /usr/bin/gcsfuse doesn't work, you can find where your gcsfuse binary got installed by running the following command:

whereis gcsfuse
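For reference, since the video itself isn't reproduced here, the terminal side boils down to roughly the following. This is a sketch assuming a Debian/Ubuntu image, a mount point at /mnt/gcs-bucket, and a placeholder bucket name:

# add Google's apt repo and install gcsfuse
export GCSFUSE_REPO=gcsfuse-$(lsb_release -c -s)
echo "deb http://packages.cloud.google.com/apt $GCSFUSE_REPO main" | sudo tee /etc/apt/sources.list.d/gcsfuse.list
curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo apt-key add -
sudo apt-get update && sudo apt-get install -y gcsfuse

# authorize, create a mount point, and mount the bucket
gcloud auth application-default login
sudo mkdir -p /mnt/gcs-bucket
sudo chmod a+w /mnt/gcs-bucket
/usr/bin/gcsfuse YOURBUCKETNAMEHERE /mnt/gcs-bucket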

Finish the video and that's it; you should be good to go.
Go to JupyterLab.
Create a notebook.
Run your checks:

import pandas as pd
import numpy as np
data = pd.read_csv("gs://YOURBUCKETNAMEHERE/YOURTESTFILE.csv")
data.head(5)
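A note on that last cell: reading a gs:// URL directly from pandas relies on the gcsfs package being installed in the notebook environment. If it isn't, you can read the same file through the gcsfuse mount instead; a sketch, assuming the mount point and placeholder file name used above:

import pandas as pd

# read through the locally mounted bucket rather than the gs:// URL
data = pd.read_csv('/mnt/gcs-bucket/YOURTESTFILE.csv')
data.head(5)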

Still wondering if there is a way to read stuff from my desktop in the VM's JupyterHub…

1 Like

I don't think there's a trivial way to read files on your desktop directly from your VM. You'd either have to copy them over to some cloud location like you've just done, or you can use the scp command to transfer the files directly to the VM.

Instructions for doing this on GCP are here. Check the examples section. It's pretty much the same as the ssh command, and you can transfer whole folders without making them into zip files.
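For example, a sketch of what that could look like; the local folder name is a placeholder, and the instance name and zone are the ones used earlier in this thread:

# copy a local folder of images up to the instance's home directory for the jupyter user
gcloud compute scp --recurse ~/my-images jupyter@my-fastai-instance:/home/jupyter/ --zone=us-west2-b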

Nice! Thanks! I will give it a try as well. Not sure how fast Google Cloud Storage is vs. having the files directly on the VM.

Update: just found another way, even simpler, but one file at a time… :(
Same as before, click SSH under Google Cloud Platform.
Click on the gear-looking icon in the right-hand corner. There is an option to upload a file…

Taken from here