Platform: GCP ✅

Thanks a lot! Using the Cloud Shell helped me create the instance correctly.

Now I don’t understand where to type these two commands:

gcloud compute ssh --zone=$ZONE jupyter@$INSTANCE_NAME -- -L 8080:localhost:8080
git clone https://github.com/fastai/course-v3

Is it in the Google Cloud Shell you suggested, or in the Google Cloud SDK shell I downloaded locally, or are they the same thing?

I see I can access http://localhost:8080/tree after I run the gcloud command in the SDK shell (I had to replace $ZONE and $INSTANCE_NAME with their full strings because the SDK shell did not recognize export).
Surprisingly, I see a tutorials folder containing data, fastai, and pytorch folders. Where did this come from, and when did I define them?

I can’t do git clone in the SDK shell after typing the first gcloud command because it does not accept any more input. How do I get the course-v3 stuff then?

Does the Cloud Shell interact with my local files?
Could you please explain, big picture, what is going on both locally and remotely when those two commands are used?

P.S. My Windows 7 does not have WSL, so I can’t just copy-paste the Linux commands from the guide.

The first one goes in the Cloud Shell; it SSHes into the server you created.
Run the second one after you have SSHed into the server. (Basically, all of this happens inside the Cloud Shell.)
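
Roughly, the sequence looks like this (a sketch; $ZONE and $INSTANCE_NAME are the variables from the setup guide):

# in the Cloud Shell: open the SSH session, which also sets up the tunnel
gcloud compute ssh --zone=$ZONE jupyter@$INSTANCE_NAME -- -L 8080:localhost:8080
# the prompt now belongs to the remote instance; clone the course there
git clone https://github.com/fastai/course-v3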

I have also written a guide that goes through things step by step with GCP. Try that if you are still having problems.

You can type it in any shell once you have downloaded the Google Cloud SDK.

The Google Deep Learning image preinstalls them for you as demos.

You are supposed to clone it on the Google Cloud instance you are connected to.

No, it doesn’t.

Basically, you are controlling the remote computer and projecting its shell into your local shell. SSH also tunnels the data from the Jupyter notebook to your localhost:8080, so you can work directly with the notebook running on the remote server from the convenience of your browser.
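
For intuition, the gcloud command is essentially a wrapper around a plain SSH tunnel, something like this (a sketch; EXTERNAL_IP is a placeholder for your instance’s external IP, and the key path is where gcloud normally puts its key):

# rough plain-ssh equivalent of the gcloud command
ssh -i ~/.ssh/google_compute_engine -L 8080:localhost:8080 jupyter@EXTERNAL_IP
# -L 8080:localhost:8080 forwards your local port 8080 to port 8080 on the remote machine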

No; as I mentioned, billing questions should be asked through Google support.

Thanks a lot. I guess I missed that part.

Apologies if this is a silly question, but how do I access the Jupyter notebook after creating the VM? I have an external IP address available; do I just copy-paste it into my browser window along with the port (8888)?

See step 4 of the guide:

http://course-v3.fast.ai/start_gcp.html
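
In short, step 4 has you tunnel in rather than hitting the external IP and port 8888 directly (using the guide’s variables):

gcloud compute ssh --zone=$ZONE jupyter@$INSTANCE_NAME -- -L 8080:localhost:8080
# then open http://localhost:8080/tree in your local browser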

Thanks for making that guide! However, it would be best if folks used the official setup guide now that we have one. 🙂

Replace everything that starts with $ with the contents of that variable listed above. It should then work in Windows 7.
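
For example, with the guide’s default values substituted in (these names are just the guide’s examples; use your own zone and instance name), the command becomes:

gcloud compute ssh --zone=us-west1-b jupyter@my-fastai-instance -- -L 8080:localhost:8080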

That generally means you missed the jupyter@ part of the command in the setup guide:

gcloud compute ssh --zone=$ZONE jupyter@$INSTANCE_NAME -- -L 8080:localhost:8080

Note that if you forget this the first time you access an instance, you’ll need to create a whole new instance, since it uses that first run to set up security.
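
If you do need to start over, something like this should work (a sketch using the guide’s variables):

# delete the misconfigured instance, then re-run the create command from the guide
gcloud compute instances delete $INSTANCE_NAME --zone=$ZONE
# this time, ssh in as jupyter@ on the very first connection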

Just a note: the official fastai image already has that set up for you. SSH is only used to create the tunnel, so you don’t have to open any ports other than SSH.

Ah nice.

I found a simplification for this, to avoid switching between users.
The Deep Learning platform allows adding jupyter-user metadata while creating an instance.

So if we add this to the metadata

--metadata='install-nvidia-driver=True,jupyter-user=[MAIN_USER_NAME]'

in the creation command, the instance will not create a jupyter user, and the default Jupyter (the one running on 8080) will run as the chosen user instead.

[MAIN_USER_NAME] is your gcloud user in snake_case notation (e.g. Thomas Anderson becomes thomas_anderson).
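
Putting it together, the creation command would look something like this (a sketch only; the image family, accelerator, disk size, and user name here are examples, not a confirmed recipe, so adjust them to match the guide and your account):

gcloud compute instances create $INSTANCE_NAME \
    --zone=$ZONE \
    --image-family=pytorch-latest-gpu \
    --image-project=deeplearning-platform-release \
    --maintenance-policy=TERMINATE \
    --accelerator='type=nvidia-tesla-p4,count=1' \
    --boot-disk-size=200GB \
    --metadata='install-nvidia-driver=True,jupyter-user=thomas_anderson' \
    --preemptible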

I’m stuck in the tunnel…

It seems I can’t run gcloud on my Mac, so I’ve run the GCP commands (including gcloud) on my Linode Ubuntu instance. However, I host multiple domains on that Linode instance, and this seems to be interfering with the SSH tunnel. Any hints on how to get this working?

gcloud compute ssh --zone=$ZONE jupyter@$INSTANCE_NAME -- -L 8080:localhost:8080

will run without errors, but I can’t access ip-of-my-linode-instance:8080/tree.

Can someone please help me with installing the Kaggle API? I tried !pip install kaggle and !pip install --user kaggle, but I still get a /bin/sh: 1: kaggle: not found error when trying !kaggle --help.

Maybe you need to install it globally with sudo.
In the SSH terminal (PuTTY) you can try sudo /opt/anaconda3/bin/pip install kaggle

sudo - to install it globally
/opt/anaconda3/bin - to use the anaconda3 env, not sudo’s default Python 2.7 env

Notice that in the Jupyter terminal you have the jupyter user, which has limited access.
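
Another possibility (an assumption about the cause, not a confirmed diagnosis): pip install --user puts the kaggle script in ~/.local/bin, which may not be on the notebook’s PATH. From a notebook cell you could check:

# call the CLI by its full path to see whether it is installed at all
!~/.local/bin/kaggle --help
# if that works, extend PATH for this notebook session and retry
import os
os.environ['PATH'] += os.pathsep + os.path.expanduser('~/.local/bin')
!kaggle --help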

Does that imply I should create a Python 2 notebook, not a Python 3 notebook?

You should use Python 3, because fastai requires Python 3.6.

Basically, the point is that there is a “standalone” Python 2.7 installed outside of conda, which is used by default with sudo.
So when using sudo, you have to specify that you want the conda Python (3.6), not the standalone Python (2.7).
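
A quick way to see the difference for yourself (assuming the paths from the post above):

python --version                           # conda Python 3.6 for the normal user
sudo python --version                      # the standalone Python 2.7
sudo /opt/anaconda3/bin/python --version   # the conda Python, via its full path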

I read through the official guide and noticed that ZONE=“us-west1-b”. So, if I live in the UK, is there any additional benefit to choosing a zone nearer to where I live? I realise that GCP charges a bit extra for European zones.

Question about boot disk size. If we want more than 120 GB, should we just increase the boot disk size, or should we have a separate disk (possibly SSD) that contains our data and somehow attach it? Anyone with some experience on that front?

I’m asking as some Kaggle image classification competitions have few hundred GBs of data.
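
If you go the separate-disk route, here is a sketch of the gcloud side (the disk name, size, and device path are examples; check the actual device with lsblk before formatting):

# create a 500 GB SSD data disk and attach it to the instance
gcloud compute disks create my-data-disk --size=500GB --type=pd-ssd --zone=$ZONE
gcloud compute instances attach-disk $INSTANCE_NAME --disk=my-data-disk --zone=$ZONE
# then, on the instance (one time only): format and mount it
sudo mkfs.ext4 /dev/sdb
sudo mkdir -p /mnt/data && sudo mount /dev/sdb /mnt/data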
