How to set up env using Google Cloud?


#1

Hi, is there anybody using Google Cloud for part 1 of this course? How should the AWS-related scripts be modified? Google seems to offer a more generous free plan than AWS: $300 in credit that can be used for anything, including GPU use.

I’ve just finished Andrew Ng’s online ML course. Fortunately I found fast.ai, and after watching a couple of videos I decided to begin this terrific part 1 course :slight_smile:

Thanks for your help in advance,
and many many thanks to Jeremy & Rachel for giving us this great opportunity.


In-class discussion: Introductory workshop
Paperspace setup help
#2

I’m using Google Cloud but I’ve changed a few things to use it in combination with Anaconda3 and Keras2. I’ll post the script I’ve used.


#3

I’ve made a few changes to install-gpu.sh from https://github.com/fastai/courses.git, but nothing specific was necessary to make it work on Google Cloud.


# ensure system is updated and has basic build tools
sudo apt-get update
sudo apt-get --assume-yes upgrade
sudo apt-get --assume-yes install tmux build-essential gcc g++ make binutils
sudo apt-get --assume-yes install software-properties-common

# download and install GPU drivers

# see https://cloud.google.com/compute/docs/gpus/add-gpus#install-gpu-driver
echo "Checking for CUDA and installing."
# Check for CUDA and try to install.
if ! dpkg-query -W cuda; then
  curl -O http://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/cuda-repo-ubuntu1604_8.0.61-1_amd64.deb
  sudo dpkg -i ./cuda-repo-ubuntu1604_8.0.61-1_amd64.deb
  sudo apt-get update
  sudo apt-get install cuda -y
fi

# verify that GPU driver installed
sudo modprobe nvidia
nvidia-smi

sudo apt-get --assume-yes install libcupti-dev

# install Anaconda for current user
mkdir -p downloads
cd downloads
wget "https://repo.continuum.io/archive/Anaconda3-4.3.1-Linux-x86_64.sh" -O "Anaconda3-4.3.1-Linux-x86_64.sh"

bash "Anaconda3-4.3.1-Linux-x86_64.sh" -b

echo "export PATH=\"$HOME/anaconda3/bin:\$PATH\"" >> ~/.bashrc
export PATH="$HOME/anaconda3/bin:$PATH"
conda install -y bcolz
conda upgrade -y --all

# install and configure theano
conda install -y theano pygpu
echo "[global]
device = cuda0
floatX = float32
[cuda]
root = /usr/local/cuda" > ~/.theanorc

# install and configure keras
conda install -y keras
mkdir ~/.keras
echo '{
    "epsilon": 1e-07,
    "floatx": "float32",
    "backend": "theano",
    "image_data_format": "channels_first"
}' > ~/.keras/keras.json

# install cudnn libraries
wget "http://files.fast.ai/files/cudnn.tgz" -O "cudnn.tgz"
tar -zxf cudnn.tgz
cd cuda
sudo cp lib64/* /usr/local/cuda/lib64/
sudo cp include/* /usr/local/cuda/include/

# configure jupyter and prompt for password
jupyter notebook --generate-config
jupass=`python -c "from notebook.auth import passwd; print(passwd())"`
echo "c.NotebookApp.ip = '*'
c.NotebookApp.password = u'"$jupass"'
c.NotebookApp.open_browser = False
c.NotebookApp.port = 9999" >> $HOME/.jupyter/jupyter_notebook_config.py

# clone the fast.ai course repo and prompt to start notebook
cd ~
git clone https://github.com/fastai/courses.git
echo "\"jupyter notebook\" will start Jupyter on port 9999"
echo 'If you get an error instead, try restarting your session so your $PATH is updated'
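
One thing the script doesn’t do: by default GCP’s firewall blocks incoming traffic, so to reach the notebook on port 9999 from your browser you’ll likely also need a firewall rule, something like this (the rule name is arbitrary; ideally restrict --source-ranges to your own IP instead of 0.0.0.0/0):

```shell
# open port 9999 so Jupyter is reachable from outside the instance
gcloud compute firewall-rules create allow-jupyter \
    --allow=tcp:9999 \
    --source-ranges=0.0.0.0/0
```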

#4

Nice to hear that you are using Google Cloud with no issue. I will try your “install-gpu.sh” on Anaconda3 & Keras2 as you did.

Thank you very much!


#5

Okay, good luck! Don’t hesitate to ask if you can’t make it work. Please note that I also had to install theano, pygpu and libcupti-dev. I’ve edited the script above to include these.


#6

Got it. I am new to cloud environments, so I may ask you again :slight_smile:
Thanks so much for your kind advice!


#7

I noticed another tiny change (my apologies for this):
~/.theanorc should be:

[global]
device = cuda0
floatX = float32
[cuda]
root = /usr/local/cuda

#8

Oh that’s fine. I am planning to set up my env this weekend and start learning :slight_smile: I will post a question if I have any issues.
Thanks so much for your help!


(layla.tadjpour) #9

I wanted to try Google Cloud for the same reason, but it did not let me use my $300 credit for GPUs. It seems it only works for CPU usage!


#10

Hmm… I will check whether I can use a GPU.
It seems Stanford class students can use both the $100 class credit and the default $300.

http://cs231n.github.io/gce-tutorial/

Thanks for your comments!


#11

If you select “US” zones such as “us-west1-b”, there is an option to use GPU.
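
If you prefer the CLI over the web console, a GPU instance in such a zone can be requested roughly like this (the instance name, machine type, image and disk size are just choices I’m sketching, not requirements; at the time, the --accelerator flag required the gcloud beta track):

```shell
# request an instance with one K80 GPU in a zone that offers GPUs
gcloud beta compute instances create fastai-gpu \
    --zone=us-west1-b \
    --machine-type=n1-standard-4 \
    --accelerator=type=nvidia-tesla-k80,count=1 \
    --maintenance-policy=TERMINATE \
    --image-family=ubuntu-1604-lts \
    --image-project=ubuntu-os-cloud \
    --boot-disk-size=50GB
```

Note that you may first need to request a GPU quota increase for your project before this command succeeds.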


#12

I put up a mini guide here for what I did to retro-fit the AWS install script/video to work with GCP. Please feel free to let me know if something looks unclear.


#13

Thank you for sharing your script with us! This should automate a lot of the work that would otherwise be done in the gcloud web console.


(layla.tadjpour) #14

thanks. I will try it.


(John) #15

Thanks for putting together this guide. With the $300 free credit, GCP is indeed a very cost-effective alternative to AWS; I also verified that the credit is applied against the GPU cost. Even without prior Unix experience, I was able to follow the steps in your guide, except the final step of starting the Jupyter Notebook server. While executing the command, I get the following error:

~$ sudo auth_and_start.sh jupyter notebook
Traceback (most recent call last):
  File "/usr/local/bin/lookup_value_from_json", line 6, in <module>
    data = json.load(open(sys.argv[1], 'r'))
FileNotFoundError: [Errno 2] No such file or directory: 'client_email'

Am I missing something? Thanks for your time!


#16

Hi,
client_email is pulled from the google_service_key.json file that you should have downloaded from Google Cloud. Search the readme for the line that contains “service key”.
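
For reference, the error just means the helper couldn’t find that key file: as far as I can tell, lookup_value_from_json simply reads one field out of the downloaded JSON. A rough sketch of the idea, with a made-up sample file (the file contents and path here are purely illustrative):

```shell
# write a fake service key file so the example is self-contained
cat > /tmp/google_service_key.json <<'EOF'
{"client_email": "my-sa@my-project.iam.gserviceaccount.com"}
EOF

# extract one field from the JSON, roughly what the helper does
python3 -c "import json, sys; print(json.load(open(sys.argv[1]))['client_email'])" \
    /tmp/google_service_key.json
```

So the fix is simply to make sure the downloaded google_service_key.json is where the script expects it.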

Note that the only reason to use the auth_and_start.sh bit is so that you can use a Google bucket or, say, BigQuery from your Python notebook. If you are only using the fast.ai course notebooks, you can just as well open Jupyter by running

~$ jupyter notebook 


#17

Hi friends, I am new to ML and was trying to use GCP since they offer $300 in free credits. I configured an instance on GCP as described in the post http://cs231n.github.io/gce-tutorial/, and I also installed the gcloud tools on my machine. But I am not clear on how to proceed further. There are a couple of scripts: one mentioned by @sebastian, and a GitHub post by @eshvk. Can someone guide me on how to proceed? I would like to set up my instance with all the necessary stuff in GCP instead of AWS. Please let me know if you need more details. Thanks.


#18

I’d love to help, but can you please be a bit more specific about what problems/issues you encountered?

If it’s just the basics of Google Cloud, then it might help to first get a cheap tiny instance up and running, and see if you can ssh into that instance, install conda, etc. The Google Cloud documentation is excellent; see for example https://cloud.google.com/compute/docs/quickstart-linux for some basics.
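
For example, a minimal sequence to practice with could look like this (the instance name and zone are just placeholders I’ve picked):

```shell
# create a small, cheap test instance (no GPU) to get familiar with GCE
gcloud compute instances create test-instance \
    --zone=us-west1-b \
    --machine-type=f1-micro \
    --image-family=ubuntu-1604-lts \
    --image-project=ubuntu-os-cloud

# ssh into it; once connected, try "sudo apt-get update", installing conda, etc.
gcloud compute ssh test-instance --zone=us-west1-b

# delete it when done so it stops accruing charges
gcloud compute instances delete test-instance --zone=us-west1-b
```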

Once you have an instance with a GPU running, GCE and AWS are similar, but in general I find the dashboard and tools offered by Google much easier to use, and at a lower price.


#20

Hi, thanks. I created an instance in Google Cloud as per the post http://cs231n.github.io/gce-tutorial/, but I was not sure how to use the install-gpu.sh that you provided. Also, I am not quite clear on how to install conda, etc. I am new to Google Cloud and ML and just trying to get started.


#21

Did you look at the mini guide eshvk posted? (See https://github.com/eshvk/gcp-dl.) I think it might be more useful than the cs231n tutorial, and it also covers installing Anaconda. (The cs231n tutorial uses virtualenv instead of Anaconda, which is a bit different from what Jeremy uses in this course.)

You can also ignore my script, because it was more of a hackish solution and uses Python 3 and Keras 2.