Okay so here is one way:
Under the Google Cloud Platform main menu, go to Storage.
Create a storage bucket.
Put a test CSV file in it.
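If you'd rather do the upload from the command line, the equivalent with the Cloud SDK on your desktop is something like this (placeholder file and bucket names):
gsutil cp YOURTESTFILE.csv gs://YOURBUCKETNAMEHERE/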
Then go to your Compute Engine instances.
Find your fast.ai instance and, under the “Connect” column, click SSH.
In the resulting window, follow (most of) the instructions starting at 1:20 in the video here (but read the additional points below before you start):
You will have to install gcsfuse to connect your bucket to your virtual machine:
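On a Debian/Ubuntu image, the apt-based install from the gcsfuse docs looks roughly like this:
# add the gcsfuse apt repo for your distro release
export GCSFUSE_REPO=gcsfuse-`lsb_release -c -s`
echo "deb https://packages.cloud.google.com/apt $GCSFUSE_REPO main" | sudo tee /etc/apt/sources.list.d/gcsfuse.list
# trust Google's signing key, then install
curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo apt-key add -
sudo apt-get update
sudo apt-get install gcsfuse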
After the install script above, authorize access, so that you don't get a security/bucket-access error, by running the following:
gcloud auth application-default login
(from https://esc.sh/blog/mount-gcs-bucket-linux/)
Then follow the instructions and authorize yourself: open the link it prints, sign in, copy the verification code back into the terminal, and so on.
When the video's instructions reach this command:
gcsfuse YOURBUCKETNAMEHERE mnt/gcs-bucket
Change it to:
/usr/bin/gcsfuse YOURBUCKETNAMEHERE /mnt/gcs-bucket
Why? Follow this thread.
If
/usr/bin/gcsfuse
doesn't work, you can find where your gcsfuse binary was installed by running the following command:
whereis gcsfuse
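It prints the binary's location(s), typically something like gcsfuse: /usr/bin/gcsfuse; use whichever path it reports in the mount command above.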
Finish the video and that's it; you should be good to go.
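A quick sanity check before leaving the terminal, assuming the mount point above:
ls /mnt/gcs-bucket
Your test CSV should show up in the listing.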
Go to JupyterLab.
Create a notebook.
Run your checks:
import pandas as pd
import numpy as np
data = pd.read_csv("gs://YOURBUCKETNAMEHERE/YOURTESTFILE.csv")
data.head(5)
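Note that the gs:// URL goes through pandas' own GCS support, which needs the gcsfs package installed in the notebook environment. To exercise the gcsfuse mount itself, read from the mounted path instead; a sketch assuming the /mnt/gcs-bucket mount point and placeholder file name from above:
import pandas as pd
# read through the gcsfuse mount rather than the gs:// URL
data = pd.read_csv("/mnt/gcs-bucket/YOURTESTFILE.csv")
data.head(5)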
Still wondering if there is a way to read files from my desktop in the VM's JupyterLab…
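(One thing that might work, untested here: copy files up from the desktop with the Cloud SDK, e.g.
gcloud compute scp YOURLOCALFILE.csv YOURINSTANCENAME:~/ --zone YOURZONE
with placeholder file, instance, and zone names, then open them from JupyterLab.)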