Transfer your files easily using Google Drive

So I’m using GCP and there was no easy way to get files onto the machine (like scp) without creating buckets, so I found another way.

  1. Make a zip/rar of your data and upload it to Google Drive.
  2. Make the file public.
  3. On the remote machine, install gdown and download by typing in the terminal:
    pip install gdown
    gdown https://drive.google.com/uc?id=file_id

Where file_id is the ID portion of the shareable URL, i.e. the part between /d/ and /view. For example, in https://drive.google.com/file/d/1oc7HA5pHOr_UlrqlgbUSOvXmkHNd1IV0/view?usp=sharing the ID is 1oc7HA5pHOr_UlrqlgbUSOvXmkHNd1IV0.
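If you’d rather script the download than type the commands, gdown also has a Python API. Here is a minimal sketch; the output file name and the unzip step are just an example, assuming the upload was a zip:

    import zipfile

    import gdown  # pip install gdown

    # ID taken from the example link above; replace it with your own file's ID
    file_id = "1oc7HA5pHOr_UlrqlgbUSOvXmkHNd1IV0"

    # Download the archive from Google Drive
    gdown.download(f"https://drive.google.com/uc?id={file_id}", "data.zip", quiet=False)

    # Unpack it into a local folder
    with zipfile.ZipFile("data.zip") as zf:
        zf.extractall("data")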


This is great, but why would one download the data locally and then upload it to the VM? What are you using this for? :)

I was downloading X-rays showing hip dysplasia in dogs; finding files for this takes a lot of time, and I don’t pay for time on my own laptop.


Great! So you downloaded the dataset, cleaned it manually, and then uploaded it to Drive and got it onto the VM from there?


Almost. I made the dataset by hand (and it turns out this is really non-trivial; the data quality was quite meh and training failed miserably). You can’t get some data from a simple Google search, sadly.

This sounds interesting…

@Blanche: won’t this suffice for your needs?
https://cloud.google.com/sdk/gcloud/reference/compute/scp
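For reference, a single command along these lines should do it (the instance name and zone are placeholders):

    gcloud compute scp ./data.zip my-instance:~/data.zip --zone=us-central1-a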


Yes it will. I was googling this wrong, because I’d only seen info in the GCP manual about transferring data to/from buckets.


Do you know a good way to work with Google Drive?
I accidentally created 50 thousand small files in my main Google Drive directory.
And now it is a huge pain in the ass to remove them through the Google Drive UI, which is super slow.

Have you tried Google Colab?
You can mount your drive with this code:

    from google.colab import drive
    drive.mount('/content/gdrive')

Then you can work with your files using normal Python commands.
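Once it’s mounted, you can clear out those stray files with a few lines of Python. A rough sketch, assuming the files sit directly in your top-level Drive folder and share an extension like .csv (adjust the path and pattern to your case):

    from pathlib import Path

    # Drive contents appear under the mount point; the folder may be
    # "My Drive" or "MyDrive" depending on the Colab version
    drive_root = Path("/content/gdrive/My Drive")

    # Delete every matching file in the top-level folder
    # (*.csv is just an example pattern; change it to match your files)
    for f in drive_root.glob("*.csv"):
        f.unlink()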
