Loading images from Google Cloud, suggested hardware for large dataset

I was able to get my files into Google Cloud Storage on Google Cloud Platform, but then I faced the same problem wxng faced, so I had to work around it: upload the files from my local machine to the VM, then move them from one VM directory into the Jupyter directory. In order to move all 8,000+ files at once, I zipped them first, so I uploaded a single zip file.
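(In case it helps: something like the command below, run on the local machine, bundles a folder of images into one archive. The folder and archive names here are just placeholders.)

```
# run locally: pack the image folder into a single zip archive
zip -r testfile.zip path/to/your/images/
```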

So I pretty much followed these instructions:

Then, to move the uploaded zip file, I ran:
```
sudo mv /home/YOURDIRECTORY/testfile.zip /home/jupyter/
```

The YOURDIRECTORY part is whatever pops up in your SSH file-transfer window; it will be something like /home/XXXXXX.
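For example, with a hypothetical username jane_doe (yours will differ), it would look like:

```
# jane_doe is a made-up username; substitute your own home directory
sudo mv /home/jane_doe/testfile.zip /home/jupyter/
```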

Once I moved my file, I opened JupyterLab, started a new Python notebook, and ran the code below:

```python
import zipfile as zf

# open the uploaded archive and extract everything into the Jupyter directory
files = zf.ZipFile('testfile.zip', 'r')
files.extractall('/home/jupyter/')
files.close()
```

I would recommend creating another folder to extract into, e.g. /home/jupyter/data.
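A minimal sketch of that variant, assuming the zip was moved to /home/jupyter/ as above and using /home/jupyter/data as the target (just a suggested name); the with block also closes the archive for you:

```python
import os
import zipfile

extract_dir = '/home/jupyter/data'  # suggested extraction folder
os.makedirs(extract_dir, exist_ok=True)  # create it if it doesn't exist yet

# extract everything; the context manager closes the archive automatically
with zipfile.ZipFile('/home/jupyter/testfile.zip', 'r') as archive:
    archive.extractall(extract_dir)

# quick sanity check on the number of extracted entries
print(len(os.listdir(extract_dir)), 'entries in', extract_dir)
```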

In order to remove your zip file afterwards, you will have to do it via the SSH window. Navigate to your Jupyter directory, something like:

```
cd …
```

then see what's available:

```
ls -la
```

and cd your way into the jupyter directory.

Once you are in the same directory as your zip file, run:

```
sudo rm "filename"
```
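Alternatively (assuming the zip ended up in /home/jupyter/ as in the steps above), you can pass the full path and skip the navigation entirely:

```
sudo rm /home/jupyter/testfile.zip
```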

Hope this helps!
