Fellow deep learning practitioners,
I am currently trying to upload a "large" training dataset to Google Drive so that I can refer to it from Google Colab. However, the upload is awfully slow.
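For context, once the data is on Drive I access it from Colab with the standard `drive.mount` API (a minimal sketch; the dataset folder name is hypothetical, and the guard lets the snippet run outside Colab too):

```python
# drive.mount only exists inside Colab, so guard the import.
try:
    from google.colab import drive
    IN_COLAB = True
except ImportError:
    IN_COLAB = False

if IN_COLAB:
    # Standard Colab API: mounts your Drive under /content/drive.
    drive.mount("/content/drive")
    data_dir = "/content/drive/MyDrive/my_dataset"  # hypothetical folder name
else:
    data_dir = "./my_dataset"  # local fallback when not running in Colab

print(data_dir)
```

So the missing piece is only the upload step: getting the data onto Drive quickly in the first place.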
Have you found a way to transfer a large dataset to Google Drive quickly, e.g. by some route other than the Google Drive web UI? I searched Google a bit, but apart from FileZilla Pro I could not find a satisfying option. Moreover, FileZilla Pro works over FTP, and that is apparently not a fast way of moving files.
If the solution requires moving to a more professional platform (AWS, Paperspace, Google Cloud, …), so be it. But if there is a free way to achieve this, I am all ears.
Thank you for your support and your time!