Handling of HAM10000 data for learner exercise in Notebook

I am using the HAM10000 dataset (two zip archives containing ~10k images in total) for the exercise in the 02_production notebook (use transfer learning to train a model for a different purpose).
My problem is that when I copy all 10k images into the Gradient environment (note: I am on the free tier of Paperspace Gradient), navigating the folders becomes really slow and the browser tabs freeze for minutes.
My questions: is there a way to use the images without copying them into the Gradient environment? What is the cause of the lag? Is there a quicker/better way to handle the HAM10000 dataset?

Thanks for any advice.


The number of files is very high, and the Paperspace web UI has difficulty handling that many.

You can manage them from the CLI instead. In any notebook cell, a command starting with an exclamation mark is run in the Linux shell, for example:

!mkdir # make a directory
!mv # move files
!pwd # print the working directory
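You can also do the whole unzip-and-inspect step in Python from a cell, which avoids the file browser entirely. Below is a minimal sketch; the archive and folder names (demo.zip, extracted/) are placeholders for illustration, so substitute your actual HAM10000 zip paths:

```python
# Sketch: extract an image archive and count its files from code,
# instead of browsing thousands of files in the web UI.
import zipfile
from pathlib import Path

def extract_and_count(zip_path, dest):
    """Extract one archive into dest and return how many .jpg files it holds."""
    dest = Path(dest)
    dest.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(dest)
    # rglob walks subfolders too, in case the zip has nested directories
    return sum(1 for _ in dest.rglob("*.jpg"))

# Tiny throwaway archive so the sketch runs anywhere; with the real data
# you would pass the path to one of the HAM10000 zip files instead.
demo = Path("demo_images")
demo.mkdir(exist_ok=True)
for i in range(3):
    (demo / f"img_{i}.jpg").write_bytes(b"fake image bytes")
with zipfile.ZipFile("demo.zip", "w") as zf:
    for p in demo.iterdir():
        zf.write(p, p.name)

print(extract_and_count("demo.zip", "extracted"))  # prints 3
```

Counting or globbing files this way is fast even for 10k images, since it never renders anything in the browser.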

Thank you manju-dev. So my take-away from your reply is that, in general, learning to navigate and manage the data directly from the CLI is the better option and avoids the lagging issues.