Lesson 1@Floydhub - dogscats data read only?

Hi everybody!

I have been trying to do the first lesson of Part 1 on FloydHub but ran into a problem when accessing the dogscats dataset.

I unzipped the dogscats dataset as described in https://github.com/YuelongGuo/floydhub.fast.ai

Then I started my Floyd instance with the following command:
floyd run --gpu+ --env pytorch-0.3 --data festmeter/projects/dogscatsunzip/2/output --mode jupyter

The Jupyter notebook for Lesson 1 works fine until the cell with the actual classification, where I get an error that the dataset is read-only!

I couldn't find any information on how to fix this; any help is highly appreciated.


As I couldn't find a solution for this very annoying problem, I switched to Paperspace, which works perfectly for Part 1. Life is so much easier when you have access to a console :wink:

I also ran into this problem. Can't seem to fix it. :sob::sob:

@sai Can you help? We keep running into this problem on FloydHub. It's Part 1 version 2 of the course.
It would be helpful.

What are you trying to download the data to, i.e. what is the location? You cannot download data into a mounted directory - is that the case here?

@sai floyd run --gpu+ --env pytorch --data fastai/datasets/cats-vs-dogs/2:/cats-vs-dogs --mode jupyter

This is the floyd command I used to mount the dataset.

Hi @Rinzin and @hkristen, here's the problem: FloydHub's data partitions are mounted with read-only permission, and the Learner class (declared in learner.py) needs write permission for its task. I have two possible workarounds:

  1. Replace line 22 of learner.py, self.tmp_path = os.path.join(self.data.path, tmp_name), with something like self.tmp_path = '/tmp', or self.tmp_path = '/output' if you want these files to appear in the Output view.

  2. Copy the dataset folder to a folder with full permissions (e.g. with the dataset mounted via --data fastai/datasets/cats-vs-dogs/2:/cats-vs-dogs): ! cp -R /cats-vs-dogs /cats-vs-dogs2. Then use PATH = "/cats-vs-dogs2/"
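If you'd rather do the copy from Python than from a ! shell cell, the second workaround can be sketched roughly like this. The /cats-vs-dogs paths come from the mount command earlier in the thread, and make_writable_copy is just an illustrative helper, not part of fastai:

```python
import os
import shutil

def make_writable_copy(src, dst):
    """Copy a dataset tree from a read-only mount to a writable location.

    Skips the copy if dst already exists, so the notebook cell can be
    re-run without raising an error.
    """
    if not os.path.exists(dst):
        shutil.copytree(src, dst)
    return dst

# e.g. in the Lesson 1 notebook:
# PATH = make_writable_copy("/cats-vs-dogs", "/cats-vs-dogs2") + "/"
```

The existence check makes the cell idempotent, which matters in notebooks where cells often get re-executed.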

Hope it helps.


Wow, thanks @redeipirati man, the second solution worked like a charm.


@redeipirati Thank you for your help… Very grateful…:grinning::grinning: