Dog Breed Challenge Precompute Error

In trying to recreate Jeremy’s work, I keep getting a FileNotFoundError when running “learn = ConvLearner.pretrained(arch, data, precompute=True)”. Specifically:

FileNotFoundError: [Errno 2] No such file or directory: ‘/home/paperspace/fastai/courses/dl1/fastai/weights/resnext_101_64x4d.pth’

Any idea what I missed?

Thanks!


@BOSTROVSKY, have you checked that you have resnext_101_64x4d.pth in the folder listed above: ‘/home/paperspace/fastai/courses/dl1/fastai/weights’?
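A quick way to check from a notebook cell (a minimal sketch, using the path from the error message above):

from pathlib import Path
weights = Path('/home/paperspace/fastai/courses/dl1/fastai/weights/resnext_101_64x4d.pth')
print(weights.exists())  # True means the file is in place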

Hmmm…I don’t even have a directory called weights. Where did I go wrong? Please limit the answer to fastai-related failures. The other list is way too long. :slight_smile:

Hi,

That issue was discussed here:

You can download the weights directory by running the following command on Paperspace:

wget http://files.fast.ai/models/weights.tgz


Thank you!

Hi,

I tried all the things mentioned before but I am still running into some issues.

The initial error that I got was:
FileNotFoundError: [Errno 2] No such file or directory: ‘/home/paperspace/fastai/courses/dl1/fastai/weights/resnext_101_64x4d.pth’

I did not have a folder called weights.
I did have a folder called models with a file called resnext_101_64x4d.py (not a .pth file).
I went into the fastai folder and ran the command "wget http://files.fast.ai/models/weights.tgz".
It did put the “weights.tgz” file into the fastai folder.

I also tried creating my own weights folder and copying all the files from models into it. No luck.

I am quite new to programming, so it might be that I have messed something up with all the bash commands and moving stuff around.
I have tried deleting the fastai folder, getting it back and repeating the whole process. No luck …

Somehow I feel like I made this way more complicated than it is …
Any ideas?


Obviously … I am very new to this. I have to unpack the weights.tgz file. :slight_smile:
A lot of things to learn.

To make the steps clearer:

  1. Go to the folder /home/paperspace/fastai/courses/dl1/fastai/

cd /home/paperspace/fastai/courses/dl1/fastai/

  2. Download the weights file

wget http://files.fast.ai/models/weights.tgz

  3. Extract the archive

tar -xvzf weights.tgz

Now the step below will work fine :slightly_smiling_face:

learn = ConvLearner.pretrained(arch, data, precompute=True)
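If you want to confirm the extraction landed where fastai looks before training, a small sketch (assuming the default Paperspace paths used in this thread):

import os
weights_dir = '/home/paperspace/fastai/courses/dl1/fastai/weights'
print(os.listdir(weights_dir))  # should list resnext_101_64x4d.pth among the weight files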


I had the same error as others using resnext_101, described here. I assume the weights for resnext_101 are not included in the fastai GitHub repository because of their size. Wouldn’t it make sense to have calls to resnext_101_64x4d automatically trigger a download of the file? Are there any IP issues, or is that just something on the to-do list? If helpful, I would be happy to implement this and add it to the fastai code base.
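Something along these lines could do it (an illustrative sketch only, not the actual fastai code; the URL and directory layout are taken from this thread):

import os, tarfile, urllib.request

WEIGHTS_URL = 'http://files.fast.ai/models/weights.tgz'

def ensure_weights(fastai_dir):
    # Download and extract the pretrained weights if the folder is missing.
    weights_dir = os.path.join(fastai_dir, 'weights')
    if os.path.isdir(weights_dir):
        return  # weights already in place, nothing to do
    archive = os.path.join(fastai_dir, 'weights.tgz')
    urllib.request.urlretrieve(WEIGHTS_URL, archive)
    with tarfile.open(archive, 'r:gz') as tar:
        tar.extractall(path=fastai_dir)  # creates the weights/ folder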

That would be helpful (to me).

For some reason wget is not working/installed on my Paperspace Gradient instance, so I’m going to download the whole 1 GB file and upload the weights I need…

It turned out not to be a good idea to download the weights and then upload them (even the individual weight files are a bit big).

But I did manage to install wget on Paperspace Gradient:

apt-get update
apt-get install wget

I’m surprised it wasn’t installed by default.
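If installing wget isn’t an option either, Python’s standard library can fetch the archive instead (a sketch using the URL from earlier in the thread):

import urllib.request
urllib.request.urlretrieve('http://files.fast.ai/models/weights.tgz', 'weights.tgz')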

Thank you!! I too had the same issue and now it’s resolved.
I’m using Clouderizer for this course, and it would be great if downloading and extracting the weight files could be bootstrapped like downloading the competition data.

See @jinilcs’s comment above for the actual commands.

These commands can actually be found in the lesson1-breeds notebook, right under the cells that set the path variable and hyperparameters like batch size, etc.

I ran into this error because I downloaded the weights file BUT didn’t extract it :slight_smile: