Loading pre-trained weights from a local file rather than from a URL

You still need to pass an architecture, which is a callable returning your model; then you can pass pretrained_fnames as a named argument. The full signature is in the docs.

Thanks for your reply @sgugger. Adding AWD_LSTM triggers the download from amazonaws, which throws an exception in my case because of firewall issues. So it seems language_model_learner never gets to run the if pretrained_fnames is not None statement.

That’s because you have to say pretrained=False if you don’t want to trigger the download.

That did the job, thank you @sgugger.

I’m running into exactly the same issue. Just to clarify: if I have downloaded the lstm_wt103.pth and itos_wt103.pkl files to .fastai/models, is the following line of code correct, or do I need to point to the location of these pretrained models using pretrained_fnames?

learn = language_model_learner(data_lm, AWD_LSTM, pretrained=False, drop_mult=0.3)

Thanks!

You’ll also have to add the location using pretrained_fnames.
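
Something like the following sketch (fastai v1; hedged, as the file names are taken from the posts above): pretrained_fnames takes the two file names without their extensions, and if I remember correctly they are resolved relative to the learner's models directory (data_lm.path/'models' by default), so the files may need to live there rather than in ~/.fastai/models:

learn = language_model_learner(
    data_lm, AWD_LSTM,
    pretrained=False,                                # skip the AWS download
    pretrained_fnames=['lstm_wt103', 'itos_wt103'],  # .pth and .pkl, extensions omitted
    drop_mult=0.3,
)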

Thanks, I’ve grabbed the pretrained model and dictionary from here: http://files.fast.ai/models/wt103_v1/ (no problems with my company firewall), but still haven’t had any joy when using pretrained_fnames= and pointing to the appropriate files (a size mismatch between model and dictionary, apparently).

What did you find worked for you in the end?

Hi, I don’t have access to my files at the moment. Tomorrow morning I’ll have a look and let you know. I remember finding two versions of those and only one worked.

Thanks, much appreciated!

Ok so, the lstm_wt103.pth and itos_wt103.pkl files I had been using have sizes of 177,091,123 bytes and 1,027,823 bytes respectively. If I’m not mistaken I got them from https://www.kaggle.com/mnpinto/fastai-wt103-1 (at least the sizes of the files at this link coincide with what I have on disk). I hope this helps.

Thanks! Those are the same as I had, and it turns out the issue was with the breaking changes mentioned here: Major new changes and features

Fixed by:
config = awd_lstm_lm_config.copy()  # start from the default AWD_LSTM language model config
config['n_hid'] = 1150              # hidden size the wt103_v1 weights were trained with

and then passing config=config into the language_model_learner parameters.
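
Putting it together, the full call looks something like this sketch (the pretrained_fnames values assume the file names from the download discussed above):

learn = language_model_learner(
    data_lm, AWD_LSTM,
    config=config,                                   # the copy with n_hid=1150
    pretrained=False,
    pretrained_fnames=['lstm_wt103', 'itos_wt103'],
    drop_mult=0.3,
)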

Ok cool, thanks to you then - I didn’t know about that change. I’ll have to fix my own code when I resume working on it.

No problem. Don’t suppose you remember how you got around the firewall issue with text_classifier_learner (rather than language_model_learner)? It has a pretrained= parameter but no pretrained_fnames= parameter…

If you leave it with the defaults, it tries to download the model from AWS.

I’ve looked into my code, and it ran smoothly back then without specifying any pretrained_fnames parameter. If I’m not mistaken, at that stage you don’t need the files anymore, because you should be taking advantage of the model you’ve just fine-tuned (starting from the pretrained one that was obtained via the files).

My code looks like this:

learn = text_classifier_learner(data_clas, AWD_LSTM, pretrained=False, drop_mult=0.5)
learn.load_encoder('ft_enc')

OK, cool. I was just a bit concerned as I had to specify pretrained=False to get around the AWS download but then wasn’t pointing to anything else.

This is what I’ve got (which does run):

learn = text_classifier_learner(data_clas, AWD_LSTM, drop_mult=0.5, pretrained=False, config=config_clas)

learn.load_encoder('fine_tuned_enc')

Yeah, I believe data_clas contains the current fine-tuned model.
Actually, forget that: the fine-tuned model gets loaded into learn by the .load_encoder method.

Anyway, yours is a legitimate doubt, since it’s not clear what would need to be retrieved from AWS at this stage. It’s been a while since I played with this and I honestly don’t know. Maybe other, more experienced users will answer the question…

Thanks, good to know I’m not crazy for questioning this 🙂

Hello, do we have a solution for this problem?

In fastai v2, I am trying to use text_classifier_learner with the AWD_LSTM architecture and pretrained weights. However, due to a company firewall issue, fastai is not able to download the weights from the S3 bucket.

Is there a way that I can download the pretrained weight files manually and point text_classifier_learner to use them?

I appreciate your answers. I also found a solution to my problem, thanks! 🤝

The following worked for me (a short code sketch follows the list).

  1. Manually download the weights file (e.g. wt103-fwd.tgz) from the S3 location.
  2. Place it in the models directory under the fastai home directory (~/.fastai/models).
  3. Untar the file (tar -xvzf) in the same location.
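
For steps 2 and 3, here is a minimal Python sketch (assuming the archive has already been saved as ~/.fastai/models/wt103-fwd.tgz, per the example above); once the folder is extracted there, fastai should find the weights locally and skip the S3 download:

# Extract a manually downloaded wt103-fwd.tgz into ~/.fastai/models
# so fastai finds the weights locally instead of downloading from S3.
import tarfile
from pathlib import Path

models_dir = Path.home() / '.fastai' / 'models'
archive = models_dir / 'wt103-fwd.tgz'   # assumed download location

with tarfile.open(archive, 'r:gz') as tar:
    tar.extractall(path=models_dir)      # creates models_dir / 'wt103-fwd'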