Using custom pretrained ulmfit with fastai library

#1

Hi,

I am getting a little desperate. I pretrained and fine-tuned a ULMFiT model with the nb_12 code, and now I am trying to use the weights to create a classifier with fastai's (version 1.0.58) text_classifier_learner. I am doing this because I need both classification and regression, and I thought this would be easiest with the library. With the following code I get the error below:

config = awd_lstm_lm_config.copy()
config['n_hid'] = 1150
learn = language_model_learner(data_lm, AWD_LSTM, config=config, pretrained=False, drop_mult=0.3)
learn.load_encoder('finetuned_enc')

RuntimeError: Error(s) in loading state_dict for AWD_LSTM:
	Missing key(s) in state_dict: "encoder.weight", "encoder_dp.emb.weight". 
	Unexpected key(s) in state_dict: "emb.weight", "emb_dp.emb.weight". 

Calling load_pretrained instead gives me:

learn.load_pretrained('finetuned_ulmfit/finetuned.pth', 'finetuned_ulmfit/vocab_lm.pkl') 

KeyError: '0.encoder.weight'

Clearly, this is a mismatch between the saved and expected weight names, but I can't figure out why I am getting it.


#2

Sorry, I wasn't thinking hard enough. I solved it by renaming the state_dict keys to match the names the fastai model expects.
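For anyone hitting the same error: a minimal sketch of the renaming, assuming the only mismatched keys are the two named in the error message above ("emb.weight" → "encoder.weight" and "emb_dp.emb.weight" → "encoder_dp.emb.weight"). In practice the dict would come from torch.load on the saved encoder; dummy string values are used here so the example is self-contained.

```python
# Map the key names from the saved checkpoint to the names
# fastai 1.0.x's AWD_LSTM expects (taken from the error message).
rename_map = {
    "emb.weight": "encoder.weight",
    "emb_dp.emb.weight": "encoder_dp.emb.weight",
}

def rename_keys(state_dict, rename_map):
    """Return a copy of state_dict with keys renamed per rename_map;
    keys not in the map are kept unchanged."""
    return {rename_map.get(k, k): v for k, v in state_dict.items()}

# Dummy state_dict standing in for the real checkpoint tensors:
sd = {"emb.weight": "w1", "emb_dp.emb.weight": "w2", "rnn.weight_hh": "w3"}
fixed = rename_keys(sd, rename_map)
print(sorted(fixed))  # ['encoder.weight', 'encoder_dp.emb.weight', 'rnn.weight_hh']
```

After renaming, save the dict back out (e.g. with torch.save) and load_encoder should find the keys it is looking for.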
