Loading saved language model

Is there any way to load a pretrained language model, similar to ConvLearner.pretrained? Whenever I restart training, md = LanguageModelData(PATH, TEXT, **FILES, bs=bs, bptt=bptt, min_freq=10) always takes a while. I’ve tried dumping the entire object with pickle, but that doesn’t seem to work.


Have you tried save() and load()?

I’ve been able to do save_encoder() and load_encoder() as in the lesson4 imdb notebook, although I haven’t been successful saving and reloading the whole model via save() and load(). This is the error I get:

While copying the parameter named 0.encoder.weight, whose dimensions in the model are torch.Size([49346, 200]) and whose dimensions in the checkpoint are torch.Size([49173, 200]), ...

I guess the first dimension is the vocabulary size, and maybe if I used the same training/validation split each time it would stay the same, but I haven’t tried that yet.
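If that’s the cause, pickling the field right after building md should freeze the vocab across sessions. A minimal sketch, assuming fastai 0.7 with torchtext, where TEXT is the field passed to LanguageModelData and the path follows the lesson4 notebook:

import pickle

# The first dimension of 0.encoder.weight is the vocab size; it changes
# whenever the vocab gets rebuilt from a different split.
print(len(TEXT.vocab))  # should match the checkpoint's embedding rows
pickle.dump(TEXT, open(f'{PATH}models/TEXT.pkl', 'wb'))  # freeze the vocab for later sessions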

So I’ve been able to load it successfully, but the issue is that I’m lazy and don’t like waiting that minute or two for md = LanguageModelData(PATH, TEXT, **FILES, bs=bs, bptt=bptt, min_freq=10) to run.

Afterwards I do learner.load('em_size_500_bs_32_cycle_5_11_25') and everything works. I was just trying to see if there was a faster way to load the language model data.
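For reference, the pair I use looks like this — a sketch assuming learner came from md.get_model(...) as in the lesson4 notebook (fastai 0.7 saves under {PATH}models/):

learner.save('em_size_500_bs_32_cycle_5_11_25')  # after training
# ...in a new session: rebuild md and learner the same way, then:
learner.load('em_size_500_bs_32_cycle_5_11_25')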


@cheeseblubber,

Have you found a way to save the data model?

I am running the notebook on a local machine and it takes more than 30 minutes to build the data model md.

If you read Rob’s comment above, it tells you how to save and load the trained models. I was referring to loading the model data into memory, which takes ~1–2 min. What you want is to call the save function on the model.

@cheeseblubber,
Maybe I did not articulate it well. It’s a two-part question: can I save md to the hard drive, and can I read it back into memory?

Imagine I started running the notebook cell by cell and reached the cell where the LanguageModelData is built.

md = LanguageModelData.from_text_files(PATH, TEXT, **FILES, bs=bs, bptt=bptt, min_freq=10)

This process takes 30-40 minutes on my local machine. Imagine I want to stop the kernel now. Can I save the object md to the hard drive?

Later, I may start the notebook again and run all the necessary imports. Can I then load md back into memory with something like

md = load_from_pickle(blah blah)??


Did anyone find a solution to this? I’m struggling.

I’m having exactly the same problem. I’m building a language model on my local machine for some data I gathered, and it took more than 5 hours!

I see that the TEXT variable is pickled:

pickle.dump(TEXT, open(f'{PATH}models/TEXT.pkl','wb'))

but it is useless without the complete model. Tips for speeding up the process would also be nice. I’ve already reduced the parameters to:

bs=32; bptt=40

It is still taking so much time. Maybe it is swapping into virtual memory.
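For what it’s worth, the counterpart to that dump would be something like the following sketch, again following the lesson4 notebook’s conventions. If I read nlp.py right, a field that already has a vocab is not rebuilt, so saved weights line up, but the text files still get read and tokenized when md is rebuilt, which is the slow part:

import pickle

# Restore the field (and its vocab) saved earlier:
TEXT = pickle.load(open(f'{PATH}models/TEXT.pkl', 'rb'))
# Rebuilding still tokenizes the files, so this doesn't avoid the wait:
md = LanguageModelData.from_text_files(PATH, TEXT, **FILES, bs=bs, bptt=bptt, min_freq=10)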

Which language model data class are you guys using? The one in nlp.py is the old version and does a bunch of computation during initialization, whereas the one in text does not. Try using the new LanguageModelData in fastai.text and not in fastai.nlp. Since it just loads pre-processed ids into memory, it should be much faster, as it doesn’t have to do the processing at initialization.
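A rough sketch of that flow, based on the dl2 imdb notebook — file names and the tmp folder are just that notebook’s conventions; trn_lm/val_lm are concatenated arrays of token ids and itos is the id-to-token list from your own tokenization step:

import numpy as np
import pickle
from fastai.text import LanguageModelData

# One-time cost, after tokenizing/numericalizing your corpus:
np.save(f'{PATH}tmp/trn_ids.npy', trn_lm)
np.save(f'{PATH}tmp/val_ids.npy', val_lm)
pickle.dump(itos, open(f'{PATH}tmp/itos.pkl', 'wb'))

# Any later session: reload and rebuild md in seconds.
trn_lm = np.load(f'{PATH}tmp/trn_ids.npy')
val_lm = np.load(f'{PATH}tmp/val_ids.npy')
itos = pickle.load(open(f'{PATH}tmp/itos.pkl', 'rb'))
md = LanguageModelData(PATH, 1, len(itos), trn_lm, val_lm, bs=bs, bptt=bptt)  # 1 = pad token index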


My LanguageModelLoader.__module__ really is 'fastai.nlp'.

So after the main import commands, I did:

from fastai.text import LanguageModelLoader

I’m running it now. I’ll report when it finishes.

BTW, I’m using my own data. The dataset is 4.5 times the size of the imdb dataset, and the language is Portuguese. I don’t know the impact of this.

Now I also reduced bptt to 20.

I did a quick search on the repo for from fastai.text import and found this nugget in dl2/imdb.ipynb:

"At Fast.ai we have introduced a new module called fastai.text which replaces the torchtext library that was used in our 2018 dl1 course. The fastai.text module also supersedes the fastai.nlp library but retains many of the key functions."

Have you found a way so far?

Hey, has anyone found a way to do that?

There’s a lesson in part 2 on how to use fastai.text. See the lecture notes, for example: https://medium.com/@hiromi_suenaga/deep-learning-2-part-2-lesson-10-422d87c3340c