Fastai v2 - load ULMFiT encoder only (not touching learner)

Hi everyone,

We have trained a good language model using the AWD_LSTM architecture, and then saved it with learner.save().

Now we would like to build our own classifier. We can get the AWD_LSTM class from fastai.text.models, but then we would need to separately store at least vocab_sz, emb_sz, n_hid, and n_layers - its required constructor arguments.
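
For context, fastai keeps the default ULMFiT hyperparameters in awd_lstm_lm_config, so in practice only the vocab size has to be carried over by hand. A minimal sketch of rebuilding the encoder that way (assuming vocab is the vocabulary we saved alongside the model):

```python
from fastai.text.models import AWD_LSTM, awd_lstm_lm_config

# awd_lstm_lm_config also carries decoder-side settings (output_p,
# tie_weights, out_bias), so keep only what AWD_LSTM's constructor accepts.
enc_keys = ('emb_sz', 'n_hid', 'n_layers', 'pad_token',
            'hidden_p', 'input_p', 'embed_p', 'weight_p', 'bidir')
config = {k: v for k, v in awd_lstm_lm_config.items() if k in enc_keys}

encoder = AWD_LSTM(vocab_sz=len(vocab), **config)  # `vocab`: our saved vocabulary (assumption)
```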

We thought about creating a new learner, loading the saved artifact, and then extracting the encoder from there, but this seems like overkill.
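
The most direct alternative we can see is reading the checkpoint with plain torch. A rough sketch, assuming learner.save() was called with the default with_opt=True (so the .pth file holds a dict under a 'model' key) and that the encoder is module 0 of the language model; the path is a placeholder:

```python
import torch

state = torch.load('models/lm.pth', map_location='cpu')  # placeholder path
model_state = state['model']  # learner.save() stores {'model': ..., 'opt': ...}

# In the SequentialRNN language model the encoder is module 0,
# so its weights are exactly the keys prefixed with '0.'.
enc_state = {k[len('0.'):]: v for k, v in model_state.items()
             if k.startswith('0.')}

encoder.load_state_dict(enc_state)  # `encoder` from the sketch above
```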

Is there a cleaner way to load just the encoder from the file that learner.save() produced?

You might want to use Learner.save_encoder and Learner.load_encoder.
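
For example, a minimal sketch of that workflow (dls_lm and dls_clas stand in for your own DataLoaders, and 'ft_enc' is just a file name):

```python
from fastai.text.all import *

# Language-model side: save_encoder stores only model[0], the AWD_LSTM
# encoder, so none of the constructor arguments need to be kept around.
learn_lm = language_model_learner(dls_lm, AWD_LSTM)
# ... fine-tune the language model here ...
learn_lm.save_encoder('ft_enc')

# Classifier side: build the learner as usual, then load the fine-tuned
# encoder. The classifier's vocab must match the language model's
# (e.g. pass text_vocab=dls_lm.vocab when building dls_clas).
learn_clas = text_classifier_learner(dls_clas, AWD_LSTM)
learn_clas.load_encoder('ft_enc')
```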