Hi Aaron
It will be similar (though not identical) to the steps below.
Regards Conwyn
Fine-tune the Wikitext-pretrained language model on IMDB
On Google Colab the working directory (pwd) is /content.
Save the model without its head (the encoder):
learnlm.save_encoder('finetunedF')
#Note it saves it in path/models where path is /root/.fastai/data/imdb
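For context, here is a minimal sketch of how dls_lm and learnlm could have been created earlier (the standard fastai IMDB language-model fine-tune; the epoch counts and learning rates are illustrative, not my exact values):

from fastai.text.all import *

# Fine-tune the Wikitext-pretrained AWD_LSTM as a language model on IMDB
path = untar_data(URLs.IMDB)
dls_lm = TextDataLoaders.from_folder(path, is_lm=True, valid_pct=0.1)
learnlm = language_model_learner(dls_lm, AWD_LSTM, metrics=[accuracy, Perplexity()]).to_fp16()
learnlm.fit_one_cycle(1, 2e-2)   # train the new head first
learnlm.unfreeze()
learnlm.fit_one_cycle(10, 2e-3)  # then fine-tune the whole model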
#Copy it for safety to my Google Drive
!cp /root/.fastai/data/imdb/models/finetunedF.pth /content/gdrive/MyDrive
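The cp above assumes Google Drive is already mounted at /content/gdrive; if it is not, mount it first:

from google.colab import drive
drive.mount('/content/gdrive')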
#Now pickle the DataLoaders (they carry the vocab)
import pickle
pickle.dump(dls_lm, open("savelm.p", "wb"))
#And copy for safety
!cp /content/savelm.p /content/gdrive/MyDrive
If you are on a different or new machine (for example a fresh Colab session):
import pickle
Copy the headless model from above into a models directory (load_encoder ignores the original save path and looks in models under the learner's current path):
!mkdir /content/models
!cp /content/gdrive/MyDrive/finetunedF.pth /content/models
#Copy the DataLoaders with the vocab
!cp /content/gdrive/MyDrive/savelm.p /content
dls_lm = pickle.load(open("/content/savelm.p", "rb"))
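A quick sanity check (illustrative) that the vocab came back intact:

print(len(dls_lm.vocab))  # vocab size; should match the original session
print(dls_lm.vocab[:5])   # fastai special tokens such as xxunk, xxpad appear first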
…
Now prepare your actual text, but point it at the vocab from the original DataLoaders loaded above.
…
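For illustration only, dfr is assumed to be a pandas DataFrame shaped like this (the column names match the call below; the rows are invented):

import pandas as pd

# Hypothetical data: 'Review' holds the text, 'Latency' holds ;-delimited multi-labels
dfr = pd.DataFrame({
    'Review':  ['response was slow under heavy load', 'fast and stable throughout'],
    'Latency': ['slow;load', 'fast'],
})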
dlsr = TextDataLoaders.from_df(df=dfr, text_vocab=dls_lm.vocab, text_col='Review', label_col='Latency', label_delim=';', y_block=MultiCategoryBlock, valid_pct=0.2)  # valid_pct gives the random 80/20 split
learnr = text_classifier_learner(dlsr, AWD_LSTM, drop_mult=0.5, n_out=len(dlsr.vocab[1]), metrics=[]).to_fp16()  # dlsr.vocab[1] is the label vocab
Now use the imported headless model:
learnr.load_encoder('finetunedF') #Described in the Course Book
learnr.fit_one_cycle(1,2e-2)
learnr.freeze_to(-2)  # gradual unfreezing with discriminative learning rates, per the Course Book
learnr.fit_one_cycle(1, slice(1e-2/(2.6**4), 1e-2))
learnr.freeze_to(-3)
learnr.fit_one_cycle(1, slice(5e-3/(2.6**4), 5e-3))
learnr.unfreeze()
learnr.fit_one_cycle(2,slice(1e-3/(2.6**4),1e-3))
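Once trained, the classifier can be used and saved; a minimal sketch (the sample review and export filename are invented):

# Predict the ;-delimited labels for a new review
learnr.predict('The system responded quickly even under peak load')
# Export the full pipeline (weights + vocab) for later inference
learnr.export('/content/gdrive/MyDrive/classifier.pkl')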