Chapter 10 NLP too long to train

Hi folks!

In Chapter 10, ‘NLP’, I ran into an issue: the model takes too long to train, about 42 minutes per epoch.
Here is the code from the book:

get_imdb = partial(get_text_files, folders=['train', 'test', 'unsup'])

dls_lm = DataBlock(
    blocks=TextBlock.from_folder(path, is_lm=True),
    get_items=get_imdb, splitter=RandomSplitter(0.1)
).dataloaders(path, path=path, bs=128, seq_len=80)

learn = language_model_learner(
    dls_lm, AWD_LSTM, drop_mult=0.3, 
    metrics=[accuracy, Perplexity()]).to_fp16()

learn.fit_one_cycle(1, 2e-2)

Later in the chapter we fine-tune the model for 10 epochs after unfreezing it, which I cannot afford:

learn.fit_one_cycle(10, 2e-3)


How do I make it train in less time? I figured I could limit the number of entries in dls_lm. That would hurt the accuracy of the model, but I'm OK with that. I just cannot figure out how to do it.
Or maybe there is another way to reduce the training time?


No real ways to make it go faster, I’m afraid. LSTM, RNN, and NLP model architectures in general take longer (and more GPU memory) to train.

Not sure what you are trying to do here, but if you are not bothered by the accuracy and just want to go through the steps, you could train for fewer epochs. For example, after unfreezing, do learn.fit_one_cycle(1, 2e-3) instead of 10 epochs. The point of that section is mainly to show you which training metrics and changes you should watch out for, so you could also just look at the output in the provided notebooks if you cannot actually run all the training steps on your machine.

Good luck.



Thanks for the answer!
That’s what I eventually did :slight_smile:

But the question still bothers me: how do I limit the number of entries in the dls? Is there an elegant way to do it?
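For reference, the closest I got is wrapping the item getter in plain Python, since `get_items` just returns a list-like of file paths (as `get_text_files` does) before the `DataBlock` ever sees it. This is only a sketch; the fastai names in the comment are from my first post:

```python
import random

def limit_items(get_items, n, seed=42):
    """Wrap a DataBlock item getter so it returns at most n
    randomly chosen items (deterministic for a fixed seed)."""
    def limited(source):
        items = list(get_items(source))
        random.Random(seed).shuffle(items)  # reproducible subsample
        return items[:n]
    return limited

# With the DataBlock from my first post it would look like:
# dls_lm = DataBlock(
#     blocks=TextBlock.from_folder(path, is_lm=True),
#     get_items=limit_items(get_imdb, n=5000),
#     splitter=RandomSplitter(0.1)
# ).dataloaders(path, path=path, bs=128, seq_len=80)
```

Is that roughly the idea, or is there a built-in way I'm missing?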

Hi Yorick, hope you are having a wonderful day!

You could use callbacks to save the model after each epoch to a drive, such as Google Drive on Google Colab, and reload the saved weights later.

If you use Google Colab and have code that moves the mouse to stop the session from timing out, you can even do this while you sleep.
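The resume-from-checkpoint idea above can be sketched in plain, framework-agnostic Python (fastai ships a `SaveModelCallback` for this, check the fastai docs for its exact arguments; the helper and names below are my own, not from the book):

```python
import pathlib
import pickle

def train_with_checkpoints(train_one_epoch, state, n_epochs, ckpt_dir):
    """Run n_epochs, pickling the training state after each epoch,
    and resume from the latest checkpoint if one already exists."""
    ckpt_dir = pathlib.Path(ckpt_dir)
    ckpt_dir.mkdir(parents=True, exist_ok=True)
    # Find epochs that were already completed and saved.
    done = sorted(int(p.stem.split('_')[1]) for p in ckpt_dir.glob('epoch_*.pkl'))
    start = done[-1] + 1 if done else 0
    if done:
        state = pickle.loads((ckpt_dir / f'epoch_{done[-1]}.pkl').read_bytes())
    for epoch in range(start, n_epochs):
        state = train_one_epoch(state, epoch)
        (ckpt_dir / f'epoch_{epoch}.pkl').write_bytes(pickle.dumps(state))
    return state
```

On Colab, pointing `ckpt_dir` at a mounted Google Drive folder means a session timeout only costs you the epoch that was in flight.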

Cheers mrfabulous1 :smiley: :smiley: