I’m a little stumped here. I was working through the Chapter 10 notebook, and when fine-tuning the first epoch of the language model, training would always hang at some point.
Calling `.summary(path)` on the DataBlock (before creating the DataLoaders and trying to train) revealed an error about failing to create a batch because the tensors were differently sized. But shouldn’t the pipeline be adding padding to account for that?
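For anyone following along, here is a minimal stdlib-only sketch of what I *expected* the collation step to do: shorter token-id sequences in a batch get padded to the longest one so they can be stacked into a single tensor. (This is just an illustration, not fastai’s actual implementation; the pad token id `1` and the `pad_first` behaviour are assumptions.)

```python
# Sketch of pad-collation (plain Python, no fastai):
# sequences of different lengths can't be stacked into one batch tensor,
# so shorter ones are padded out to the longest sequence in the batch.

PAD_ID = 1  # assumption: stand-in for fastai's xxpad token id

def pad_batch(seqs, pad_id=PAD_ID, pad_first=True):
    """Pad every sequence to the length of the longest one in the batch."""
    max_len = max(len(s) for s in seqs)
    out = []
    for s in seqs:
        pad = [pad_id] * (max_len - len(s))
        out.append(pad + s if pad_first else s + pad)
    return out

batch = [[5, 8, 2], [7, 3], [9]]
print(pad_batch(batch))
# → [[5, 8, 2], [1, 7, 3], [1, 1, 9]] — every row now has length 3
```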
Bizarrely, I had no issues training on a Paperspace Free-GPU instance with an older version of Fastai (2.1.10), only on my local machine (Ubuntu 20.04 on WSL2 + RTX 3090, Fastai 2.3.0)… despite `DataBlock.summary()` revealing the same error on both!
If I create the TextBlocks with `is_lm=False`, the padding is added correctly, though I haven’t yet figured out how to properly create the dependent variable and train that way.
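My current understanding of why `is_lm=True` shouldn’t even need padding: for language modeling, the dependent variable is just the input shifted one token to the left, and the concatenated corpus is cut into equal-length chunks. A rough stdlib-only sketch of that idea (the function name and chunking details are my own, not fastai’s internals):

```python
# Sketch of how a language-model dependent variable can be built:
# the target is the input sequence shifted one position, so the model
# predicts the next token everywhere. Equal-length chunks mean no padding.

def lm_xy_pairs(token_ids, seq_len):
    """Cut a token stream into fixed-length (input, target) pairs."""
    pairs = []
    # each pair needs seq_len + 1 tokens, since the target is shifted by one
    for i in range(0, len(token_ids) - seq_len, seq_len):
        x = token_ids[i : i + seq_len]
        y = token_ids[i + 1 : i + seq_len + 1]
        pairs.append((x, y))
    return pairs

stream = list(range(10))  # stand-in for a tokenized corpus
print(lm_xy_pairs(stream, 3))
# → [([0, 1, 2], [1, 2, 3]), ([3, 4, 5], [4, 5, 6]), ([6, 7, 8], [7, 8, 9])]
```

So if the batching really does use fixed-length chunks, the differently-sized-tensor error with `is_lm=True` seems like it shouldn’t happen at all, which is what confuses me.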