Train fastai models on multiple TPU cores on Kaggle

Hi all,
Following up on another post about training fastai models on multiple TPU cores on Colab, I’ve made a sample Kaggle notebook that trains fastai models across multiple TPU cores.
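
For anyone curious about what’s happening under the hood, multi-core TPU training with pytorch xla boils down to spawning one process per TPU core with `xmp.spawn`. The sketch below is a generic illustration of that pattern, not the notebook’s actual code; the fastai wiring inside `_mp_fn` is elided:

```python
# Generic pytorch xla multi-core pattern -- a sketch, not the notebook's exact code.
import torch_xla.core.xla_model as xm
import torch_xla.distributed.xla_multiprocessing as xmp

def _mp_fn(rank):
    # Each spawned process is pinned to one TPU core and sees it as its device.
    device = xm.xla_device()
    # Build the fastai DataLoaders and Learner on `device` here, then call fit.
    # (Elided: the notebook takes care of this wiring for you.)
    xm.rendezvous('fit_done')  # wait for all cores before exiting

# nprocs=8 uses all eight cores of a Kaggle TPU v3-8;
# start_method='fork' is the usual choice inside notebook environments.
xmp.spawn(_mp_fn, args=(), nprocs=8, start_method='fork')
```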

I’ve also provided datasets containing the required packages so you can train without an internet connection (a requirement for some competitions).
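
If you haven’t used offline package datasets before: the usual trick is to point pip at the attached dataset with `--no-index` so it never touches the network. The dataset path below is a hypothetical placeholder; use the paths of the datasets actually attached to the notebook:

```python
# Install fastai from an attached Kaggle dataset, with no internet access.
# /kaggle/input/fastai-offline-packages is a hypothetical placeholder path --
# substitute the package datasets actually attached to the notebook.
!pip install --no-index --find-links=/kaggle/input/fastai-offline-packages fastai
```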

Hope you find it useful for your Kaggle competitions and explorations.

PS. Thanks @ilovescience for providing the pets dataset on Kaggle that I used in the notebook (and for the helpful pytorch xla Kaggle tutorial notebooks :+1:)


Thanks for sharing this very useful example.

While fitting, I noticed that the TPU indicator always shows idle, as if training were running on the CPU even though TPU is selected as the accelerator. Is that expected?

I don’t know either. I do know it uses the TPU, since I noticed the TPU usage briefly rise above zero at times. That may indicate a bottleneck somewhere that prevents higher TPU utilization. @ilovescience has a lot more experience than me running pytorch xla on Kaggle and might have better insight into this.
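
One way to check whether the TPU is actually doing work, and where the time goes, is pytorch xla’s built-in metrics report. A minimal check you can run after (or between) fits:

```python
# Print pytorch xla's counters and timers to see where time is spent.
# Large CompileTime or TransferToServerTime entries suggest recompilation or
# host-to-TPU data-transfer bottlenecks rather than actual TPU compute.
import torch_xla.debug.metrics as met
print(met.metrics_report())
```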