spaCy is by far the biggest library dependency in fastai, at around 1 GB. For comparison, torch is about 250 MB.
It seems that we use it basically for training. Is it possible to somehow prevent loading it when we only want/need to predict?
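One way this could work is a lazy-import helper: the heavy module is only imported the first time training code actually asks for it, so an inference-only code path never pays the cost. This is just a sketch of the general pattern (the `lazy_import` helper is hypothetical, not part of fastai):

```python
import importlib

def lazy_import(name, _cache={}):
    """Import a module on first use and cache it.

    Hypothetical helper: a heavy dependency (e.g. spacy) is only
    loaded when a training code path actually calls this, so
    predict-only deployments never trigger the import.
    """
    if name not in _cache:
        _cache[name] = importlib.import_module(name)
    return _cache[name]

# Training code would call lazy_import("spacy") where it needs the
# tokenizer; inference code simply never calls it.
```

fastai would still need its internal imports restructured this way for the trick to help, which is really what the question above is asking.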
In our study group we wanted to deploy our language model on AWS Lambda, but there is a limit on deployment package size, so we had to drop fastai and use torch directly.
copied from: https://forums.fast.ai/t/lesson-4-advanced-discussion/30319/19?u=fredguth