@muellerzr, I have been playing with the fastai library over the last few months. One thing I have realized is that fastai's design does not make it easy to export things back to plain PyTorch. So if I use fastai to train a model, it seems I need to stick with fastai for inference as well, which is not always convenient.
One clear example is the `Learner` class. It is a useful class for training, but in fastai it is also used for prediction. What if I want to send my trained model to a friend who knows PyTorch but not fastai? That seems quite hard to do at the moment.
Besides, all the preprocessing transformations on the input data are stored in the datasets/dataloaders, not in the model itself. So if I save the model with `torch.save(learn.model, 'model.pth')`, I still cannot use it on its own, because part of the preprocessing lives in the dataset instances.
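To make the problem concrete, here is a minimal sketch in plain PyTorch. The model weights round-trip through `torch.save`/`torch.load` just fine, but the normalization has to be reimplemented by hand on the receiving side (the stats below are a hypothetical example, since nothing in the saved file records them):

```python
import tempfile, os
import torch
import torch.nn as nn

# Hypothetical stand-in for a model trained with fastai.
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 8 * 8, 2))
path = os.path.join(tempfile.mkdtemp(), 'model.pth')
torch.save(model.state_dict(), path)

# PyTorch-only side: rebuild the architecture and reload the weights...
model2 = nn.Sequential(nn.Flatten(), nn.Linear(3 * 8 * 8, 2))
model2.load_state_dict(torch.load(path))
model2.eval()

# ...but the preprocessing is NOT in the file; these normalization stats
# lived in the fastai dataloaders and must be copied over manually.
mean = torch.tensor([0.485, 0.456, 0.406]).view(3, 1, 1)
std = torch.tensor([0.229, 0.224, 0.225]).view(3, 1, 1)

x = torch.rand(1, 3, 8, 8)          # raw input in [0, 1]
with torch.no_grad():
    out = model2((x - mean) / std)  # preprocessing applied by hand
print(tuple(out.shape))
```

If the receiver forgets (or never learns) the normalization stats, the model silently produces wrong predictions, which is exactly the portability problem I mean.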
In my opinion, `Learner` should be used only for training, with the model saved separately from the `Learner` afterwards. That way I could send my model to people who use PyTorch but not fastai. Besides, the `Learner` carries several objects that are useless at inference time:
- optimizer,
- dataloaders,
- etc.
We do not need all this for inference.
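Here is a rough sketch of the separation I am suggesting (not an existing fastai API): bundle only the model and its preprocessing into a single `nn.Module`, with no optimizer or dataloaders, and export it with TorchScript so a PyTorch-only user can load it without fastai or even this wrapper class:

```python
import tempfile, os
import torch
import torch.nn as nn

class InferenceModel(nn.Module):
    """Model plus preprocessing, nothing else. Stats here are hypothetical."""
    def __init__(self, model, mean, std):
        super().__init__()
        self.model = model
        # Buffers travel with the module, so the stats are saved alongside
        # the weights instead of being stranded in the dataloaders.
        self.register_buffer('mean', torch.tensor(mean).view(-1, 1, 1))
        self.register_buffer('std', torch.tensor(std).view(-1, 1, 1))

    def forward(self, x):
        return self.model((x - self.mean) / self.std)

body = nn.Sequential(nn.Flatten(), nn.Linear(3 * 8 * 8, 2))  # stand-in model
infer = InferenceModel(body, [0.485, 0.456, 0.406], [0.229, 0.224, 0.225])
infer.eval()

# TorchScript serializes code + weights + buffers in one file.
path = os.path.join(tempfile.mkdtemp(), 'inference.pt')
torch.jit.script(infer).save(path)

loaded = torch.jit.load(path)  # no fastai needed here
with torch.no_grad():
    out = loaded(torch.rand(1, 3, 8, 8))
print(tuple(out.shape))
```

The point is that everything needed at inference time rides along in one artifact, while the optimizer, dataloaders, and the rest of the `Learner` stay on the training side.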
Do you plan on changing this? Has anyone else run into the same issues?