Exporting the item transforms that are used during training/prediction

Hello, I have a question about exporting a given data-transformation pipeline from a fastai learner. My idea is: if I export learner.model as a plain torch model, I would like it to wrap all the preprocessing as well.

So instead of putting only the normalization into the nn.Sequential as the first layer, I would like to put learn.dls.valid.item_tfms (or something similar) in as the first layer. That would ensure that every Transform I have, like Resize or RandomResizedCrop, is used in the same way and automatically applied in the first layer.

Does anyone have an idea whether something like that would be possible?

You can put the transformations in the nn.Sequential; I think it would make no difference, since nn.Sequential just passes the data sequentially through whatever is inside it, if I am not wrong.

Thanks for the answer @Rohan_Lalwani. Yes, they can be put inside an nn.Sequential; the question is whether I can export the item transforms of the learner in a standard way, to wrap them into the torch model as well.

So my idea is that standard transforms like center cropping should be part of the resulting model, so that learner.predict(x) and (after exporting the model) model(x) yield exactly the same result, because they apply the exact same transforms to the prediction data.

If you are looking to apply the valid transforms to the data at inference time without using fastai, you could first build an nn.Sequential that contains all the transforms, including normalization, and then make predictions with the model itself kept outside that nn.Sequential:

import torch
from torch import nn
from torchvision.transforms import ToTensor

valid_tfms = nn.Sequential(*all_transforms)  # all_transforms: a list of nn.Module transforms
model = learn.model
tfm_x = valid_tfms(ToTensor()(data))  # ToTensor only if the data isn't already a tensor
with torch.no_grad():
    preds = model(tfm_x.cuda())

And then you could apply softmax to the predictions. Also, take care of the shape of the data.
The code might not be exactly the same for you, but this is how I would implement it.
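For completeness, a minimal sketch of that softmax step on some made-up logits (the values here are purely illustrative):

```python
import torch

# hypothetical raw outputs (logits) from model(tfm_x) for a 3-class problem
preds = torch.tensor([[2.0, 0.5, -1.0]])
probs = torch.softmax(preds, dim=1)  # logits -> probabilities summing to 1
pred_class = probs.argmax(dim=1)     # index of the most likely class
print(pred_class.item())  # 0
```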

Thanks again @Rohan_Lalwani for the answer!
My question is: how can I get the transforms out of learner.valid_dl.tfms into *all_transforms so that I can put them into the nn.Sequential?

The problem is that if I try to wrap the fastcore.transform.Pipeline into an nn.Sequential, it just says it's not a subclass (of nn.Module), so is there some way to export it or convert it?
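One possible workaround (a sketch, not a fastai API): since nn.Sequential only accepts nn.Module instances, you can wrap any callable, including a fastcore Pipeline, in a tiny adapter module. Here a plain resize function stands in for the Pipeline, and a dummy head stands in for learn.model:

```python
import torch
from torch import nn

class CallableModule(nn.Module):
    """Adapter: wraps an arbitrary callable so nn.Sequential will accept it."""
    def __init__(self, fn):
        super().__init__()
        self.fn = fn

    def forward(self, x):
        return self.fn(x)

# stand-in for a transform pipeline: resize every input to 8x8 via interpolation
pipeline = lambda x: nn.functional.interpolate(x, size=(8, 8))

model = nn.Sequential(
    CallableModule(pipeline),      # preprocessing as the first "layer"
    nn.Flatten(),
    nn.Linear(3 * 8 * 8, 10),      # dummy stand-in for learn.model
)
out = model(torch.randn(2, 3, 32, 32))
print(out.shape)  # torch.Size([2, 10])
```

Note that this only makes the Pipeline callable from inside the model; it does not make it traceable or scriptable, so exporting to TorchScript/ONNX may still fail on fastai transforms.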

I didn’t know that only nn.Module subclasses can be put into an nn.Sequential; what I meant previously was manually putting all the transforms into an nn.Sequential, but that won’t work. Here are two ways I know of to export a fastai learner and apply transforms to data at inference time:

  1. When you export the learner, the data transforms get saved with it, so you could use the following code:
from fastai.learner import load_learner

learn = load_learner("model.pkl", cpu=True)  # set cpu=False for GPU inference
test_dl = learn.dls.test_dl(test_files)  # applies the stored valid transforms
preds, _, decoded = learn.get_preds(dl=test_dl, with_decoded=True)

This is a great article about it: Inference With fastai - Model Saving, Loading, and Prediction | Just Stir It Some More

  2. You could use torchvision transforms wrapped in an nn.Sequential instead of fastai vision transforms. Something like this:
import torch
from torch import nn
from torchvision.transforms import CenterCrop, Normalize, ToTensor  # ToTensor is applied outside the nn.Sequential

valid_tfms = nn.Sequential(
    CenterCrop((224, 224)),
    Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),  # assuming ImageNet stats
)

And then you can use the code I shared in my previous reply.

There could be more ways that I might not be aware of but the first one mainly is how inference is done with fastai. I might be able to help in a better way if you can share your code.

Hope this helps :slight_smile:


As you learned last week, there is a third option, since the above still doesn’t recreate exactly what fastai does :wink:

@buxdehude for a blind answer of what that takes, see this notebook: https://github.com/muellerzr/Walk-with-fastai-revisited/blob/main/05a_deployment_no_fastai.ipynb

And if you want to learn more as to why consider getting my course :slight_smile: https://store.walkwithfastai.com