Hello, I have a question about exporting the data-transformation pipeline from a fastai learner. My idea is: if I export learner.model as a plain torch model, I'd like to wrap all the preprocessing with it.
So instead of putting the normalization into the nn.Sequential as the first layer, I would like to put learn.dls.valid.item_tfms (or something similar) in as the first layer. That would ensure that every Transform I have, like Resize or RandomResizedCrop, is applied in exactly the same way, automatically, as the first layer.
Does anyone have an idea whether something like that would be possible?
Thanks for the answer @Rohan_Lalwani. Yes, they can be put inside an nn.Sequential; the question is whether I can export the item transforms of the learner in a standard way so they can be wrapped into the torch model as well.
My idea is that standard transforms like center cropping should be part of the resulting model, so that learner.predict(x) and the exported model(x) yield exactly the same result, because they apply the exact same transforms to the prediction data.
If you are looking to apply the valid transforms to your data at inference time without using fastai, you could first build an nn.Sequential consisting of all the transforms (including normalization), and then feed the transformed data to the model itself (the model doesn't need to be inside that nn.Sequential):
valid_tfms = nn.Sequential(*all_transforms)  # all_transforms: list of tensor-compatible transform modules
model = learn.model
tfm_x = valid_tfms(ToTensor()(data))  # convert first if the data isn't already a tensor
preds = model(tfm_x.cuda())
Then you can apply softmax to preds. Also, take care of the shape of the data (e.g. a single image needs a batch dimension).
The code might not be exactly the same for you, but this is how I would implement it.
I didn't know that only nn.Module subclasses can be put in nn.Sequential; what I meant previously was manually putting all the transforms into an nn.Sequential, but that won't work for arbitrary fastai transforms. Here are two ways I know of to export a fastai learner and apply the transforms to data at inference:
When you export the learner, the data transforms get saved with it, so you could use the following code:
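A minimal sketch of that pattern, assuming the standard fastai API (learn.export() writes 'export.pkl' by default, and load_learner restores the DataLoaders transforms; the filenames here are placeholders):

```python
from fastai.vision.all import load_learner

# The trained learner was saved earlier with learn.export();
# the exported pickle bundles the model AND the dls transforms.
learn = load_learner('export.pkl')

# predict() runs the saved item/batch transforms before the forward
# pass, so inference preprocessing matches validation exactly.
pred_class, pred_idx, probs = learn.predict('some_image.jpg')
```

This is the simplest route when fastai is available at inference time, since no transforms have to be re-created by hand.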