Importing fastai model into PyTorch format for Lambda

I have developed a tabular fastai deep learning model. Since AWS Lambda does not support fastai, I need to load the model in plain PyTorch and run the prediction there. I can load the saved state with:
state = torch.load('models/mymodel.pth', map_location=torch.device('cpu'))

But I don't think this is the full model. How can I load the model and make predictions with it? I suspect I need to load it differently, or perhaps redefine the architecture in PyTorch.
I have gone through the forum, but the answers on similar topics are not clear enough.

@matt.mcclean

Why do you believe it's not the full model? All the layers point back to PyTorch, so it should be usable. Also, the model (in case you haven't looked) accepts the categorical and continuous variables as two separate tensors for its input. How are you preparing your prediction tensor?
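To make the two-tensor input concrete, here is a minimal stand-in for what a fastai tabular model looks like in plain PyTorch: embedding lookups for the categorical columns, concatenated with the continuous columns, then linear layers. All the sizes and names below are made up for illustration; your real architecture has to match what was trained.

```python
import torch
import torch.nn as nn

class TabularNet(nn.Module):
    """Toy tabular net: embeddings + continuous features -> linear head."""
    def __init__(self, emb_sizes, n_cont, n_out):
        super().__init__()
        # one Embedding per categorical column: (cardinality, emb dim)
        self.embeds = nn.ModuleList(nn.Embedding(c, s) for c, s in emb_sizes)
        n_emb = sum(s for _, s in emb_sizes)
        self.layers = nn.Sequential(
            nn.Linear(n_emb + n_cont, 50), nn.ReLU(), nn.Linear(50, n_out)
        )

    def forward(self, x_cat, x_cont):
        # embed each categorical column, then concat with the continuous ones
        x = torch.cat([e(x_cat[:, i]) for i, e in enumerate(self.embeds)], dim=1)
        return self.layers(torch.cat([x, x_cont], dim=1))

model = TabularNet(emb_sizes=[(10, 5), (7, 4)], n_cont=3, n_out=2)
model.eval()
x_cat = torch.tensor([[1, 2]])             # batch of 1, two category codes
x_cont = torch.tensor([[0.5, -1.0, 2.0]])  # batch of 1, three continuous values
with torch.no_grad():
    out = model(x_cat, x_cont)
print(out.shape)  # torch.Size([1, 2])
```

Note the forward pass takes `x_cat` and `x_cont` separately, which is why a single flat input tensor won't work.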


Thanks Zachary for your reply.
I am not fully familiar with PyTorch (unlike fastai). I loaded the model, but what I get is a dict containing all the model's layers and parameters (is this accurate? [Solved] Using a fastai-trained model with plain Pytorch).
How can I build the model from this dictionary, and how can I make predictions with it?
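For reference, inspecting a toy model suggests the dict I'm seeing is a state_dict: a mapping from parameter names to weight tensors, not the model itself (the toy model below is just for inspection, not my real one):

```python
import torch.nn as nn

# A throwaway model, just to see what a state_dict looks like
net = nn.Sequential(nn.Linear(3, 4), nn.Linear(4, 2))
print(list(net.state_dict().keys()))
# ['0.weight', '0.bias', '1.weight', '1.bias']
```

So the architecture itself is not stored in the file, only the weights keyed by layer name.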
In fastai everything was simple:

fastai_prediction = learn_nn.predict(data.iloc[0])

where learn_nn is the NN learner. What is the equivalent process in PyTorch?
I think what you refer to as the prediction tensor is the equivalent of data.iloc[0] in this example, but I am stuck one step before that.
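From what I've pieced together, I'd need something like the round trip below: rebuild the same architecture by hand, then load the saved weights into it with load_state_dict. I'm using a toy model here since I don't yet know how to redefine the fastai one; also, depending on how fastai saved the file, the weights may be nested under a 'model' key rather than at the top level.

```python
import torch
import torch.nn as nn

# "Training" side: save only the weights (state_dict), as fastai does
net = nn.Sequential(nn.Linear(3, 4), nn.ReLU(), nn.Linear(4, 2))
torch.save(net.state_dict(), 'toy.pth')

# "Lambda" side: rebuild the same structure, then load the weights into it
net2 = nn.Sequential(nn.Linear(3, 4), nn.ReLU(), nn.Linear(4, 2))
state = torch.load('toy.pth', map_location='cpu')
net2.load_state_dict(state)
net2.eval()  # switch off dropout/batchnorm training behaviour

x = torch.tensor([[0.1, 0.2, 0.3]])
with torch.no_grad():
    print(torch.equal(net(x), net2(x)))  # True: same weights, same output
```

Is this roughly the right approach, with the remaining gap being how to turn data.iloc[0] into the cat/cont tensors the model expects?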