Attempting to export a text model to ONNX

I have trained a text model on the toxic comments dataset according to this example:
Censoring toxic comments using fastai v2 with a multi-label text classifier | by Vinayak Nayak | Towards Data Science

I can easily load the model and run inferences against it:

inf = load_learner("/dbfs/mnt/levi/toxic_comment_clr.pk1")
inf.predict("I hate all those who don't use fastai")

Which yields the following output:

((#1) ['toxic'],
 tensor([False, False, False, False, False, False,  True]),
 tensor([0.0527, 0.2497, 0.0990, 0.0154, 0.2093, 0.0269, 0.7496]))

However, I can’t make heads or tails of how to export it to ONNX format, and the documentation isn’t very clear.

I suspect this has something to do with how I’m specifying the second parameter, which is the input shape of my model, but I’m not certain.

What is the proper way to specify an input shape for ONNX when you’re just inputting a string? What am I not understanding here?

hello?

Have you tried with fastinference? (Or looking at the ONNX code for fastinference)

Although I’m not 100% sure that the AWD-LSTM is ONNX compatible. I believe it is TorchScript compatible, though.

Yes, it is likely the second parameter. I suspect you need torch.randn(1, 1), which is a batch size of 1 and 1 token. But I would need to see the shape of the input from your dataloader.
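One caveat on `torch.randn` here: it produces floats, and the embedding layer at the front of a text model expects integer token ids, so tracing would fail on a float dummy. A sketch with a standalone embedding (vocab size and embedding width are made-up numbers):

```python
import torch

vocab_sz, batch_sz, seq_len = 100, 1, 1  # hypothetical sizes

dummy_float = torch.randn(batch_sz, seq_len)                  # dtype: float32
dummy_ids = torch.randint(0, vocab_sz, (batch_sz, seq_len))   # dtype: int64

emb = torch.nn.Embedding(vocab_sz, 8)
out = emb(dummy_ids)   # works: shape (1, 1, 8)
# emb(dummy_float)     # would raise: embedding indices must be integers
```

So for a text model the dummy input would be `torch.randint(...)` with the right `(batch, seq_len)` shape rather than `torch.randn(...)`.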

E.g. when I run:
x,y = data.one_batch()
print(x.shape)
torch.Size([60, 3, 299, 299])

And then I use:
dummy_input = torch.randn(1, 3, 299, 299)
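The recipe above — take the shape from `one_batch()`, keep the per-sample dimensions, and swap the batch size for 1 — carries over to text, with one twist: the dummy must be integer-typed. A sketch using plain tensors in place of the dataloader (the text shapes are hypothetical):

```python
import torch

# Image case from the post: one_batch() showed torch.Size([60, 3, 299, 299]),
# so the dummy keeps the per-sample shape and uses a batch of 1.
x_img = torch.empty(60, 3, 299, 299)
dummy_img = torch.randn(1, *x_img.shape[1:])  # (1, 3, 299, 299)

# Text analogue: one_batch() for a text dataloader returns token ids
# shaped (batch, seq_len), so the dummy is an integer tensor.
x_txt = torch.randint(0, 100, (64, 72))                  # pretend batch
dummy_txt = torch.randint(0, 100, (1, x_txt.shape[1]))   # (1, 72)
```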