A bit of an update. Looking at the source code of the Fastai functions, I worked out the shape of the sample input and managed to move a bit further along.

First I tried:

```
example = ((torch.randint(1, 10, (5,)), torch.tensor([0])),)
```

This allowed the tracer to get into the forward function. (The reason I had to wrap everything in an outer one-element tuple is that the tracer unpacks the example tuple into positional arguments before passing them to forward, and forward itself was expecting a tuple.)
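The unpacking behaviour can be seen with a toy module (a hypothetical stand-in, not the fastai model): `torch.jit.trace` treats its example inputs as a tuple of positional arguments, so a forward that wants a single tuple needs one extra level of nesting.

```python
import torch

class TupleForward(torch.nn.Module):
    # hypothetical stand-in whose forward expects a single tuple,
    # like the layer the tracer was complaining about
    def forward(self, pair):
        x, y = pair
        return x + y

m = TupleForward()
# trace unpacks the outer tuple into positional arguments, so the
# tuple we actually want forward to receive is nested one level deeper
example = ((torch.ones(3), torch.ones(3)),)
traced = torch.jit.trace(m, example)
out = traced((torch.ones(3), torch.ones(3)))
```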

However, this left me with the following error:

```
/opt/conda/envs/fastai/lib/python3.7/site-packages/fastai/text/learner.py in forward(self, input)
    259
    260     def forward(self, input:LongTensor)->Tuple[List[Tensor],List[Tensor],Tensor]:
--> 261         bs,sl = input.size()
    262         self.reset()
    263         raw_outputs,outputs,masks = [],[],[]

AttributeError: 'tuple' object has no attribute 'size'
```

My assumption was that it wanted the input to actually be a tensor, so that it could call input.size().
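For illustration (nothing fastai-specific here), `size()` on a 2-D tensor of token ids unpacks straight into batch size and sequence length, which is exactly what line 261 does, and exactly what a tuple can't provide:

```python
import torch

# a (batch, seq_len) tensor of token ids; tuples have no .size(),
# hence the AttributeError above
inp = torch.tensor([[1, 2, 3, 4, 5], [1, 2, 3, 4, 5]])
bs, sl = inp.size()
print(bs, sl)  # 2 5
```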

I then tried a few different options and finally got the following to be accepted by both the PyTorch tracer and the forward function:

```
example = (torch.tensor([[1,2,3,4,5], [1,2,3,4,5]],device='cuda'),)
```
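To sanity-check the shape handling end to end, here is a toy language-model stand-in (hypothetical, on CPU rather than CUDA) that accepts the same (batch, seq_len) example and traces cleanly, because unlike the fastai forward it returns a single tensor:

```python
import torch

class ToyLM(torch.nn.Module):
    # hypothetical stand-in for the fastai model: embeds token ids and
    # returns one tensor, so the tracer has nothing to object to
    def __init__(self):
        super().__init__()
        self.emb = torch.nn.Embedding(10, 4)

    def forward(self, input):
        bs, sl = input.size()  # works: input is a 2-D LongTensor
        return self.emb(input).sum(dim=-1)

example = (torch.tensor([[1, 2, 3, 4, 5], [1, 2, 3, 4, 5]]),)
traced = torch.jit.trace(ToyLM(), example)
out = traced(example[0])
print(out.shape)  # torch.Size([2, 5])
```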

However, it spit out the following error:

```
RuntimeError: Only tensors or tuples of tensors can be output from traced functions (getOutput at /opt/conda/conda-bld/pytorch_1579022060824/work/torch/csrc/jit/tracer.cpp:212)
```

Which I take to mean that PyTorch can't properly handle the output of this function: the forward's signature (visible in the traceback above) returns Tuple[List[Tensor],List[Tensor],Tensor], and lists of tensors are not among the outputs the tracer accepts. I don't really know if I can fix that, so I am going to try a different model altogether and assume the fastai NLP model can't be traced at the moment.
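The restriction can be reproduced with a toy module (again a hypothetical stand-in, not the fastai model): the same values that trace fine as a tuple are what trip the tracer when returned as a Python list. Whether the fastai forward could be wrapped to do that conversion is exactly what I'm unsure about.

```python
import torch

class ListOut(torch.nn.Module):
    # mimics the fastai forward's list-of-tensors outputs
    def forward(self, x):
        return [x, x * 2]

class TupleOut(torch.nn.Module):
    # same values, but in a tuple, which the tracer accepts
    def forward(self, x):
        return (x, x * 2)

x = torch.ones(2)
try:
    # the PyTorch 1.4 tracer rejected this outright; newer versions
    # may only allow it via strict=False
    torch.jit.trace(ListOut(), (x,))
    print("list output traced")
except RuntimeError as e:
    print("list output rejected:", type(e).__name__)

traced = torch.jit.trace(TupleOut(), (x,))
a, b = traced(x)
```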

Unless anyone has any other ideas?