Converting UNET to Torchscript format for AWS Lambda

According to the AWS Lambda deployment instructions on the fastai site, the SAM application expects a PyTorch model in TorchScript format to be saved to S3, along with a text file listing the output class names.
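For reference, a minimal sketch of that export step, using a tiny stand-in module rather than a real UNET (the file names and class list here are placeholders, not taken from the tutorial):

```python
import torch
import torch.nn as nn

# Stand-in for the trained model; in practice this would be learn.model
# from a fastai Learner.
model = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU())
model.eval()

# Trace with a dummy input of the expected shape, then save the
# TorchScript artifact that gets uploaded to S3.
example = torch.rand(1, 3, 64, 64)
traced = torch.jit.trace(model, example)
traced.save("model.pt")

# The SAM app also expects a plain text file with the class names.
with open("classes.txt", "w") as f:
    f.write("\n".join(["background", "foreground"]))
```

Tracing works fine for a plain module like this; the UNET problem discussed below is that fastai's `DynamicUnet` registers hooks, which `torch.jit.trace` rejects.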

Previously, I tried to use TorchScript with UNET and got errors because of its hooks. Has anyone tried this deployment with UNET, or even just the UNET-to-TorchScript conversion?

Have you solved it, @shbkan? I am getting the same error!

I was able to trace my model using PyTorch 1.4, but calling a prediction on it causes the Lambda instance to time out.
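A quick local check can at least separate slow inference from Lambda cold-start issues — a sketch with a dummy traced module standing in for the UNET (in the actual Lambda this would be `torch.jit.load` on the artifact pulled from S3):

```python
import time
import torch
import torch.nn as nn

# Dummy traced model standing in for the traced UNET.
model = torch.jit.trace(nn.Conv2d(3, 8, 3, padding=1),
                        torch.rand(1, 3, 64, 64))

# Time a single prediction; if this exceeds the Lambda function's
# configured timeout, the invocation is killed.
x = torch.rand(1, 3, 64, 64)
start = time.time()
with torch.no_grad():
    out = model(x)
elapsed = time.time() - start
print(f"prediction took {elapsed:.3f}s, output shape {tuple(out.shape)}")
```

If the model predicts quickly on a laptop but times out on Lambda, the usual suspects are the input image size and the memory allocated to the function (Lambda CPU scales with memory).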

Tried dropping my PyTorch version down to 1.1 to match the AWS Lambda layer supplied in the fastai tutorial deployment, and now I get the same error you're seeing (can't trace a model with hooks)…