Hi Jeremycochoy
I followed your approach to compile a UNet and got an error. Here is my code:
learn = unet_learner(data, models.resnet34, metrics=dice, wd=wd).to_fp16()
# after training
learn.to_fp32()  # put back to full floating point
trace_input = torch.ones(1, 3, 599, 599).cuda()
jit_model = torch.jit.trace(learn.model, trace_input)
model_file = 'unit_jit.pth'
torch.jit.save(jit_model, f'models/{model_file}')
I got the following warning:
/home/ubuntu/anaconda3/envs/pytorch_p36/lib/python3.6/site-packages/fastai/vision/models/unet.py:32: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can’t record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
if ssh != up_out.shape[-2:]:
and this error:
~/anaconda3/envs/pytorch_p36/lib/python3.6/site-packages/fastai/layers.py in forward(self, x)
170 self.dense=dense
171
--> 172 def forward(self, x): return torch.cat([x,x.orig], dim=1) if self.dense else (x+x.orig)
173
174 def res_block(nf, dense:bool=False, norm_type:Optional[NormType]=NormType.Batch, bottle:bool=False, **conv_kwargs):
RuntimeError: invalid argument 0: Sizes of tensors must match except in dimension 1. Got 599 and 600 in dimension 2 at /opt/conda/conda-bld/pytorch_1556653099582/work/aten/src/THC/generic/THCTensorMath.cu:71
The two tensors being concatenated don't match: one is 600 instead of 599 in dimension 2 (599x600 vs. 599x599).
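If I understand it right, this is because 599 is odd: a stride-2 downsample rounds it up to 300, so upsampling by 2 gives back 600, which no longer matches the 599-wide skip connection. Here is a small sketch (plain PyTorch, not the fastai model itself) that reproduces the rounding:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# A stride-2 conv (kernel 3, padding 1) maps 599 -> floor((599 + 2 - 3)/2) + 1 = 300.
down = nn.Conv2d(3, 8, kernel_size=3, stride=2, padding=1)

x = torch.ones(1, 3, 599, 599)
y = down(x)
print(tuple(y.shape[-2:]))   # (300, 300)

# Upsampling by 2 then gives 600, not the original 599.
up = F.interpolate(y, scale_factor=2)
print(tuple(up.shape[-2:]))  # (600, 600) -- mismatch with the 599x599 skip tensor
```

The fastai UNet normally handles this with the `if ssh != up_out.shape[-2:]` check the TracerWarning points at, but `torch.jit.trace` bakes that Python boolean in as a constant, so the resize branch may not fire for the traced graph.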
Did you run into any issues when you compiled your model?
Thanks
Dong