Exporting prediction to image fastai v2

I’m performing semantic segmentation with fastai v2. With v1, the export of the predicted mask is straightforward:

preds = learn.predict(example_image)

preds[0].save('output_path')

With v2, I’m trying to export the decoded mask, but I suspect I’m making an obvious error:

labelprobs, _, pixelargmax = learn.get_preds(dl=[test_dl.one_batch()], with_input=False, with_decoded=True)

to_image(pixelargmax[0])

Yields an error:

---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-309-b4aba87d3ee8> in <module>
----> 1 to_image(pixelargmax[0])

/opt/conda/lib/python3.7/site-packages/fastai/vision/core.py in to_image(x)
     76     "Convert a tensor or array to a PIL int8 Image"
     77     if isinstance(x,Image.Image): return x
---> 78     if isinstance(x,Tensor): x = to_np(x.permute((1,2,0)))
     79     if x.dtype==np.float32: x = (x*255).astype(np.uint8)
     80     return Image.fromarray(x, mode=['RGB','CMYK'][x.shape[0]==4])

/opt/conda/lib/python3.7/site-packages/fastai/torch_core.py in __torch_function__(self, func, types, args, kwargs)
    327         convert=False
    328         if _torch_handled(args, self._opt, func): convert,types = type(self),(torch.Tensor,)
--> 329         res = super().__torch_function__(func, types, args=args, kwargs=kwargs)
    330         if convert: res = convert(res)
    331         if isinstance(res, TensorBase): res.set_meta(self, as_copy=True)

~/.local/lib/python3.7/site-packages/torch/tensor.py in __torch_function__(cls, func, types, args, kwargs)
    993 
    994         with _C.DisableTorchFunction():
--> 995             ret = func(*args, **kwargs)
    996             return _convert(ret, cls)
    997 

RuntimeError: number of dims don't match in permute

Any pointers to a more appropriate way of exporting a predicted mask?

You can still use learn.predict(), but otherwise I walk through doing it explicitly in one of my lessons:


Ahh, thanks! Basically, I am interpreting these steps as, “it’s not part of the fastai API anymore; encode it as you please.” Thanks for the nice examples!
