Inference: segmentation mask does not overlap correctly with the image

Hi there!
When I use my trained model for inference, more precisely to create an overlay of an image and its predicted segmentation mask, the two don't seem to fit: the mask does not properly line up with the image it was predicted from. However, when I look at the output of learner.show_results(), everything seems to be correct…

This is how I train my network:

from fastai.vision.all import *

dblock = DataBlock(blocks=(ImageBlock, MaskBlock(codes)),
                   get_y=create_mask,  # create_mask is a custom function
                   splitter=RandomSplitter(),
                   item_tfms=RandomResizedCrop(224, min_scale=0.4),
                   batch_tfms=aug_transforms(mult=4, max_warp=0.1, pad_mode='zeros'))
dloaders = dblock.dataloaders(files, bs=16)
learn = unet_learner(dloaders, resnet34)
learn.fine_tune(20)
learn.export('learn20.pkl')  # exported with the .pkl extension so the filename matches load_learner below
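The check mentioned above that does look correct is just the built-in call on the trained learner, run before exporting (the arguments here are only illustrative):

learn.show_results(max_n=4, figsize=(12, 10))  # image / target / prediction panels look fine here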

and this is how I import the learner for inference:

from contextlib import contextmanager
import pathlib

from fastai.vision.all import *

@contextmanager
def set_posix_windows():
    # temporarily map PosixPath to WindowsPath so a model exported on Linux can be unpickled on Windows
    posix_backup = pathlib.PosixPath
    try:
        pathlib.PosixPath = pathlib.WindowsPath
        yield
    finally:
        pathlib.PosixPath = posix_backup

with set_posix_windows():
    create_mask = lambda x: x  # dummy stub; it must use the same name as the original get_y function so unpickling can resolve it
    learn_inf = load_learner('learn20.pkl')
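Loading itself seems to work; a quick sanity check along these lines (the image path is just a placeholder) runs without complaints:

test_img = PILImage.create('some_test_image.png')  # placeholder path
pred = learn_inf.predict(test_img)                 # 3-tuple; element [1] is the mask tensor I use below
print(type(pred[1]), pred[1].shape)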

and this is how I overlay the images:

import matplotlib.pyplot as plt

this_img = PILImage.create(img)        # img is the image (or path) to predict on
mask = learn_inf.predict(this_img)[1]  # predicted mask (second element of the predict output)
this_mask = TensorMask(mask)
fig, axs = plt.subplots(2, 2, figsize=(12, 10))

# plot overlay: draw the image and the mask on the same axes
this_img.show(ctx=axs[0, 1], title='prediction')
this_mask.show(ctx=axs[0, 1], cmap=custom_cmap, alpha=.55)  # custom_cmap is defined elsewhere
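Not sure whether it is related, but printing the sizes of the two things being overlaid is probably the easiest way to show what I mean by "they don't fit":

# sizes of what is being overlaid
print(this_img.size)    # original image size, PIL-style (width, height)
print(this_mask.shape)  # predicted mask size as a tensor (height, width)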