How to add titles in show_batch for SegmentationDataLoaders

Hi everybody!
I’m new to fastai and I’m trying to experiment with semantic segmentation. I managed to set up the data loader correctly using the DataBlock API with something like:

from fastai.vision.all import *
path = DATA_PATH_r / 'v1.0/crops_512'
def label_func(p):
    return Path(str(p).replace('images', 'masks'))

codes = np.loadtxt('codes.txt', dtype=str)

tfms = [
    IntToFloatTensor(div_mask=255),  # need masks in [0, 1] format
    *aug_transforms(
        max_lighting=0.2, p_lighting=0.5,
        min_zoom=0.9, max_zoom=1.1,
        max_warp=0, max_rotate=15.0)
]
set_seed(32,True)
db = DataBlock(blocks=(ImageBlock(), MaskBlock(codes)),
               batch_tfms=tfms,
               get_items=get_image_files, get_y=label_func)
dls = db.dataloaders(source=path/'images', bs=2)
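(A quick aside on the `div_mask=255` bit, in case it's useful to others: the masks are saved as 0/255 images, so the division turns them into 0/1 labels. A toy illustration of roughly what it does:)

```python
import numpy as np

# toy 0/255 binary mask, as stored on disk
mask = np.array([[0, 255], [255, 0]], dtype=np.uint8)

# roughly what IntToFloatTensor(div_mask=255) does to the mask
labels = (mask / 255).astype(np.int64)  # 0/1 class labels
```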

What I want to do now is play a bit with the augmentations and see the effects of the various transformations available. However, that is not straightforward using show_batch() alone, and I would need some more customization:

  • turn off the mask superposition and transparency (i.e. show the augmented images without the overlay)
  • show filenames as titles, so that I can compare with the originals (I figured out that filenames can be accessed from dls.train_ds.items, but it’s not clear to me how to pass multiple filenames as the titles of different subplots)
  • possibly save the subplots (so that I can share augmentation results with domain experts for offline evaluation)

Do you have any suggestions on how to achieve that and/or how to customize show_batch in general?

Thanks in advance

Hi @lclissa,

The code examples in the data augmentation section of the fastai docs could be a good starting point for you.

You could create your own batch with _batch_ex(bs) and then apply your transforms as in the examples there. You can then build your own plots from the batch you’ve created.
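For example, here is a rough, untested sketch of the "build your own plots" part, with random arrays standing in for a decoded batch and made-up filenames:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend, so it also runs on a server
import matplotlib.pyplot as plt
import numpy as np

# stand-ins for a decoded batch and for dls.train_ds.items
imgs = [np.random.rand(64, 64, 3) for _ in range(4)]
fnames = [f"crop_{i:03d}.png" for i in range(4)]

fig, axs = plt.subplots(2, 2, figsize=(6, 6))
for ax, img, fname in zip(axs.flat, imgs, fnames):
    ax.imshow(img)                   # plain image, no mask overlay
    ax.set_title(fname, fontsize=8)  # filename as subplot title
    ax.axis("off")
fig.savefig("augmented_batch.png")   # save for offline review
```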

If that is not sufficient, I guess you’ll have to create your own Callback. A good starting point would be the Siamese Model in the tutorials.

Cheers

Hi @JackByte, thanks a lot for sharing this, and thanks to @sgugger for the great Siamese tutorial! :smiley: :smiley:

I took some time to experiment with it, and I think I’m close but not quite there.
I managed to customize the mask superposition, and I found a quick way to save the result of show_batch, as described in this post.
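(For anyone finding this later: the saving trick boils down to grabbing the current figure right after plotting; below, plt.plot stands in for dls.show_batch():)

```python
import matplotlib
matplotlib.use("Agg")  # no display needed
import matplotlib.pyplot as plt

plt.plot([0, 1])  # stand-in for dls.show_batch(), which draws on the current figure
plt.gcf().savefig("show_batch.png")  # grab and save that figure
plt.close("all")
```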

However, I’m still struggling to add the filename as a title. What I’m trying is to create a custom type TitledImage (TensorImage + string) and leverage the DataBlock API for the dataloader, taking care to dispatch show_batch appropriately for the new custom type.

Unfortunately, I’m stuck on a RecursionError: maximum recursion depth exceeded in __instancecheck__ in the dataloader. If I understood correctly, it should have something to do with batch_to_samples for non-Tensor types, but I wasn’t able to debug further.
Do you have any idea what the problem may be?
Below are a minimal example to reproduce the error and the full stack trace (for conciseness I omit the show_batch dispatching, since the error is the same, but of course I can add it if needed):

from fastai.vision.all import *
path = DATA_PATH_r / 'v1.0/crops_512'
def label_func(p):
    return Path(str(p).replace('images', 'masks'))

# custom type: image + title
class TitledImage(fastuple):
    @classmethod
    def create(cls, fn):
        # build an (image, filename) pair from a path
        return cls((PILImage.create(fn), fn.name))

    @classmethod
    def cast(cls, img, title):
        # helper to rebuild the pair from already-loaded pieces
        return cls((img, title))

    def show(self, ctx=None, **kwargs):
        img, title = self
        return show_image(img, title=title, ctx=ctx, **kwargs)


# custom block
def TitledImageBlock():
    return TransformBlock(type_tfms=TitledImage.create, batch_tfms=[IntToFloatTensor])

# data block
titleseg = DataBlock(
    blocks=(TitledImageBlock(), MaskBlock()),
    get_items=get_image_files, get_y=label_func,
    item_tfms=Resize(224),
)

dls = titleseg.dataloaders(path, bs=3)
dls.show_batch()

Stack trace:

RecursionError                            Traceback (most recent call last)
<ipython-input-5-620c521f2fe1> in <module>
----> 1 dls.show_batch()

~/anaconda3/envs/fastai/lib/python3.7/site-packages/fastai/data/core.py in show_batch(self, b, max_n, ctxs, show, unique, **kwargs)
    100         if b is None: b = self.one_batch()
    101         if not show: return self._pre_show_batch(b, max_n=max_n)
--> 102         show_batch(*self._pre_show_batch(b, max_n=max_n), ctxs=ctxs, max_n=max_n, **kwargs)
    103         if unique: self.get_idxs = old_get_idxs
    104 

~/anaconda3/envs/fastai/lib/python3.7/site-packages/fastai/data/core.py in _pre_show_batch(self, b, max_n)
     90         b = self.decode(b)
     91         if hasattr(b, 'show'): return b,None,None
---> 92         its = self._decode_batch(b, max_n, full=False)
     93         if not is_listy(b): b,its = [b],L((o,) for o in its)
     94         return detuplify(b[:self.n_inp]),detuplify(b[self.n_inp:]),its

~/anaconda3/envs/fastai/lib/python3.7/site-packages/fastai/data/core.py in _decode_batch(self, b, max_n, full)
     84         f1 = self.before_batch.decode
     85         f = compose(f1, f, partial(getattr(self.dataset,'decode',noop), full = full))
---> 86         return L(batch_to_samples(b, max_n=max_n)).map(f)
     87 
     88     def _pre_show_batch(self, b, max_n=9):

~/anaconda3/envs/fastai/lib/python3.7/site-packages/fastai/torch_core.py in batch_to_samples(b, max_n)
    599     if isinstance(b, Tensor): return retain_types(list(b[:max_n]), [b])
    600     else:
--> 601         res = L(b).map(partial(batch_to_samples,max_n=max_n))
    602         return retain_types(res.zip(), [b])
    603 

~/anaconda3/envs/fastai/lib/python3.7/site-packages/fastcore/foundation.py in map(self, f, gen, *args, **kwargs)
    152     def range(cls, a, b=None, step=None): return cls(range_of(a, b=b, step=step))
    153 
--> 154     def map(self, f, *args, gen=False, **kwargs): return self._new(map_ex(self, f, *args, gen=gen, **kwargs))
    155     def argwhere(self, f, negate=False, **kwargs): return self._new(argwhere(self, f, negate, **kwargs))
    156     def filter(self, f=noop, negate=False, gen=False, **kwargs):

~/anaconda3/envs/fastai/lib/python3.7/site-packages/fastcore/basics.py in map_ex(iterable, f, gen, *args, **kwargs)
    664     res = map(g, iterable)
    665     if gen: return res
--> 666     return list(res)
    667 
    668 # Cell

~/anaconda3/envs/fastai/lib/python3.7/site-packages/fastcore/basics.py in __call__(self, *args, **kwargs)
    649             if isinstance(v,_Arg): kwargs[k] = args.pop(v.i)
    650         fargs = [args[x.i] if isinstance(x, _Arg) else x for x in self.pargs] + args[self.maxi+1:]
--> 651         return self.func(*fargs, **kwargs)
    652 
    653 # Cell

... last 4 frames repeated, from the frame below ...

~/anaconda3/envs/fastai/lib/python3.7/site-packages/fastai/torch_core.py in batch_to_samples(b, max_n)
    599     if isinstance(b, Tensor): return retain_types(list(b[:max_n]), [b])
    600     else:
--> 601         res = L(b).map(partial(batch_to_samples,max_n=max_n))
    602         return retain_types(res.zip(), [b])
    603 

RecursionError: maximum recursion depth exceeded in __instancecheck__
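For what it's worth, my current (unverified) guess is that the string in the tuple is the culprit: batch_to_samples wraps non-Tensor values in L and recurses over their items, and fastcore's L wraps a plain str into a one-element list containing that same str, so the recursion never bottoms out. A stdlib-only analogue of that failure mode:

```python
def batch_to_samples_sim(b):
    # simplified analogue of fastai's batch_to_samples:
    # numbers stand in for tensors (the base case); everything else recurses
    if isinstance(b, (int, float)):
        return b
    # like fastcore's L, wrap a bare str into a one-element list -- a str
    # then effectively "contains" itself and the recursion never terminates
    items = [b] if isinstance(b, str) else list(b)
    return [batch_to_samples_sim(o) for o in items]

batch = ((1.0, "crop_000.png"), (2.0, "crop_001.png"))
try:
    batch_to_samples_sim(batch)
except RecursionError:
    print("RecursionError, just like in the trace above")
```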

Hi @lclissa,

Are you sure about type_tfms=TitledImage.create? Shouldn’t that be a Transform, like SiameseTransform?

Maybe it’s a good idea to first create tls = TfmdLists(files, tfm, splits=splits) as in Making-show-work, and later try to plug that into a DataBlock.