[SOLVED] Can't Get dls.show_batch Working with Mid-level Data API

I am working on the Kaggle Ship Detection competition (an image segmentation problem).

Using 24_tutorial.siamese.ipynb as a reference, I am trying to build a mid-level data pipeline for the dataset, but I can't get dls.show_batch to work properly.

Here is the error I got. For the full code, see my notebook (section 4, TfmdLists to DataLoaders):

dls.show_batch();

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-155-988852a6dc95> in <module>
----> 1 dls.show_batch();

~/fastai/fastai2_walkthrough/fastai2/fastai2/data/core.py in show_batch(self, b, max_n, ctxs, show, unique, **kwargs)
     97         if b is None: b = self.one_batch()
     98         if not show: return self._pre_show_batch(b, max_n=max_n)
---> 99         show_batch(*self._pre_show_batch(b, max_n=max_n), ctxs=ctxs, max_n=max_n, **kwargs)
    100         if unique: self.get_idxs = old_get_idxs
    101 

~/fastai/fastai2_walkthrough/fastcore/fastcore/dispatch.py in __call__(self, *args, **kwargs)
     96         if not f: return args[0]
     97         if self.inst is not None: f = MethodType(f, self.inst)
---> 98         return f(*args, **kwargs)
     99 
    100     def __get__(self, inst, owner):

~/fastai/fastai2_walkthrough/fastai2/fastai2/data/core.py in show_batch(x, y, samples, ctxs, max_n, **kwargs)
     12 def show_batch(x, y, samples, ctxs=None, max_n=9, **kwargs):
     13     if ctxs is None: ctxs = Inf.nones
---> 14     if hasattr(samples[0], 'show'):
     15         ctxs = [s.show(ctx=c, **kwargs) for s,c,_ in zip(samples,ctxs,range(max_n))]
     16     else:

TypeError: 'NoneType' object is not subscriptable

Do any fellows have any idea how to make it work?

I had an error like yours. It looks like show_batch is not being @typedispatched.

@WaterKnight thanks for the input. I finally fixed the bug.

To explain a bit: the error I mentioned in my first post (the TypeError) suggests that the @typedispatch registration of show_batch was not taking effect. That part went away once I restarted the kernel.
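For intuition on why a stale registration causes this, here is a minimal sketch of the dispatch idea using Python's built-in functools.singledispatch (fastcore's @typedispatch generalizes this to dispatch on two arguments). The function and class names below are hypothetical, chosen only for illustration; nothing here imports fastai. The key point is that registration happens when the decorated code runs, so a stale kernel can keep calling an older implementation that lacks your type-specific version:

```python
from functools import singledispatch

# Generic fallback, analogous to the default show_batch in fastai's data/core.py.
@singledispatch
def show_batch(x, **kwargs):
    return "generic show_batch"

# Hypothetical marker type standing in for a custom batch type.
class SegmentationBatch:
    pass

# Registering a type-specific implementation; until this line runs
# (e.g. after a kernel restart re-imports your module), the generic
# fallback is used instead.
@show_batch.register
def _(x: SegmentationBatch, **kwargs):
    return "segmentation-specific show_batch"

print(show_batch(object()))             # generic show_batch
print(show_batch(SegmentationBatch()))  # segmentation-specific show_batch
```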

After that, I changed my custom data class a bit to make it work; the changes are highlighted below:

class LabeledImage(Tuple):
    def show(self, ctx=None, **kwargs):
        img, mask = self
        # [change 1] after batching we get plain Tensors, which can't show;
        # cast them back to TensorImage and TensorMask
        img, mask = TensorImage(img), TensorMask(mask)
        # [change 2] it's important to propagate ctx so both items
        # are drawn on the same plot
        ctx1 = img.show(ctx=ctx)
        return mask.show(ctx=ctx1, **kwargs)
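To make the ctx-threading pattern concrete without pulling in fastai, here is a framework-free sketch of the same idea: a tuple subclass whose show() re-wraps its elements and passes the plotting context from one element's show to the next. The Showable and LabeledPair names are hypothetical stand-ins (in fastai, ctx would be a matplotlib axis rather than a list):

```python
class Showable:
    # Stand-in for TensorImage/TensorMask: a payload that knows how to show itself.
    def __init__(self, name):
        self.name = name

    def show(self, ctx=None):
        # In fastai, ctx is a matplotlib axis; here we just collect names
        # so the ctx threading is observable.
        ctx = [] if ctx is None else ctx
        ctx.append(self.name)
        return ctx

class LabeledPair(tuple):
    def show(self, ctx=None):
        img, mask = self
        # change 1: re-wrap plain payloads into types that know how to show
        img, mask = Showable(img), Showable(mask)
        # change 2: propagate ctx so both items land on the same "plot"
        ctx1 = img.show(ctx=ctx)
        return mask.show(ctx=ctx1)

print(LabeledPair(("image", "mask")).show())  # ['image', 'mask']
```

If ctx were not propagated (i.e. mask.show were called with ctx=None), the image and mask would each end up on separate plots instead of overlaid on one.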

For those who want more details, you can refer to my updated notebook.

I had a similar error when following something similar to the Siamese Tutorial.

My explanation of the error is that I used the ToTensor() transform when building the DataLoaders, and that transform cannot decode() back to my custom class.

The solution here, overriding dls.show_batch() via the custom show() method, worked for me.
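To illustrate the encode/decode contract being described, here is a minimal sketch that does not import fastai. The MyPair and ToPlain names are hypothetical; ToPlain stands in for a transform like ToTensor. show_batch relies on decoding the batch back into a type that has a .show() method, so if the last transform can only encode and never decodes back to the custom class, the samples passed to show_batch can end up as None and you hit the TypeError above:

```python
class MyPair(tuple):
    # Custom type that knows how to display itself, like LabeledImage above.
    def show(self):
        return f"showing {self[0]} / {self[1]}"

class ToPlain:
    # encodes: custom type -> plain tuple (stands in for ToTensor's
    # conversion to raw tensors, which loses the custom type)
    def encodes(self, x):
        return tuple(x)

    # decodes: plain tuple -> custom type, so .show() is available again;
    # omitting this is the failure mode described in the post above
    def decodes(self, x):
        return MyPair(x)

tfm = ToPlain()
batch = tfm.encodes(MyPair(("img", "mask")))   # plain tuple, no .show()
decoded = tfm.decodes(batch)                   # custom type restored
print(decoded.show())  # showing img / mask
```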