Using the current version of fastai, attempting to show a batch produces this error:
RuntimeError: Error when trying to collate the data into batches with fa_collate, at least two tensors in the batch are not the same size.
Mismatch found on axis 1 of the batch and is of type `TensorBBox`:
Item at index 0 has shape: torch.Size([6, 4])
Item at index 1 has shape: torch.Size([1, 4])
Please include a transform in `after_item` that ensures all data of type TensorBBox is the same size
If I install the versions listed on @muellerzr's Walk with fastai page (fastai==2.1.10 fastcore==1.3.13 wwf==0.0.8), I get an initial import error: ModuleNotFoundError: No module named 'torchvision.models.utils'
… for which there is a workaround, which then leaves me with this error for pascal.summary:
TypeError: no implementation found for 'torch.Tensor.__getitem__' on types that implement __torch_function__: [<class 'fastai.torch_core.TensorMultiCategory'>, <class 'fastai.vision.core.TensorBBox'>]
I ran into the same error and solved it. This issue was opened a long time ago but maybe this will help someone.
The reason for this is the shape mismatch between TensorBBox objects: not every image contains the same number of objects. This is handled automatically when you use BBoxBlock in a DataBlock. When building from a Datasets object instead, pass before_batch=bb_pad when creating the dataloaders. Also make sure you set add_na=True in MultiCategorize(); otherwise the first real class will be assigned to every image, because bb_pad labels the padded bounding boxes with class index 0.
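To make the add_na=True point concrete, here is a minimal sketch of the padding idea behind bb_pad, written in plain Python lists rather than fastai's TensorBBox (the function name pad_bboxes and the sample data are illustrative, not the fastai API):

```python
# Illustrative sketch (not fastai code): each sample is (bboxes, labels),
# and samples are padded so every item in the batch has the same number
# of boxes, which is what collation requires.

def pad_bboxes(samples, pad_idx=0):
    """Pad every sample's bbox list to the batch maximum."""
    max_boxes = max(len(bbs) for bbs, _ in samples)
    padded = []
    for bbs, lbls in samples:
        n_pad = max_boxes - len(bbs)
        # Padded boxes get all-zero coords and label index `pad_idx`.
        # This is why MultiCategorize(add_na=True) matters: with a
        # '#na#' class at index 0, the padding maps to "no object"
        # instead of stealing the first real class.
        padded.append((bbs + [[0, 0, 0, 0]] * n_pad,
                       lbls + [pad_idx] * n_pad))
    return padded

batch = [([[10, 10, 50, 50], [20, 20, 60, 60]], [1, 2]),
         ([[5, 5, 30, 30]], [1])]
padded = pad_bboxes(batch)
# Both items now have 2 boxes; the second item gained one all-zero
# box labelled with index 0.
```

With this picture in mind, the fix in fastai is just the two settings from the post above: before_batch=bb_pad on the dataloaders and add_na=True in MultiCategorize.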
I’m running into something SIMILAR, but not the same. I have created a custom DataLoader class and a custom TfmdDL class (just to override show_batch) to create my train and valid datasets, but when I try to use something like dls.one_batch() I get that error, but with a twist.
RuntimeError: Error when trying to collate the data into batches with fa_collate, at least two tensors in the batch are not the same size.
Mismatch found on axis 1 of the batch and is of type `TensorBBox`:
Item at index 0 has shape: torch.Size([3, 15, 4])
Item at index 1 has shape: torch.Size([3, 12, 4])
Please include a transform in `after_item` that ensures all data of type TensorBBox is the same size
Note that my BS == 3, so it seems that one_batch is trying to collate TWO batches together.
The batches do use bb_pad, and when I check datablock.summary() it correctly shows that both the TensorBBox and the labels are being padded, but I don’t understand why dls.one_batch() is trying to collate two batches together…
Anyone have any ideas?
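One possibility (a guess, since the custom DataLoader code isn't shown): the shapes in the error, [3, 15, 4] and [3, 12, 4], are per-*item* shapes, so each item already seems to carry the batch dimension of 3, and fa_collate is stacking pre-batched tensors. A fastai-free sketch of that double-stacking effect, using nested lists in place of tensors:

```python
# Illustrative sketch (no fastai): collation essentially stacks items
# along a new leading axis. If each item is one image's boxes, shape
# (n, 4), a batch of 3 collates to (3, n, 4). But if the loader already
# yields pre-batched items of shape (3, n, 4), collation stacks them
# again, and the size check then compares e.g. (3, 15, 4) vs (3, 12, 4)
# on axis 1 -- which matches the error in the post above.

def shape(nested):
    """Compute the shape of a regularly nested list."""
    s = []
    while isinstance(nested, list):
        s.append(len(nested))
        nested = nested[0]
    return tuple(s)

one_box = [0, 0, 1, 1]
item_ok = [one_box] * 15                # one image's boxes: (15, 4)
item_prebatched = [[one_box] * 15] * 3  # already a batch: (3, 15, 4)

# Stacking (what collation does) adds a leading batch axis:
batch_ok = [item_ok] * 3            # (3, 15, 4) -- expected
batch_doubled = [item_prebatched] * 3   # (3, 3, 15, 4) -- a batch of batches
```

If that is what's happening, checking the shape of a single item from the underlying dataset (before any collation) should show whether the custom DataLoader is yielding already-batched tensors.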