Why "out = model(*xb)" in loss_batch()?

The context is a custom Dataset that generates a tuple as input to the model. (In case it matters, the tuple contains an image and a statistic about the image as float.)

one_batch() correctly returns a batched tuple of two tensors, like (32 images, 32 floats).

My sequential model has a first layer that separates the tuple and returns just the image to the next layer.

However, fastai unpacks the tuple into an argument list, sending two positional arguments to the model. The model then fails, because Sequential’s forward method expects a single input.

Can anyone tell me…

  1. What is the rationale behind unpacking the tuple into extra parameters when sending it to forward()?

  2. Is there a better way to accomplish this task?

Thanks for your help!

Do you need those float values in your model? If not, just update the __getitem__ in your Dataset to return only the image. If you do need them, Sequential won’t work, because Sequential does not handle multiple inputs; you need to create a model class with nn.Module.

I think the rationale is that it is supposed to enable multiple inputs in models (like siamese networks). Whenever xb is a single tensor, it is turned into [xb] before getting passed to the model, so the single-input case works normally. In your case, you could just define your forward function so that it takes two parameters; it should work just fine. It is a design choice, but I think it makes sense that lists get unpacked before reaching the forward function.
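To make the suggestion concrete, here is a minimal sketch (not the original poster's code) of a model whose forward takes the two batch elements that fastai unpacks from the Dataset tuple. The layer sizes and the `ImageWithStat` name are hypothetical; the point is just the two-argument forward signature.

```python
import torch
import torch.nn as nn

class ImageWithStat(nn.Module):
    def __init__(self):
        super().__init__()
        # hypothetical body: a tiny conv stack for 1x8x8 images
        self.body = nn.Sequential(
            nn.Conv2d(1, 4, 3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
        )
        # head combines the image features with the scalar statistic
        self.head = nn.Linear(4 + 1, 1)

    def forward(self, img, stat):
        # fastai calls model(*xb), so the (image, stat) tuple
        # arrives here as two separate arguments
        feats = self.body(img)
        return self.head(torch.cat([feats, stat.unsqueeze(1)], dim=1))
```

With a batch of 32 images and 32 floats, `ImageWithStat()(images, stats)` runs without any unpacking error.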

As you suggest, I worked around by making a custom Module that takes two parameters and internally calls the Sequential on one of them.

Reasonable theory about design tradeoffs. However, it means that fastai can’t send list inputs to Sequential, the usual way to compose layers.

In any case, you could easily create a universal Module to prepend to your Sequential that merely repacks its multiple parameters.
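One caveat with prepending: plain nn.Sequential’s own forward also accepts only one input, so in practice the repacking module has to wrap the Sequential rather than sit inside it as a literal first layer. A hedged sketch (module names are made up):

```python
import torch
import torch.nn as nn

class TuplePack(nn.Module):
    """Repack the arguments fastai unpacked into a single tuple input."""
    def __init__(self, inner):
        super().__init__()
        self.inner = inner  # e.g. an nn.Sequential

    def forward(self, *xs):
        return self.inner(xs)

class FirstOfTuple(nn.Module):
    # a first layer that separates the tuple, as in the original post
    def forward(self, tup):
        return tup[0]
```

Usage: `TuplePack(nn.Sequential(FirstOfTuple(), ...))` lets the rest of the Sequential stay unchanged.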

That is IMO a limitation of PyTorch, not fastai. PyTorch’s Sequential should accept layers that take more than one input, as long as they can be chained properly.