If the problem is only in predict, I'd create a subclass of Learner and override predict (or get_preds, if the problem comes from there), like we do for the language model (predict) or for TabularLearner (get_preds).
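Something along these lines, with a stand-in Learner just to show the pattern (the real class is in fastai.learner, and the dict-shaped item here is only an assumed example):

```python
# Minimal stand-in for fastai's Learner, only to illustrate the
# subclass-and-override pattern; not the real API.
class Learner:
    def predict(self, item):
        # pretend this is the default implementation that chokes
        # on the custom data type
        raise TypeError("default predict can't handle this item")

class DictLearner(Learner):
    """Hypothetical subclass that overrides only the broken method,
    the same way the LM learner overrides predict and TabularLearner
    overrides get_preds in fastai."""
    def predict(self, item):
        # custom handling for dict-shaped items (assumed data format)
        return {k: f"pred_for_{v}" for k, v in item.items()}

learn = DictLearner()
print(learn.predict({"image": "img0"}))
```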
While your Bucket solution works, I feel it's a lot of magic.
Are your images already the same size? I'm failing to see where you modified the collate function of your dataloader, and without changing that it can't collate images of different sizes.
I see that, but the dataloader should not even work if the images are of different sizes, before the learner gets to say anything; this is what I'm failing to understand.
Can you explain how, in your case, a batch is collated when, let's say, I have an image of size (224, 224) and another of size (512, 512)?
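For context, a collate function has to produce one stacked batch, so mixed sizes need padding (or bucketing). Here's a sketch of a pad-to-max collate, using nested lists in place of real tensors so it stands alone (a real version would use torch.nn.functional.pad and torch.stack):

```python
def pad_collate(batch, fill=0):
    """Pad each 2D 'image' (a list of lists) to the max height and
    width found in the batch, so all items end up the same size."""
    max_h = max(len(img) for img in batch)
    max_w = max(len(row) for img in batch for row in img)
    padded = []
    for img in batch:
        # pad each row to max_w, then add empty rows up to max_h
        rows = [row + [fill] * (max_w - len(row)) for row in img]
        rows += [[fill] * max_w for _ in range(max_h - len(rows))]
        padded.append(rows)
    return padded

small = [[1, 2], [3, 4]]                      # a 2x2 "image"
big = [[5, 6, 7], [8, 9, 10], [11, 12, 13]]   # a 3x3 "image"
batch = pad_collate([small, big])
# both items now have shape 3x3 and can be stacked
```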
Okay, this kinda worked, but some other problems showed up.
The first problem was the above-mentioned find_bs; it wasn't as trivial to solve as I was expecting, and I ended up having to monkey-patch it anyway…
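The patch is along these lines; this is a stand-in for find_bs (the real one lives in fastai.torch_core), with the dict branch being the part I had to add:

```python
def find_bs(b):
    """Stand-in for fastai's find_bs: walk into the batch until
    something with a batch dimension is found. The dict branch is
    the case the stock version doesn't cover."""
    if isinstance(b, dict):
        # recurse into the first value of a dict batch
        return find_bs(next(iter(b.values())))
    if isinstance(b, tuple):
        # standard (x, y) batch: look at x
        return find_bs(b[0])
    # tensor-like leaf: batch size is the length along dim 0
    return len(b)

# a batch whose inputs are a dict of fields, as in my pipeline
batch = ({"image": [1, 2, 3, 4], "mask": [0, 1, 0, 1]}, [9, 9, 9, 9])
print(find_bs(batch))  # 4
```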
Another problem is in GatherPredsCallback, here:
```
     95 if self.with_input: self.inputs = detuplify(to_concat(self.inputs, dim=self.concat_dim))
     96 if not self.save_preds: self.preds = detuplify(to_concat(self.preds, dim=self.concat_dim))
---> 97 if not self.save_targs: self.targets = detuplify(to_concat(self.targets, dim=self.concat_dim))
     98 if self.with_loss: self.losses = to_concat(self.losses)
```
targets is a list of dicts, and to_concat uses is_listy…
is_listy and is_iter are the reason I created the buckets in the first place; now trying to remove the buckets puts me back in the initial condition, just in a different place.
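To show concretely where it trips, here are stand-ins for is_listy and a dict-aware concat (the real functions are in fastai.torch_core; this is just the shape of the problem, and concat_targets is a hypothetical helper):

```python
def is_listy(x):
    """Stand-in for fastai's is_listy (the real one also matches
    slices and generators): a dict is NOT listy, so to_concat
    never recurses into dict targets."""
    return isinstance(x, (list, tuple))

def concat_targets(targets):
    """Hypothetical dict-aware concat for a list of target dicts:
    regroup per key, then gather each key's values. A real version
    would call fastai's to_concat on each per-key group."""
    first = targets[0]
    if isinstance(first, dict):
        return {k: [t[k] for t in targets] for k in first}
    raise TypeError("expected a list of dicts")

targets = [{"boxes": [0, 0, 1, 1], "label": 3},
           {"boxes": [2, 2, 5, 5], "label": 7}]
out = concat_targets(targets)
print(out["label"])  # [3, 7]
```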
Any ideas? Buckets are starting to look not so bad after all…