How to create ImageDataBunch from datasets?

Could someone tell me how to create an ImageDataBunch with recent versions of the fastai library? I used to do something like this:

from fastai.vision import *  # fastai v1: ImageDataBunch, create_cnn, get_transforms, models

trn_ds = CustomDataset(path, train=True)    # training images generated on the fly
val_ds = CustomDataset(path, train=False)   # validation images generated on the fly
tst_ds = TestDataset(path)
bunch = ImageDataBunch.create(trn_ds, val_ds, tst_ds, ds_tfms=get_transforms())
learn = create_cnn(bunch, models.resnet50)
learn.fit_one_cycle(1)

The CustomDataset generates images on the fly instead of reading them from persistent storage, so I can't (or can I?) use the new data API that expects images to sit inside folders. Also, I tend to build things manually instead of using factory functions :slight_smile:
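For illustration, a simplified (hypothetical) version of such a dataset might look like this; the real generation logic is omitted:

import torch
from torch.utils.data import Dataset

class CustomDataset(Dataset):
    """Synthesizes samples in __getitem__ instead of reading files from disk."""
    def __init__(self, path, train=True, n=1000):
        self.path, self.train, self.n = path, train, n

    def __len__(self):
        return self.n

    def __getitem__(self, i):
        img = torch.rand(3, 224, 224)  # stand-in for the real image generation
        label = i % 2                  # stand-in label
        return img, label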

However, this is not possible anymore. Probably I just don't understand the new API, but is it possible to use these "low-level" constructors to build data bunches? It seems these functions are not meant to be used directly and are more of a private API. Could you please advise how one can construct an ImageDataBunch from custom datasets and transformations?

I know this request is probably not relevant for most of the library's users, who have the standard folder structure and data processing pipelines, but I would really appreciate it if somebody could give a hint on this topic.


You can't. You should build a custom input ItemList and/or target ItemList, then use the data block API.
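For example, something roughly along these lines (an untested sketch against the fastai v1 data block API; GeneratedImageList and make_image are made-up names, and your generation and labelling logic will differ):

from fastai.vision import *

class GeneratedImageList(ImageList):
    "ItemList whose items are plain indices; images are produced in get()."
    def __init__(self, items, generator=None, **kwargs):
        super().__init__(items, **kwargs)
        self.generator = generator
        self.copy_new.append('generator')  # keep the generator when fastai copies/splits the list

    def get(self, i):
        t = self.generator(self.items[i])  # expected: float tensor of shape (3, H, W) in [0, 1]
        return Image(t)

def make_image(idx):
    return torch.rand(3, 224, 224)  # stand-in for your on-the-fly generation

data = (GeneratedImageList(list(range(1000)), generator=make_image)
        .split_by_rand_pct(0.2)
        .label_from_func(lambda idx: idx % 2)   # stand-in labels
        .transform(get_transforms(), size=224)
        .databunch(bs=32))                      # this is an ImageDataBunch

learn = cnn_learner(data, models.resnet50)      # cnn_learner was called create_cnn in older releases

The key piece is overriding get() so the items never have to be files on disk; splitting, labelling, transforms and the final databunch all come from the data block API.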

Ok, understood, thank you for the advice.