Lesson 2 (v3): IndexError: index 0 is out of bounds for axis 0 with size 0

I have an image dataset properly labeled and split into train, test, and valid folders, but when I create an ImageDataBunch I get this error:

IndexError: index 0 is out of bounds for axis 0 with size 0

There’s another post in the forums about this same issue with a possible solution, but there’s no explanation of the error. Here’s the link to the other forum post.
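For reference, a quick way to check what fastai is actually seeing (a sketch, assuming fastai v1; the dataset folder name is a placeholder):

from fastai.vision import *

path = Path('data/my_dataset')   # placeholder; should contain train/, valid/, test/
print(path.exists())             # False -> the path itself is wrong
print(path.ls())                 # fastai patches Path.ls; should list the subfolders
print(len(get_image_files(path/'train', recurse=True)))   # 0 -> no images found

If any of these come up empty, from_folder has nothing to index, which surfaces as exactly this IndexError.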

I have the same problem on Kaggle.

For me it worked when I changed the path specification from 'content…' to '/content…'.
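That is, an absolute path, so it doesn’t depend on the current working directory (a sketch, assuming Colab; the dataset folder is a placeholder):

from pathlib import Path

path = Path('/content/my_dataset')   # absolute; a bare 'content/...' is resolved
                                     # relative to the working directory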

In my case, it was about the folder names.

MNIST uses the folder names 'training' and 'testing' instead of 'train' and 'valid', which are the defaults for from_folder.

It worked after I assigned the folder names:
ImageDataBunch.from_folder(path, train='training', valid='testing', ds_tfms=tfms, size=26)

Reference from help(ImageDataBunch.from_folder):

from_folder(path: Union[pathlib.Path, str], train: Union[pathlib.Path, str] = 'train', valid: Union[pathlib.Path, str] = 'valid', valid_pct=None, classes: Collection = None, **kwargs: Any) -> 'ImageDataBunch' method of builtins.type instance
    Create from imagenet style dataset in path with train, valid, test subfolders (or provide valid_pct).

What worked for me is specifying the path as follows (working on Colab):

from pathlib import Path

root_dir = '/content/drive/My Drive/fastai/'
path = Path(root_dir) / 'sleeves_lenghts/train'   # build a Path instead of concatenating strings

I have added the test dataset to my databunch as follows:

data = ImageDataBunch.from_df(path_train_img, train_clf, ds_tfms=get_transforms(), size=256, bs=bs, test=path_test_img).normalize(imagenet_stats)

I now want to get the predictions on the test set, so I’m using
preds, y, losses = learn.get_preds(ds_type=DatasetType.Test, with_loss=True)

which returns the same error as you get:
IndexError: index 0 is out of bounds for axis 0 with size 0

Is there any insight as to why this happens and how to fix it? By the way, my test data is unlabelled.
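One thing worth checking, assuming fastai v1: an unlabelled test set has no targets to compute a loss against, so with_loss=True may be the culprit, and the same IndexError can appear when the test set simply came up empty. A sketch:

print(len(data.test_ds))   # errors or prints 0 if the test set didn't load

preds, y = learn.get_preds(ds_type=DatasetType.Test)   # skip with_loss for unlabelled data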

I had the same problem. The path that I had specified was incorrect. It was pointing to the train folder instead of the parent of the train folder.
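In code, the difference looks like this (a sketch with a placeholder dataset folder):

from fastai.vision import *

# wrong: path points at the train folder itself
# path = Path('data/my_dataset/train')

# right: path points at the parent, which contains train/ and valid/
path = Path('data/my_dataset')
data = ImageDataBunch.from_folder(path, ds_tfms=get_transforms(), size=224)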

You are a saviour! Thanks for this.

I had the same error when trying to load the MNIST dataset. I was following this tutorial: https://docs.fast.ai/tutorial.data.html.

What worked for me was to change the folder names inside the mnist_png folder:

cd /home/jupyter/.fastai/data/mnist_png
mv training/ train/
mv testing/ valid/

Then, loading with

from fastai.vision import *

path = untar_data(URLs.MNIST)
tfms = get_transforms(do_flip=False)   # define transforms; don't mirror digits
data = (ImageList.from_folder(path)
        .split_by_folder()
        .label_from_folder()
        .transform(tfms, size=32)
        .databunch()
        .normalize(imagenet_stats))

worked.

Please note that this leaves the dataset with no samples for testing, since the original testing folder is now used as the validation set.
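An alternative that avoids renaming the folders, assuming fastai v1's data block API, is to pass MNIST's original folder names to split_by_folder:

from fastai.vision import *

path = untar_data(URLs.MNIST)
tfms = get_transforms(do_flip=False)
data = (ImageList.from_folder(path)
        .split_by_folder(train='training', valid='testing')   # keep MNIST's names
        .label_from_folder()
        .transform(tfms, size=32)
        .databunch()
        .normalize(imagenet_stats))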

I also got stuck on the same error.