ImageDataBunch mixing train and test files?

Hi everyone!

I'm new to fastai, and I was trying to load my own images for a classification network. I'm working on Linux (shame on me, I know). The folder structure looks like:
Data
├── train
│   ├── 01
│   ├── 02
│   ├── 03
│   └── …
└── test
    ├── 01
    ├── 02
    ├── 03
    └── …

In train, there are 5 folders, 01 to 05, with 1313 files each, which gives a training set of 6565 files.

In test, there are 5 folders, 01 to 05, with 890 files each, which gives a test set of 4450 files. I want to use this test set as a labeled test set through the 'valid' parameter.

I execute these lines:
np.random.seed(1234)
data = ImageDataBunch.from_folder(path, train="train", valid_pct=0, size=224, num_workers=1, bs=batch_size).normalize(imagenet_stats)

Then, executing data.train_ds gives:
LabelList (11015 items)
x: ImageList
Image (3, 224, 224),Image (3, 224, 224),Image (3, 224, 224),Image (3, 224, 224),Image (3, 224, 224)
y: CategoryList
01,01,01,01,01
Path: G:\LinuxFolder\Data\hintOrig

And executing data.classes gives:
['01', '02', '03', '04', '05']

So it is taking both the train and test folders as the training set (6565 + 4450 = 11015, which matches the item count above).

What am I missing here? Thanks a lot!

I finally solved the problem by using these lines instead:

data_train = (ImageList.from_folder(path/'train')   # take images only from the train folder
              .split_by_rand_pct(seed=1983)          # random split, 20% validation by default
              .label_from_folder()                   # labels come from the folder names 01-05
              .add_test_folder(path/'test')          # attach the test folder as an unlabeled test set
              .transform(size=224)
              .databunch(path=path)
              .normalize(imagenet_stats))
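
A quick sanity check after building the bunch; the counts in the comments are what I expect given the default 20% validation split, not pasted output:

print(len(data_train.train_ds))   # should be ~80% of the 6565 train images
print(len(data_train.valid_ds))   # should be ~20% of the 6565 train images
print(len(data_train.test_ds))    # should be the 4450 images from the test folder (unlabeled)
print(data_train.classes)         # should be ['01', '02', '03', '04', '05']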

I still think there is something I'm missing about the other way.
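
My best guess at what went wrong: from what I can tell from the docs, when you pass valid_pct to ImageDataBunch.from_folder (even valid_pct=0), it collects every image under path and splits it randomly, ignoring the train/test folder names, which would explain the 11015 items. If that's right, then something like this untested sketch, which passes the test folder name through the valid argument instead of using valid_pct, should keep the folders separate and use test as the labeled validation set:

np.random.seed(1234)
data = (ImageDataBunch.from_folder(path, train="train", valid="test",  # split by folder names instead of a random percentage
                                   size=224, num_workers=1, bs=batch_size)
        .normalize(imagenet_stats))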

Hope this can help somebody else!