Empty dataset error (IndexError: index 0 is out of bounds for axis 0 with size 0)

Hi,
I am building a CNN to detect Parkinson's disease. The code runs fine on Kaggle, but on Ubuntu under WSL it throws the error below:

/home/gideongrinberg/.local/lib/python3.6/site-packages/fastai/data_block.py:442: UserWarning: Your training set is empty. If this is by design, pass `ignore_empty=True` to remove this warning.
  warn("Your training set is empty. If this is by design, pass `ignore_empty=True` to remove this warning.")
/home/gideongrinberg/.local/lib/python3.6/site-packages/fastai/data_block.py:445: UserWarning: Your validation set is empty. If this is by design, use `split_none()`
                 or pass `ignore_empty=True` when labelling to remove this warning.
  or pass `ignore_empty=True` when labelling to remove this warning.""")
Traceback (most recent call last):
  File "app/models/train.py", line 11, in <module>
    data = (ImageList.from_folder(path)               #Get data from path
  File "/home/gideongrinberg/.local/lib/python3.6/site-packages/fastai/data_block.py", line 463, in _inner
    self.train = ft(*args, from_item_lists=True, **kwargs)
  File "/home/gideongrinberg/.local/lib/python3.6/site-packages/fastai/data_block.py", line 292, in label_from_folder
    label_cls=label_cls, **kwargs)
  File "/home/gideongrinberg/.local/lib/python3.6/site-packages/fastai/data_block.py", line 287, in label_from_func
    return self._label_from_list([func(o) for o in self.items], label_cls=label_cls, **kwargs)
  File "/home/gideongrinberg/.local/lib/python3.6/site-packages/fastai/data_block.py", line 262, in _label_from_list
    label_cls = self.get_label_cls(labels, label_cls=label_cls, **kwargs)
  File "/home/gideongrinberg/.local/lib/python3.6/site-packages/fastai/data_block.py", line 251, in get_label_cls
    it = index_row(labels,0)
  File "/home/gideongrinberg/.local/lib/python3.6/site-packages/fastai/core.py", line 250, in index_row
    return a[idxs]
IndexError: index 0 is out of bounds for axis 0 with size 0

This is the code that builds the DataBunch:

from fastai.vision import *                           # Provides Path, ImageList, get_transforms, imagenet_stats

path = Path('./dataset/')

bs = 64
size = 224
num_workers = 0

tfms = get_transforms()                               # Standard data augmentation
data = (ImageList.from_folder(path)                   # Get image files from path
        .split_by_rand_pct()                          # Randomly hold out 20% of the data for validation
        .label_from_folder()                          # Label based on directory names
        .transform(tfms, size=size)                   # Apply data augmentation
        .databunch(bs=bs, num_workers=num_workers)    # Create ImageDataBunch
        .normalize(imagenet_stats))                   # Normalize using ImageNet stats
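
Since the warnings say the training and validation sets are empty, I assume ImageList.from_folder is simply not finding any files at that path when I run the script under WSL. Here is a minimal sanity check I can run from the same working directory as train.py (the .png/.jpg extensions are just my guess at what the images are; adjust as needed):

from pathlib import Path

path = Path('./dataset/')

# Sanity check: does the dataset path exist relative to the current
# working directory, and does it contain any image files at all?
print("cwd:", Path.cwd())
print("dataset exists:", path.exists())
if path.exists():
    print("top-level entries:", sorted(p.name for p in path.iterdir()))
    # Extensions below are an assumption; change them to match the actual files.
    image_files = list(path.rglob('*.png')) + list(path.rglob('*.jpg'))
    print("number of images found:", len(image_files))

Could the issue be that './dataset/' resolves relative to wherever I launch the script from (the traceback shows it is run as app/models/train.py), rather than relative to the script itself?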

Any help would be much appreciated!