Essentially, I have a folder of images and a CSV with the columns "fn_col, label_col", and I want to train an image classification model. In the old fastai v1 world I had a simple data-setup workflow that went as follows:
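Roughly this kind of thing, using the v1 data block API (paths, column names and sizes below are placeholders, not my actual setup):

```python
from fastai.vision import *  # fastai v1 imports

path = Path('data')  # folder containing images/ and labels.csv

data = (ImageList.from_csv(path, 'labels.csv', folder='images', cols='fn_col')
        .split_by_rand_pct(0.2)          # random 80/20 train/valid split
        .label_from_df(cols='label_col') # labels from the CSV's label column
        .transform(get_transforms(), size=224)
        .databunch(bs=64)
        .normalize(imagenet_stats))
```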
TypeError: expected str, bytes or os.PathLike object, not numpy.ndarray
I have a feeling I'm doing multiple things wrong here; I'm pretty sure my labels, resizing and transforms are all set up incorrectly. I'd really appreciate any pointers!
/usr/local/lib/python3.7/dist-packages/fastdebug/fastai/datasets.py in (.0)
     59     t = getattr(self, 'types', [])
     60     if t is None or len(t) == 0: raise Exception("The stored dataset contains no items and self.types has not been setup yet")
---> 61     types = L(t if is_listy(t) else [t] for t in self.types).concat().unique()
     62     self.pretty_types = '\n'.join([f'  - {t}' for t in types])
One last question: since I had to put the resize step back into the initial DataBlock call, does that mean that every time I want to try a different image size or batch size I'll need to instantiate the whole block, as opposed to just the dataloader? It's not a big deal! Just wondering if there's an easy way to pull the resizing into the dataloader creation step.
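The workaround I'm using for now is a small factory that rebuilds the block each time, which is the pattern I've seen used for progressive resizing; wondering if there's something lighter-weight. (File paths and column names below are placeholders for my actual setup.)

```python
from fastai.vision.all import *  # fastai v2 imports
import pandas as pd

df = pd.read_csv('data/labels.csv')  # columns: fn_col, label_col

def get_dls(bs, size):
    """Rebuild the DataBlock for a given image size and batch size."""
    dblock = DataBlock(
        blocks=(ImageBlock, CategoryBlock),
        get_x=ColReader('fn_col', pref='data/images/'),
        get_y=ColReader('label_col'),
        splitter=RandomSplitter(valid_pct=0.2, seed=42),
        item_tfms=Resize(size),        # resize baked into the block
        batch_tfms=aug_transforms(),
    )
    return dblock.dataloaders(df, bs=bs)

dls = get_dls(bs=64, size=224)  # later: get_dls(bs=32, size=448)
```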