OSError: Is a directory

Hi everyone! I’m a new fast.ai student and have started with deep learning course #1. After watching lectures 1 and 2, I have been trying to apply pre-trained models to MRI brain scans.

The task is to use these MRI images to predict whether a given image contains a tumor or not.

The data set (BraTS 2013) contains 3,584 files, each a grayscale .png image (256x128) of the right half of the brain. The data set has been split into train and valid folders, and the files in each have been further split into two subfolders, tumor and notumor, according to whether the image contains a tumor or not.
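In other words, the folder layout is:

```
train/
  tumor/
  notumor/
valid/
  tumor/
  notumor/
```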

To begin with, I uploaded the folder and unzipped it in Google Colab. However, when I try to train the model, I get the error shown in the image below:

I’m not experienced with Python or deep learning, so I can’t make much sense of the error. So far, I have tried the following:

  1. Changing the size (sz) variable,
  2. Changing model architecture to vgg16,
  3. Setting precompute = False

but the error doesn’t go away (my setup is sketched below for reference). I’d be very grateful if someone could help with this.
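This is essentially the lesson-1 code (a rough sketch; the path and fit parameters are just what I’ve been experimenting with):

```python
# fastai v0.7, as used in lessons 1 and 2
from fastai.conv_learner import *

PATH = 'data/'   # dataset root containing train/ and valid/
sz = 128         # the size variable I've been changing
arch = resnet34  # also tried vgg16

data = ImageClassifierData.from_paths(PATH, tfms=tfms_from_model(arch, sz))
learn = ConvLearner.pretrained(arch, data, precompute=True)  # also tried precompute=False
learn.fit(0.01, 2)
```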

Thanks in advance :slight_smile:

The error appears to be raised while reading an image. You may have a dud image somewhere, or something in your paths that looks like an image but that cv2 can’t read.

You can try:

  a) removing the source images and replacing them with the originals again,
  b) visually inspecting all the images,
  c) writing a Python loop that cv2.imread’s every file in the image path (see the sketch below),
  d) inserting debug statements into the fastai source to produce a better error message (i.e. one that names the offending file).

Don’t be too shy about programming inexperience; StackOverflow usually has a way to approach almost any problem. E.g. take a basic imread loop, adding the .astype(np.float32)/255 from your error to the imread statement.
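Something like this should finger the culprit (a minimal sketch; PATH is a placeholder for wherever your images live):

```python
import os
import cv2
import numpy as np

PATH = 'data/'  # placeholder: point this at your dataset root

for root, dirs, files in os.walk(PATH):
    for fname in files:
        fpath = os.path.join(root, fname)
        img = cv2.imread(fpath)
        if img is None:
            # cv2 returns None for anything it can't decode
            print('unreadable:', fpath)
        else:
            # the same conversion your traceback shows
            _ = img.astype(np.float32) / 255
```

Anything it prints (or any stray non-image file or folder inside tumor/notumor) is a likely suspect.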

Visually inspecting all images in the folder worked! Thanks!
It turns out some hidden desktop.ini files and .ipynb_checkpoints folders had crept in (the folders also explain the “Is a directory” error). Deleted them and everything works perfectly fine.
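In case anyone else hits this, a quick way to clear such stragglers out (a sketch, assuming a data/ root and that all real images are .png; double-check before running, since it deletes files):

```python
import os
import shutil

PATH = 'data/'  # placeholder: your dataset root

for root, dirs, files in os.walk(PATH):
    # drop Jupyter checkpoint folders before descending into them
    for d in list(dirs):
        if d == '.ipynb_checkpoints':
            shutil.rmtree(os.path.join(root, d))
            dirs.remove(d)
    # drop anything that isn't an actual .png image
    for fname in files:
        if not fname.lower().endswith('.png'):
            os.remove(os.path.join(root, fname))
```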

Thanks again :slight_smile: