Lesson 1: OSError: Is a directory: data/dogscats/valid/dogs/.ipynb_checkpoints

Working in the JupyterLab console (via the Clouderizer link) produces this error:


data = ImageClassifierData.from_paths(PATH, tfms=tfms_from_model(arch, sz))
learn = ConvLearner.pretrained(arch, data, precompute=True)
learn.fit(0.01, 2)


OSError Traceback (most recent call last)
in <module>()
1 arch=resnet34
2 data = ImageClassifierData.from_paths(PATH, tfms=tfms_from_model(arch, sz))
----> 3 learn = ConvLearner.pretrained(arch, data, precompute=True)
4 learn.fit(0.01, 2)

/content/clouderizer/fast.ai/fastai/courses/dl1/fastai/conv_learner.py in pretrained(cls, f, data, ps, xtra_fc, xtra_cut, custom_head, precompute, pretrained, **kwargs)
112 models = ConvnetBuilder(f, data.c, data.is_multi, data.is_reg,
113 ps=ps, xtra_fc=xtra_fc, xtra_cut=xtra_cut, custom_head=custom_head, pretrained=pretrained)
--> 114 return cls(data, models, precompute, **kwargs)
116 @classmethod

/content/clouderizer/fast.ai/fastai/courses/dl1/fastai/conv_learner.py in __init__(self, data, models, precompute, **kwargs)
98 if hasattr(data, 'is_multi') and not data.is_reg and self.metrics is None:
99 self.metrics = [accuracy_thresh(0.5)] if self.data.is_multi else [accuracy]
--> 100 if precompute: self.save_fc1()
101 self.freeze()
102 self.precompute = precompute

/content/clouderizer/fast.ai/fastai/courses/dl1/fastai/conv_learner.py in save_fc1(self)
179 predict_to_bcolz(m, self.data.fix_dl, act)
180 if len(self.activations[1])!=len(self.data.val_ds):
--> 181 predict_to_bcolz(m, self.data.val_dl, val_act)
182 if self.data.test_dl and (len(self.activations[2])!=len(self.data.test_ds)):
183 if self.data.test_dl: predict_to_bcolz(m, self.data.test_dl, test_act)

/content/clouderizer/fast.ai/fastai/courses/dl1/fastai/model.py in predict_to_bcolz(m, gen, arr, workers)
15 lock=threading.Lock()
16 m.eval()
---> 17 for x,*_ in tqdm(gen):
18 y = to_np(m(VV(x)).data)
19 with lock:

/usr/local/lib/python3.6/dist-packages/tqdm/_tqdm.py in __iter__(self)
935 """, fp_write=getattr(self.fp, 'write', sys.stderr.write))
--> 937 for obj in iterable:
938 yield obj
939 # Update and possibly print the progressbar.

/content/clouderizer/fast.ai/fastai/courses/dl1/fastai/dataloader.py in __iter__(self)
86 # avoid py3.6 issue where queue is infinite and can result in memory exhaustion
87 for c in chunk_iter(iter(self.batch_sampler), self.num_workers*10):
---> 88 for batch in e.map(self.get_batch, c):
89 yield get_tensor(batch, self.pin_memory, self.half)

/usr/lib/python3.6/concurrent/futures/_base.py in result_iterator()
584 # Careful not to keep a reference to the popped future
585 if timeout is None:
--> 586 yield fs.pop().result()
587 else:
588 yield fs.pop().result(end_time - time.time())

/usr/lib/python3.6/concurrent/futures/_base.py in result(self, timeout)
423 raise CancelledError()
424 elif self._state == FINISHED:
--> 425 return self.__get_result()
427 self._condition.wait(timeout)

/usr/lib/python3.6/concurrent/futures/_base.py in __get_result(self)
382 def __get_result(self):
383 if self._exception:
--> 384 raise self._exception
385 else:
386 return self._result

/usr/lib/python3.6/concurrent/futures/thread.py in run(self)
55 try:
---> 56 result = self.fn(*self.args, **self.kwargs)
57 except BaseException as exc:
58 self.future.set_exception(exc)

/content/clouderizer/fast.ai/fastai/courses/dl1/fastai/dataloader.py in get_batch(self, indices)
74 def get_batch(self, indices):
---> 75 res = self.np_collate([self.dataset[i] for i in indices])
76 if self.transpose: res[0] = res[0].T
77 if self.transpose_y: res[1] = res[1].T

/content/clouderizer/fast.ai/fastai/courses/dl1/fastai/dataloader.py in <listcomp>(.0)
74 def get_batch(self, indices):
---> 75 res = self.np_collate([self.dataset[i] for i in indices])
76 if self.transpose: res[0] = res[0].T
77 if self.transpose_y: res[1] = res[1].T

/content/clouderizer/fast.ai/fastai/courses/dl1/fastai/dataset.py in __getitem__(self, idx)
201 xs,ys = zip(*[self.get1item(i) for i in range(*idx.indices(self.n))])
202 return np.stack(xs),ys
--> 203 return self.get1item(idx)
205 def __len__(self): return self.n

/content/clouderizer/fast.ai/fastai/courses/dl1/fastai/dataset.py in get1item(self, idx)
195 def get1item(self, idx):
--> 196 x,y = self.get_x(idx),self.get_y(idx)
197 return self.get(self.transform, x, y)

/content/clouderizer/fast.ai/fastai/courses/dl1/fastai/dataset.py in get_x(self, i)
298 super().__init__(transform)
299 def get_sz(self): return self.transform.sz
--> 300 def get_x(self, i): return open_image(os.path.join(self.path, self.fnames[i]))
301 def get_n(self): return len(self.fnames)

/content/clouderizer/fast.ai/fastai/courses/dl1/fastai/dataset.py in open_image(fn)
266 raise OSError('No such file or directory: {}'.format(fn))
267 elif os.path.isdir(fn) and not str(fn).startswith("http"):
--> 268 raise OSError('Is a directory: {}'.format(fn))
269 elif isdicom(fn):
270 slice = pydicom.read_file(fn)

OSError: Is a directory: data/dogscats/valid/dogs/.ipynb_checkpoints

I removed the files from Google Drive, removed the project from Clouderizer, and redid the whole setup, yet lesson1.ipynb was somehow identical to the previous setup. How is that possible? How do I get a clean slate?

As I was writing this, opening the file from Drive with the Colaboratory app made it work. However, external code that I had added earlier to check GPU memory (before scrapping the Google Drive folder and removing and recreating the Clouderizer fast.ai project) was still present in the notebook. Why?

Also, the fresh notebook used for the Ubuntu !wget link to Clouderizer runs perpetually, showing [Thread-463(etc)] INFO com.clouderizer.client.utils.Logger - ERROR>2018/09/26 16:51:25 NOTICE: tutorials/fastai: Can't follow symlink without -L/--copy-links. What's up with that?

If you scroll to the very end of that loooong stack trace, it tells you what the error ultimately was: inside the folder that should contain only dog images, there was a folder called .ipynb_checkpoints. That confuses the data loader, because it doesn't expect to find folders inside the validation-set category folders.

This folder gets created in the same directory as the Jupyter notebook file you're currently working on, so I assume you accidentally started or opened Jupyter inside that data folder.
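One way to clean this up before building the data object is to delete any stray checkpoint folders under the dataset root. A minimal sketch (the `remove_checkpoint_dirs` helper is just an illustration, not part of fastai; adjust the `data/dogscats` path to your setup):

```python
import os
import shutil

def remove_checkpoint_dirs(root):
    """Recursively delete every .ipynb_checkpoints folder under root.

    Returns the list of removed directory paths.
    """
    removed = []
    for dirpath, dirnames, _ in os.walk(root, topdown=True):
        if ".ipynb_checkpoints" in dirnames:
            target = os.path.join(dirpath, ".ipynb_checkpoints")
            shutil.rmtree(target)
            dirnames.remove(".ipynb_checkpoints")  # don't descend into the deleted dir
            removed.append(target)
    return removed

# Run this before ImageClassifierData.from_paths(PATH, ...):
# remove_checkpoint_dirs("data/dogscats")
```

You could also do it in a notebook cell with `!rm -rf data/dogscats/*/*/.ipynb_checkpoints`, but the Python version reports what it removed.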