Inference on test images

There is a guide on how to predict on a single image.

However, how do you do inference on a whole test set (say, 20k images)?
I don’t have the training data anymore; it was a big set, so I deleted the files and don’t want to download them again. I do have the weights saved and the classes saved. Is there a way to do it?
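Whatever library ends up doing the predictions, inference on 20k images is normally done in batches rather than one image at a time. A minimal, library-agnostic sketch of just the batching logic (the `batched` helper and the file names are made up for illustration):

```python
def batched(items, batch_size):
    """Yield successive fixed-size chunks of a list of items."""
    for start in range(0, len(items), batch_size):
        yield items[start:start + batch_size]

# Hypothetical test set of 20k image paths, predicted in batches of 64
paths = [f"test/img_{i}.jpg" for i in range(20_000)]
n_batches = sum(1 for _ in batched(paths, 64))
print(n_batches)  # 313: 312 full batches plus one partial batch of 32
```

Each chunk would then be handed to the model in one forward pass, which is far faster than 20k single-image calls.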

I tried this code:

    empty_data = ImageDataBunch.single_from_classes(path, data_classes, tfms=get_transforms()).normalize(imagenet_stats)
    learn = create_cnn(empty_data, models.resnet34)
    learn = learn.load('stage-1')

where data_classes is the list of classes saved from training on the training set.

However, I got an error when running:

    preds = learn.TTA()


    TypeError                                 Traceback (most recent call last)
    <ipython-input-16-47212b3e53d5> in <module>()
    ----> 1 preds = learn.TTA()

    ~/.anaconda3/lib/python3.7/site-packages/fastai/vision/ in _TTA(learn, beta, scale, ds_type, with_loss)
         32 def _TTA(learn:Learner, beta:float=0.4, scale:float=1.35, ds_type:DatasetType=DatasetType.Valid, with_loss:bool=False) -> Tensors:
    ---> 33     preds,y = learn.get_preds(ds_type)
         34     all_preds = list(learn.tta_only(scale=scale, ds_type=ds_type))
         35     avg_preds = torch.stack(all_preds).mean(0)

    ~/.anaconda3/lib/python3.7/site-packages/fastai/ in get_preds(self, ds_type, with_loss, n_batch, pbar)
        209         lf = self.loss_func if with_loss else None
        210         return get_preds(self.model, self.dl(ds_type), cb_handler=CallbackHandler(self.callbacks),
    --> 211                          activ=_loss_func2activ(self.loss_func), loss_func=lf, n_batch=n_batch, pbar=pbar)
        213     def pred_batch(self, ds_type:DatasetType=DatasetType.Valid, pbar:Optional[PBar]=None) -> List[Tensor]:

    ~/.anaconda3/lib/python3.7/site-packages/fastai/ in get_preds(model, dl, pbar, cb_handler, activ, loss_func, n_batch)
         36     "Tuple of predictions and targets, and optional losses (if `loss_func`) using `dl`, max batches `n_batch`."
         37     res = [ for o in
    ---> 38            zip(*validate(model, dl, cb_handler=cb_handler, pbar=pbar, average=False, n_batch=n_batch))]
         39     if loss_func is not None: res.append(calc_loss(res[0], res[1], loss_func))
         40     if activ is not None: res[0] = activ(res[0])

    ~/.anaconda3/lib/python3.7/site-packages/fastai/ in validate(model, dl, loss_func, cb_handler, pbar, average, n_batch)
         47     with torch.no_grad():
         48         val_losses,nums = [],[]
    ---> 49         for xb,yb in progress_bar(dl, parent=pbar, leave=(pbar is not None)):
         50             if cb_handler: xb, yb = cb_handler.on_batch_begin(xb, yb, train=False)
         51             val_losses.append(loss_batch(model, xb, yb, loss_func, cb_handler=cb_handler))

    ~/.anaconda3/lib/python3.7/site-packages/fastprogress/ in __iter__(self)
         63         self.update(0)
         64         try:
    ---> 65             for i,o in enumerate(self._gen):
         66                 yield o
         67                 if self.auto_update: self.update(i+1)

    ~/.anaconda3/lib/python3.7/site-packages/fastai/ in __iter__(self)
        114     def __iter__(self):
        115         "Process and returns items from `DataLoader`."
    --> 116         for b in self.dl:
        117             y = b[1][0] if is_listy(b[1]) else b[1]
        118             if not self.skip_size1 or y.size(0) != 1:

    ~/.anaconda3/lib/python3.7/site-packages/torch/utils/data/ in __next__(self)
        635                 self.reorder_dict[idx] = batch
        636                 continue
    --> 637             return self._process_next_batch(batch)
        639         next = __next__  # Python 2 compatibility

    ~/.anaconda3/lib/python3.7/site-packages/torch/utils/data/ in _process_next_batch(self, batch)
        656         self._put_indices()
        657         if isinstance(batch, ExceptionWrapper):
    --> 658             raise batch.exc_type(batch.exc_msg)
        659         return batch

    TypeError: Traceback (most recent call last):
      File "/home/nbuser/.anaconda3/lib/python3.7/site-packages/torch/utils/data/", line 138, in _worker_loop
        samples = collate_fn([dataset[i] for i in batch_indices])
      File "/home/nbuser/.anaconda3/lib/python3.7/site-packages/torch/utils/data/", line 138, in <listcomp>
        samples = collate_fn([dataset[i] for i in batch_indices])
      File "/home/nbuser/.anaconda3/lib/python3.7/site-packages/fastai/vision/", line 237, in __getitem__
        x,y = self.ds[idx]
      File "/home/nbuser/.anaconda3/lib/python3.7/site-packages/fastai/", line 49, in __getitem__
        x = self._get_x(i)
      File "/home/nbuser/.anaconda3/lib/python3.7/site-packages/fastai/", line 44, in _get_x
        def _get_x(self,i):   return self.x[i]
    TypeError: 'NoneType' object is not subscriptable

Can you create a DataBunch with a test folder but without a train folder or train labels? Can you share the code for how you did it? Thanks!

I don’t think so. But you don’t need to train the model again. Just load the weights and start predicting.
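Once the weights are loaded, the saved classes list is what maps a prediction index back to a label. A minimal sketch of just that last step, where the class list and the probability vector are made up for illustration (the probabilities would come from the loaded model's output):

```python
# Hypothetical saved class list from training
data_classes = ["cat", "dog", "horse"]

# Stand-in for one image's predicted probabilities from the model
probs = [0.1, 0.7, 0.2]

# Index of the highest probability, mapped back to a class name
pred_idx = max(range(len(probs)), key=probs.__getitem__)
print(data_classes[pred_idx])  # prints "dog"
```

This is why keeping the classes saved matters even after the training data is gone: the weights alone only give you indices.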

Thanks @ssaxena7.
Unfortunately I’ve deleted the train data; it was a huge set (500GB). I could point to some dummy train data as a workaround, but I would hope there is a way in fastai to do this.

I’ve tried pointing to ‘dummy’ train and valid folders with just one image in each and used ImageDataBunch.from_folder,
but it didn’t work because the number of classes didn’t match between the saved model and the dummy train and valid folders.

So I would appreciate any help on this!

Did you try something like:

    empty_data = ImageDataBunch.single_from_classes(path, data.classes, 
    learn = create_cnn(empty_data, models.resnet18)
    learn = learn.load('one_epoch')

Yes, I have tried this, as per the top post :slight_smile:

See the top post for the error :point_up_2:

Hi @miwojc. Did you find an answer for adding a test set to an empty DataBunch? I am interested in this as well.

Thank you in advance

Is there a workable solution for this issue?

I didn’t try it yet, but you could try the data block API for that:
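Whichever API ends up building the DataBunch, the first step is just gathering the test files, and no labels are needed for that. A library-agnostic sketch, where the folder layout and the extension list are assumptions:

```python
from pathlib import Path

def list_test_images(folder, exts=(".jpg", ".jpeg", ".png")):
    """Recursively collect image files from a test folder;
    inference needs no labels, only the file paths."""
    return sorted(p for p in Path(folder).rglob("*")
                  if p.suffix.lower() in exts)
```

Each path returned could then be fed to the per-image prediction call in a loop, or handed to whatever test-set constructor the library provides.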