How can I use an ImageDataLoaders to test the accuracy of a trained learner?

I have my model trained and I want to use a labeled test set to measure its real accuracy.

I’m trying to run code like this:

data_test = ImageDataLoaders.from_df(df=df_styles_test, path=pictures_dir, valid_pct=0,
                              label_col='style', fn_col='new_filename',
                              y_block=CategoryBlock(),
                              item_tfms=Resize(299),
                              batch_tfms=Normalize.from_stats(*imagenet_stats),
                              bs=64
                              )

learner.get_preds(dl=data_test)

And I’m getting this error:

---------------------------------------------------------------------------
IndexError                                Traceback (most recent call last)
<ipython-input-48-c83793a06c54> in <cell line: 1>()
----> 1 learner.get_preds(dl=data_test)

3 frames
/usr/local/lib/python3.10/dist-packages/fastai/torch_core.py in nested_reorder(t, idxs)
    776 def nested_reorder(t, idxs):
    777     "Reorder all tensors in `t` using `idxs`"
--> 778     if isinstance(t, (Tensor,L)): return t[idxs]
    779     elif is_listy(t): return type(t)(nested_reorder(t_, idxs) for t_ in t)
    780     if t is None: return t

IndexError: index 19997 is out of bounds for dimension 0 with size 19968

This is a link to the Colab notebook that I’m running: Google Colab

Thank you in advance 🙂


Update

I solved it this way:

from fastai.vision.all import load_image   # already available if fastai.vision.all was imported

def test_prob(test_df, test_dir, learner_to_test):
    aciertos = 0            # number of correct predictions
    total = len(test_df)

    for i in range(total):
        archivo = test_df.iloc[i]['new_filename']   # image file name
        estilo = test_df.iloc[i]['style']           # true label ('style' column)
        img = load_image(test_dir/archivo)
        pred = learner_to_test.predict(img)         # one prediction per image

        if estilo == pred[0]:
            aciertos += 1

    porcentaje = aciertos/total*100
    print("Accuracy on the test set: ", porcentaje, "%", sep='')

It’s not a good way to solve it, because predicting one image at a time takes a long time to get the test accuracy, but it worked for me.
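For reference, a likely faster alternative (a minimal sketch, not run against this notebook) is to skip the separate ImageDataLoaders and instead build a labeled test DataLoader from the trained learner’s own dls with test_dl(..., with_labels=True), then let get_preds score the whole set in batches. df_styles_test and learner are the names used above; the rest is illustrative:

from fastai.vision.all import *

# Labeled DataLoader for the test rows, reusing the learner's own transforms
test_dl = learner.dls.test_dl(df_styles_test, with_labels=True)

# Batched predictions and targets for the whole test set
preds, targs = learner.get_preds(dl=test_dl)

acc = (preds.argmax(dim=1) == targs).float().mean().item()
print(f"Test accuracy: {acc*100:.2f}%")

Since the DataLoader carries the labels, get_preds returns (predictions, targets) and the accuracy comes out of one tensor comparison instead of one predict() call per image; learner.validate(dl=test_dl) would similarly report the loss and metrics on that DataLoader.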