ImageClassifierCleaner does not load the widgets

I am continuing from my first lesson of bird_or_forest and following along the second lesson on fastbook. I am trying to run the following cell:

cleaner = ImageClassifierCleaner(learn)
cleaner

where learn is the Learner returned by vision_learner.
The output is stuck with the message

Loading widget…

There are no dropdown boxes to allow me to edit or delete.
Also, the images are printed in a column rather than horizontally.
I am on Kaggle, using Firefox on a Mac, if that matters.
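In case it helps with diagnosis, a quick stdlib-only check of the installed versions of the packages the widget stack depends on (the package names below are the usual PyPI names; adjust as needed for your environment):

```python
from importlib.metadata import version, PackageNotFoundError

# Print installed versions of the widget-related packages, if present
for pkg in ("fastai", "ipywidgets", "jupyterlab-widgets", "widgetsnbextension"):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, "not installed")
```

Mismatched ipywidgets / jupyterlab-widgets versions are a common cause of a widget that never finishes loading.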

3 Likes

Same issue here, any suggestions? Did you get it resolved?

Nope, it is still an open issue. There are some reports about this; my understanding so far is that it has to do with ipywidgets. As suggested here, I tried upgrading ipywidgets and got an incompatibility error.
This issue is discussed here too, but with no solution.
Please update this thread if you get it resolved.

It seems to be working on Google Colab.

1 Like

Mine outputs like this:

KeyError                                  Traceback (most recent call last)
File ~/miniconda3/envs/fastbook/lib/python3.10/site-packages/PIL/JpegImagePlugin.py:639, in _save(im, fp, filename)
    638 try:
--> 639     rawmode = RAWMODE[im.mode]
    640 except KeyError as e:

Anything useful?
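For what it's worth, that KeyError usually means PIL was asked to save an image whose mode JPEG cannot represent (e.g. RGBA or palette mode P). A minimal reproduction and the usual fix, converting to RGB before saving (this is a general Pillow sketch, not specific to fastai):

```python
import io
from PIL import Image

buf = io.BytesIO()
im = Image.new("RGBA", (8, 8))  # a mode with an alpha channel

try:
    im.save(buf, format="JPEG")  # hits RAWMODE[im.mode] internally -> fails
except OSError as e:
    print("save failed:", e)

# Converting to RGB first succeeds
im.convert("RGB").save(buf, format="JPEG")
```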

2 Likes

Chiming in here. I can’t get ImageClassifierCleaner to work on kaggle or on Jupyter on my local machine. I’ve searched the forums and tried all the suggestions I’ve found.

Struggled with this myself for several days trying to get ImageClassifierCleaner running in Kaggle.

A few findings that may (or may not) be useful for others:

  • The ImageClassifierCleaner class depends on ipywidgets. It was unclear whether ipywidgets works on Kaggle; it does. After trying various dependency combinations, I was able to get the fastai vision widgets working with the default, preinstalled versions of fastai and ipywidgets on Kaggle (as of 3/19/24):

    • fastai version: 2.7.14
    • ipywidgets version: 7.7.1

  • ImageClassifierCleaner does not seem to be able to handle large datasets. The best I could manage was 7 categories with 30 training examples in each (~210 images in total).

  • Loading 210 images into ImageClassifierCleaner was a struggle. I wasn’t able to edit the full data set at once - the widget would freeze when changing categories.

  • Re-running the cell with the cleaner code would allow me to continue editing to an extent. Eventually even re-running the cell would freeze before I could get through the full data set.

  • “Restart & Clear Cell Outputs” on the Kaggle session would allow me to restart clean and walk through the entire notebook again to resume cleaning. It took several iterations of restarting the notebook to fully clean all seven image categories.

  • Edits (delete/change) do not seem to be persisted on the cleaner object when changing categories. I needed to run cleaner.delete / cleaner.change before changing categories.

  • Limiting the maximum number of images displayed helped, e.g. cleaner = ImageClassifierCleaner(learn, max_n=5). I was able to progressively expand cleaning by increasing max_n and iterating. Ultimately I was forced to load the full dataset, as I am not aware of any way to paginate the data for cleaning.
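To make the "apply edits before switching categories" step above less error-prone, the two fastbook cells can be wrapped in a small helper. A sketch (apply_cleaner_edits is a hypothetical name; it only assumes cleaner.fns is a list of Paths and the dataset root has one subfolder per category):

```python
import shutil
from pathlib import Path

def apply_cleaner_edits(fns, deletions, changes, root):
    """Apply ImageClassifierCleaner-style edits to files on disk.

    fns:       list of Path objects (what cleaner.fns holds)
    deletions: indices to remove (what cleaner.delete() yields)
    changes:   (index, new_category) pairs (what cleaner.change() yields)
    root:      dataset root containing one subfolder per category
    """
    for idx in deletions:
        fns[idx].unlink()  # delete the mislabeled/broken image
    for idx, cat in changes:
        shutil.move(str(fns[idx]), Path(root) / cat)  # re-file under new label
```

With a real cleaner this would be called as apply_cleaner_edits(cleaner.fns, cleaner.delete(), cleaner.change(), path) before moving the category dropdown, so edits are never silently dropped.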

1 Like

Could not get this working at all in Kaggle. However, I clicked File → Open in Colab and was able to run ImageClassifierCleaner there.

1 Like

Did you find any solutions for larger datasets?

Facing the same issue on Kaggle and I am surprised there isn’t a fix for this yet, even in July 2024.

For me, I just see the loading indicator and then nothing at all:

cleaner = ImageClassifierCleaner(learn)
cleaner

Here’s a screenshot if anyone’s interested:

My code looks like:

cleaner = ImageClassifierCleaner(learn, max_n=10)
cleaner

This helped me finally see the images in Kaggle. I edit the images for a particular class, then run the next lines of code to apply the deletions or moves:

import shutil

for idx in cleaner.delete(): cleaner.fns[idx].unlink()
for idx, cat in cleaner.change(): shutil.move(str(cleaner.fns[idx]), path/cat)

Unfortunately, once you do that you cannot re-run the ImageClassifierCleaner cell to view your updates, since it will throw a file-not-found error. So you may have to recreate the learner and retrain if you want to keep editing that particular class.
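One way to soften the file-not-found problem (a sketch, not part of fastai): before handing paths to anything downstream, filter out entries whose files were already deleted from disk. With plain pathlib this is just:

```python
from pathlib import Path

def keep_existing(paths):
    """Return only the paths that still exist on disk,
    e.g. after some cleaner.fns entries were unlinked."""
    return [p for p in map(Path, paths) if p.exists()]
```

This does not fix the cached state inside the Learner; it only avoids passing stale paths along, so retraining as described above is still the reliable route.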

Thanks. I did the same thing and it worked fine on Colab.

Two-plus years later and here I am, with the same issue. I tried uninstalling and reinstalling ipywidgets and jupyterlab_widgets, and restarted and re-ran my kernel too many times. I tried installing new versions of all of it. I finally got to a place where I got a new error and pasted it into ChatGPT. I got this as a response:

You hit a dependency clash:

  • fastbook 0.0.29 pins ipywidgets<8 (it’s from 2022 and hasn’t been updated).
  • But ipywidgets 7.x is not compatible with Python 3.12 (it tries from collections import Mapping, which was removed in 3.10+), hence your new ImportError.
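For context, the ImportError comes from the old import location: the abstract base classes moved to collections.abc (the old alias was deprecated in Python 3.3 and removed in 3.10). A minimal illustration:

```python
# Works on every modern Python: the ABCs live in collections.abc
from collections.abc import Mapping

# The equivalent of what old ipywidgets 7.x does, which fails on 3.10+:
try:
    from collections import Mapping  # removed in Python 3.10
    print("old import still works (Python < 3.10)")
except ImportError:
    print("old import removed (Python >= 3.10)")
```

So on Python 3.12 the only real options are downgrading Python or moving to a tool that does not pin ipywidgets<8.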

I’m working to find a solution using a different tool in the PyTorch ecosystem.

1 Like

Since this library is somewhat out of date and the book version is from 2020, I’ve been using ChatGPT to help with errors and troubleshooting. (Writing this in September 2025).

That’s where I learned of fiftyone. This is the code GPT spit out to get around the image cleaner issues.

import torch
import numpy as np
import fiftyone as fo

# 1) Get predictions + losses from your learner
probs, targets, losses = learn.get_preds(with_loss=True)
preds = probs.argmax(dim=1)

# 2) Sort by highest loss
topk = 50  # adjust how many you want to see
idxs = torch.topk(losses, topk).indices

# 3) Map indices back to file paths
fns = np.array(learn.dls.valid_ds.items)[idxs.cpu().numpy()]
true_lbls = targets[idxs].cpu().numpy()
pred_lbls = preds[idxs].cpu().numpy()

# 4) Build a FiftyOne dataset from just these samples
samples = []
for fn, tl, pl in zip(fns, true_lbls, pred_lbls):
    samples.append(
        fo.Sample(
            filepath=str(fn),
            ground_truth=fo.Classification(label=learn.dls.vocab[tl]),
            prediction=fo.Classification(label=learn.dls.vocab[pl])
        )
    )

dataset = fo.Dataset("top_losses", overwrite=True)
dataset.add_samples(samples)

session = fo.launch_app(dataset)
session

And then after cleaning the images in the UI, you can run this code to remove the samples marked for deletion:

from pathlib import Path
import shutil

# Either use fastai's dataset root:
# root = Path(learn.dls.path)
# Or hardcode it:
root = Path(<root of path>)

# Optional safety: move deletions into a trash folder instead of permanently unlinking
trash_dir = root / "_trash"
trash_dir.mkdir(parents=True, exist_ok=True)

delete_count = 0

for sample in dataset:   # 'dataset' is your FiftyOne dataset
    if "delete" in (sample.tags or []):
        src = Path(sample.filepath)
        if src.exists():
            # move to trash instead of permanent delete
            dst = trash_dir / src.name
            i = 1
            while dst.exists():
                dst = trash_dir / f"{src.stem}_{i}{src.suffix}"
                i += 1
            shutil.move(str(src), dst)
            delete_count += 1

print(f"Moved {delete_count} images to {trash_dir}")

Now your “delete” images are no longer in the training folders. Reload your DataLoaders with:
dls = ImageDataLoaders.from_folder(root, valid_pct=0.2, seed=42, item_tfms=Resize(224))

Then, optionally, rebuild the dataset from disk to see your changes in the fiftyone UI:

import fiftyone as fo
clean_root = <dataset root>

dataset.delete()  # if you want to replace it
dataset = fo.Dataset.from_dir(
    dataset_dir=clean_root,
    dataset_type=fo.types.ImageClassificationDirectoryTree,
)
session.dataset = dataset

And then resume running learn.export() as normal in the next cell.