ImageCleaner hangs for large data sets

I’m trying to use ImageCleaner on my data set like so:

from fastai.vision import *
from fastai.widgets import *
ds, idxs = DatasetFormatter().from_toplosses(learn, ds_type=DatasetType.Valid)
ImageCleaner(ds, idxs, './data', batch_size=5)

But the kernel just hangs on the last command and never produces the widget. When I reduced my data set to around 100 images, the cleaner worked with no problem, but scaling back up to my full dataset (13 classes, ~20,000 images) brings the hang back. Any idea what is causing this?
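
For now, the only workaround I can think of is to cap how many of the top-loss indices actually get handed to ImageCleaner. Something like the sketch below, where N is just a guess at what the widget can handle; I haven't confirmed this avoids the hang on the full dataset:

from fastai.vision import *
from fastai.widgets import *

N = 500  # guess at how many images the widget can comfortably display

# Same call as before: rank the validation images by loss
ds, idxs = DatasetFormatter().from_toplosses(learn, ds_type=DatasetType.Valid)

# Only pass the N highest-loss indices to the widget instead of all of them
ImageCleaner(ds, idxs[:N], './data', batch_size=5)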

Having the same issue. My validation set is only about 800 images across 2 classes. It seems like some sort of out-of-memory error when instantiating the ImageCleaner object? Running code identical to the last line above, the kernel hangs for a while and then Jupyter Notebook crashes. The first line does eventually execute, but it takes a good while.
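
To test the memory theory I'm going to log the process's resident memory before and after each call. Rough diagnostic sketch below (uses psutil; the numbers will just tell me whether memory blows up during from_toplosses or during ImageCleaner):

import psutil
from fastai.vision import *
from fastai.widgets import *

proc = psutil.Process()
def rss_mb(): return proc.memory_info().rss / 1024**2  # resident memory in MB

print(f'before from_toplosses: {rss_mb():.0f} MB')
ds, idxs = DatasetFormatter().from_toplosses(learn, ds_type=DatasetType.Valid)
print(f'after from_toplosses:  {rss_mb():.0f} MB')
ImageCleaner(ds, idxs, './data', batch_size=5)
print(f'after ImageCleaner:    {rss_mb():.0f} MB')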