I’ve got a question that is part philosophical, part practical.
Imagine you have a CNN whose job is to perform a simple binary classification:
A) Airplane
B) Baby
It is well trained on a multitude of training data, with many varied examples of each class, and it performs very well on test datasets where the picture contains an airplane or a baby.
Now, I wish to add a third class:
C) Neither Airplane nor Baby
This third class should be a catchall for all inputs that are not confidently class A or B. At inference time, every picture that contains neither an airplane nor a baby should be labeled class C. There is no training data for class C; rather, its likelihood is based solely on the unlikelihood that an input belongs to the other classes.
How would you make this NULL class? Is this easy to do? Is it possible?
My initial thoughts:
It feels like it has the flavor of unsupervised learning, but all of the training data is labeled…
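Concretely, the most naive thing I can imagine is keeping the existing two-class model and thresholding its softmax confidence: if neither class is confident enough, fall back to C. A minimal sketch (the threshold value here is an arbitrary knob I made up, not something principled):

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def classify_with_null(logits, threshold=0.9):
    """Map 2-class logits to labels 'A', 'B', or the null class 'C'.

    `threshold` is a hypothetical tuning parameter: if neither class
    reaches this confidence, the input is rejected into class C.
    """
    probs = softmax(np.asarray(logits, dtype=float))
    labels = []
    for p in probs:
        if p.max() < threshold:
            labels.append("C")              # not confidently A or B
        else:
            labels.append("AB"[int(p.argmax())])
    return labels

# Confident airplane, confident baby, ambiguous input
print(classify_with_null([[5.0, -1.0], [-2.0, 4.0], [0.2, 0.1]]))
# → ['A', 'B', 'C']
```

My worry with this sketch is that CNNs are notorious for assigning high softmax confidence to inputs far from the training distribution, so a plain threshold may not actually catch most "neither" pictures.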
It feels like something humans are very good at: specifically, having “confidence that an object doesn’t belong to any known class”.
Is there a popular term for this type of catchall bucket, so that I can google it?