Hi samir.s.omer, hope you're having a marvelous day!
Below are some links that discuss your issue.
It doesn't seem to be an easy thing to resolve.
Let's say I have trained an image recogniser to classify the content of an image as either a dog, a cat, a rat or an elephant. I put my model into production and it is for some reason fed with images of humans. The probabilities from the model suggest there is a 70% probability that an image of a human is an image of an elephant. I would rather the model say the input is not like any of the classes it was trained on.
What is the go-to way of handling this problem? I don't have resou…
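One idea that comes up a lot in these threads is simply refusing to trust a prediction when the top softmax probability is low. Here's a rough sketch of that (my own illustration, not from the threads above; the class names and the 0.9 threshold are just placeholders you'd tune on your own data):

```python
import torch
import torch.nn.functional as F

# Hypothetical class list matching the example in the quote above.
CLASSES = ["dog", "cat", "rat", "elephant"]

def predict_or_reject(logits: torch.Tensor, threshold: float = 0.9) -> str:
    """Return the top class, or 'unknown' if no class is confident enough."""
    probs = F.softmax(logits, dim=-1)
    conf, idx = probs.max(dim=-1)
    if conf.item() < threshold:
        return "unknown"  # not similar enough to any trained class
    return CLASSES[idx.item()]

# Example with made-up logits (e.g. the raw output of model(image)):
logits = torch.tensor([0.3, 0.2, 0.1, 0.4])
print(predict_or_reject(logits))  # likely "unknown": no class dominates
```

The catch, as the threads point out, is that softmax scores are often overconfident on out-of-distribution images, so the threshold alone isn't a complete fix.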
How do I avoid predicting confidently on an image that is not from the classes I trained the model on?
In the Pets classification from Lesson 1, when I try to predict using the trained model on a random object, say a "Building", it still confidently predicts one of the dog breeds from the classes of the pet dataset http://www.robots.ox.ac.uk/~vgg/data/pets/
Do I have to create a comprehensive class with images other than these dog breeds, call it "other", and retrain the model?
A…
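An alternative people suggest instead of building a huge "other" class is to train with a multi-label head (one sigmoid per class with binary cross-entropy, rather than a softmax that is forced to pick something). Then every class can score low at once and you can treat that as "none of the above". A minimal sketch of the inference side (again my own illustration, with placeholder names and a 0.5 threshold):

```python
import torch

CLASSES = ["dog", "cat", "rat", "elephant"]

def classify_multilabel(logits: torch.Tensor, threshold: float = 0.5):
    """With independent sigmoids, an out-of-distribution image can score
    below the threshold on every class instead of being forced into one."""
    probs = torch.sigmoid(logits)
    picked = [c for c, p in zip(CLASSES, probs.tolist()) if p > threshold]
    return picked or ["none of the trained classes"]

# Example with made-up logits for an out-of-distribution image (a human, say):
logits = torch.tensor([-2.0, -1.5, -3.0, -1.0])
print(classify_multilabel(logits))  # all sigmoids < 0.5 -> rejected
```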
I also saw this model, which I haven't played with yet, but maybe combining your model with something like this may help. (I thought this model was fantastic! You have to watch the video.)
muellerzr (Zachary Mueller) wrote in Dec '19:
It was implemented in fastai v1 here: https://github.com/fg91/Neural-Image-Caption-Generation-Tutorial
Hope this helps.
Cheers mrfabulous1