How does the concept of “overfitting” relate to too many/too few epochs or a too-high/too-low learning rate? It seems like the signals for “overfitting” match those for “too high learning rate”.
Is there a way to generalize training so that the over-fitting issue is reduced? I.e., a trained model that is generic enough to handle new data without much issue?
Can we have a definition of the error rate being discussed and how it is calculated? I assume it’s the cross-validation error, defined as predicted_right / all_predictions.
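A minimal sketch of how an error-rate metric is commonly computed (note it is usually wrong / all_predictions, i.e. 1 - accuracy, rather than right / all); the tensor shapes below are assumptions for illustration:

import torch

def error_rate(preds, targets):
    # preds: (n_samples, n_classes) scores or probabilities; targets: (n_samples,) class labels
    pred_classes = preds.argmax(dim=1)               # highest-scoring class per sample
    accuracy = (pred_classes == targets).float().mean()
    return 1 - accuracy                              # error rate = misclassified / all predictions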
Jeremy mentioned that training error should always be lower than validation error. What happens when using dropout, which is applied only during training, increasing the training loss?
where is this slide jeremy is showing now?
Even with dropout, by the time the model is fitting well (not underfitting), your training error will likely be below your validation error. At least that’s been my experience
@rachel People in the previous thread were asking why ConvLearner was renamed to create_cnn.
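For reference, a minimal sketch of the renamed call as it looks in fastai v1 (the data object is assumed to be an ImageDataBunch built earlier in the notebook):

from fastai.vision import *
from fastai.metrics import error_rate

# previously: learn = ConvLearner(data, models.resnet34, metrics=error_rate)
learn = create_cnn(data, models.resnet34, metrics=error_rate)
learn.fit_one_cycle(4)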
I do not know if it is true in fastai, but in Keras, if you have a dropout layer, your training loss can be higher than your validation loss, since dropout is not applied during validation.
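A minimal sketch of that behaviour in plain PyTorch (the framework choice and layer sizes here are assumptions; Keras works the same way): dropout is only active in training mode, so the training-time forward pass runs through a noisier network than the validation-time one.

import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(10, 50), nn.ReLU(), nn.Dropout(p=0.5), nn.Linear(50, 2))
x = torch.randn(4, 10)

net.train()            # dropout active: roughly half the hidden activations are zeroed (rest rescaled)
train_out = net(x)

net.eval()             # dropout disabled: the full network is used, so the loss tends to be lower
valid_out = net(x)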
Looking at the Teddy/Grizzly/Black Bear classification problem, what can I do if I have pictures with another animal, e.g. a Zebra, but no Zebra training data? Is there an option to say “other” in general in classification?
Rachel I can confirm this one had a lot of likes.
That pixel-to-numbers graphic is from this article: https://medium.com/@ageitgey/machine-learning-is-fun-part-3-deep-learning-and-convolutional-neural-networks-f40359318721
It’s part of a series that looks really good. The author has more great looking deep learning articles here: https://medium.com/@ageitgey
Hey, I have upgraded using the commands in https://forums.fast.ai/t/faq-resources-and-official-course-updates/27934 and it is showing:
Name: fastai
Version: 1.0.18
But I am still getting a “not found” error for download_images. I am on salamander.ai.
Can anyone help with this error: NameError: name ‘download_images’ is not defined?
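A minimal sketch of the usual fix, assuming download_images is exported from fastai.vision (it may simply be missing from older releases such as the 1.0.18 shown above), so upgrading and re-importing usually clears the NameError; the CSV and destination paths below are hypothetical:

# upgrade first (e.g. pip install -U fastai), then restart the kernel

from fastai.vision import *   # download_images comes in with the vision imports

download_images('urls_teddy.csv', 'data/bears/teddy', max_pics=200)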
Is it always fine to just randomly split the data into a validation set and a training set?
I am getting the error even after a successful upgrade.
Check your fastai version.
From my experience, no, since dropout is not applied during validation, at least in Keras.
Not in my experience on tabular data
@Rachel There are a few likes regarding the mysterious 3s in the end points of the range of values passed into the learning rate finder:
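For context, the call being asked about presumably looks something like the lines below (the exact values and epoch count are assumptions, and learn is a Learner from earlier in the notebook); the question is why the end points start with a 3, e.g. 3e-5 and 3e-4, rather than round powers of ten:

learn.unfreeze()
learn.fit_one_cycle(2, max_lr=slice(3e-5, 3e-4))   # the "3s" are the leading digits of the range end points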
import fastai
print(fastai.__version__)
1.0.15