Are all batch grabs within an epoch done without replacement? Meaning, in all cases, will each batch be completely different from the last?
In one epoch yes, no batches are repeated.
In an epoch, generally, each minibatch will have different data points.
Thanks for your reply. Is there a known reason why?
Also… I get:
Can't call numpy() on Variable that requires grad. Use var.detach().numpy() instead.
Thanks!
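That error means the tensor is still attached to the autograd graph, so PyTorch refuses to hand it to NumPy directly. A minimal sketch of the fix (toy tensors, not your actual model):

```python
import torch

# A tensor that is part of the autograd graph
w = torch.randn(3, requires_grad=True)
y = w * 2

# y.numpy() would raise the error above.
# detach() returns a view cut off from the graph, which NumPy can accept.
arr = y.detach().numpy()
print(arr.shape)  # (3,)
```

If the tensor also lives on the GPU, you'd chain `.detach().cpu().numpy()`.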
Thank you for the patient answers everyone!
Was there any homework? I missed that part
Rachel’s blog on creating a good validation set: https://www.fast.ai/2017/11/13/validation-sets/
You’re updating your network based on gradients you compute on your minibatches. Those gradients are supposed to be a good approximation of the gradients on the whole dataset. In practice, though, each batch isn’t perfect: it might have too many elements of one class, or some mislabelled data. If you go through your batches in the same order every time, you will always see the same kind of imperfection.
Shuffling every time you go over your data again ensures something smoother. All your batches will still be imperfect (as a representation of the whole dataset), but the imperfection will be different each time, so it gets balanced out.
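In PyTorch this is what `shuffle=True` on a `DataLoader` gives you: a fresh random partition of the dataset at the start of every epoch, still with every sample appearing exactly once. A small sketch with a toy dataset:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy dataset of 10 samples (values 0..9 so we can track them)
ds = TensorDataset(torch.arange(10).float())

# shuffle=True reshuffles the indices at the start of every epoch,
# so each epoch's batches are a different random split of the data
dl = DataLoader(ds, batch_size=4, shuffle=True)

for epoch in range(2):
    seen = [x.item() for (xb,) in dl for x in xb]
    # Every sample appears exactly once per epoch (no replacement)
    assert sorted(seen) == list(range(10))
```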
Is there some kind of update command for fastai? I’d rather not do a full reinstall.
Thank you sir!
same here
No explicit homework. Maybe try creating a production version of the classifier.
Or experiment with the math points mentioned in the lesson.
Thanks, that’s a great explanation!
What is a production version? You mean make your own image classifier from scratch?
I was surprised when Jeremy said you don’t need to balance an imbalanced dataset. Is that only the case with deep learning, and not with classical machine learning?
For people trying out making “web apps” as Jeremy encouraged: besides Starlette as an asyncio framework, also check out this “new” modern framework by the person who also created the requests library and pipenv. It is actually “powered by Starlette”:
It wouldn’t make sense to sample with replacement within an epoch: you’d train on some images multiple times while never seeing others, when the point of an epoch is to get through every image exactly once.
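Conceptually, sampling without replacement is just one permutation per epoch, sliced into batches. A hypothetical minimal sketch of what a shuffling loader does under the hood:

```python
import numpy as np

n, bs = 10, 3                                 # toy dataset size and batch size
rng = np.random.default_rng(0)

perm = rng.permutation(n)                     # a fresh random order each epoch
batches = [perm[i:i + bs] for i in range(0, n, bs)]

# Every index appears exactly once across the epoch's batches
flat = np.concatenate(batches)
assert sorted(flat.tolist()) == list(range(n))
```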