Another great session with @willsa14. In last Tuesday's study group we got into the weeds of the code from lecture 4, as presented in the 04_mnist_basics.ipynb Jupyter notebook.
This was all about dissecting each component of the code to understand what it does. We are also beginning to appreciate the algorithm behind ML while discussing some key fundamental concepts covered during the last 15 minutes of lecture 3.
Our goal for the next study group will be to truly understand the difference between the accuracy (also known as performance) and the learning rate.
Another great study group session with @willsa14. We now have both feet in lecture 4 and are continuing to go through key concepts such as:
1) What role does the accuracy play vs. the learning rate?
2) Does the learning rate ever change?
3) Would you consider one batch, if large enough, a little ML whose results are leveraged by the results of subsequent batches? If so, how does that work?
...and many more wonderings we are musing about in our study group!
One key concept we were able to fully grasp, thanks to @Kerner, is the mechanics of how the mean is actually calculated on a rank-3 tensor.
You can find that answer at this link.
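In case it helps anyone else catching up, here is a minimal sketch of that idea in PyTorch. The tensor name and sizes are just made up for illustration, not taken from the notebook:

```python
import torch

# A rank-3 tensor: e.g. a stack of 4 grayscale images, each 28x28
# (dim 0 indexes the images, dims 1 and 2 are pixel rows/columns).
stacked = torch.rand(4, 28, 28)

# mean() with no arguments averages over every element of all three dims,
# returning a single scalar.
print(stacked.mean().shape)          # torch.Size([])

# Passing a dim collapses only that axis: averaging over dim 0 gives a
# pixel-wise "average image" across the stack.
print(stacked.mean(0).shape)         # torch.Size([28, 28])

# Averaging over the last two dims gives one mean intensity per image.
print(stacked.mean((-1, -2)).shape)  # torch.Size([4])
```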
Next week we will continue to go through the concepts covered in lecture 4.
A great study group on the first half of lecture 4.
We clarified the importance of zeroing out the gradients accumulated on the parameters created with requires_grad, and the rationale behind it. We also went through the answers to the lecture 4 questions from this thread.
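For anyone who wants to see it concretely, here is a minimal sketch of why that zeroing step matters in a training loop. This is my own toy example (made-up tensors and learning rate), not the notebook's code:

```python
import torch

# Toy example: fit a single weight w so that w*x approximates y.
x = torch.tensor([1.0, 2.0, 3.0])
y = torch.tensor([2.0, 4.0, 6.0])
w = torch.tensor(0.0, requires_grad=True)  # track gradients for w
lr = 0.05

for step in range(3):
    loss = ((w * x - y) ** 2).mean()   # mean squared error
    loss.backward()                    # accumulates the gradient into w.grad
    with torch.no_grad():
        w -= lr * w.grad               # one SGD update
    # Without this, the next backward() would ADD to the old gradient
    # instead of replacing it, so later updates would be wrong.
    w.grad.zero_()
    print(step, loss.item(), w.item())
```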
Hi, this is Mukesh from India, nice to join the community. About the next study group meeting, which is listed as April 12: I think it is April 13, 5.30 am to 6.30 am, correct me if I am wrong. Looking forward to learning… Thanks