I started working through the course. Having experienced the fastai way of learning, I know that both rephrasing what you learn and spaced repetition are important. So I decided to annex this piece of the forums, and for each lecture that I watch I will come here and write a short summary.
I also know that working on things on your computer can get quite lonely, and sharing them with others is infinitely more fun.
Hope you do not mind me using this tiny corner of the forums like this. I will be editing this post with my musings, but feel free to stop by and post whatever you'd like.
Lecture 1
- Resources I didn’t know about:
- It can be useful (both for learning and in general) to look at linear algebra through the perspective of what it can do vs the technicalities of how it does its magic
- Two examples of applications of matrix-vector multiplication (sketched in code after this list):
  - calculating state transitions based on transition probabilities
  - estimating the total cost of an order by multiplying the quantities of demanded goods by a price list
- Challenges of performing computations on a computer:
  - math is continuous & infinite, but computers are discrete & finite
  - especially with iterative algorithms, the deltas between the ‘actual’ values and the values represented inside the computer can accumulate quickly (see the floating point sketch after this list)
  - there is always a trade-off between speed, accuracy, memory usage, etc., and it is important to understand how an algorithm performs along each of those dimensions
  - for practical applications, understanding error bounds can be very important
- numerical imprecision stemming from discreteness as regularization for deep learning - a very interesting way of looking at things
- locality of calculations - memory access times increase with the size of the memory, so ideally we want to perform as much calculation as possible on the chunk of data we have close to the CPU; this poses interesting challenges for structuring calculations (see the locality sketch after this list)
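To make the two matrix-vector multiplication examples concrete, here is a minimal NumPy sketch (the matrices and numbers are made up for illustration):

```python
import numpy as np

# State transitions: transition[j, i] is the probability of moving
# from state i to state j, so each column sums to 1. One step of the
# process is a single matrix-vector product.
transition = np.array([[0.9, 0.2],
                       [0.1, 0.8]])
state = np.array([1.0, 0.0])        # start fully in state 0
print(transition @ state)           # [0.9 0.1]

# Total cost: each row of demands holds the quantities one shopper
# wants; multiplying by the price list gives the cost per shopper.
prices = np.array([2.50, 1.20, 4.00])   # price per unit of each good
demands = np.array([[4, 10, 2],
                    [1,  0, 5]])
print(demands @ prices)                 # [30.  22.5]
```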
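The accumulation of deltas is easy to see with floating point: 0.1 has no exact binary representation, so each addition is slightly off, and the errors compound over iterations. Python's math.fsum also illustrates the speed vs accuracy trade-off, compensating for the rounding at the cost of extra work:

```python
import math

# Naively adding 0.1 a million times: the tiny per-step representation
# error accumulates, so the result is close to but not exactly 100000.
total = 0.0
for _ in range(1_000_000):
    total += 0.1
print(total)                  # something like 100000.00000133288
print(abs(total - 100_000))   # the accumulated error

# fsum tracks the rounding error along the way: slower, but accurate.
print(math.fsum(0.1 for _ in range(1_000_000)))  # 100000.0
```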
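And a rough way to observe the locality point on your own machine (a sketch; exact timings depend on hardware and NumPy version): copying a transposed array moves exactly as much data as copying the original, but the reads jump across rows instead of streaming through memory:

```python
import time
import numpy as np

a = np.random.rand(5_000, 5_000)   # ~200 MB, C (row-major) order

def timed(f):
    start = time.perf_counter()
    f()
    return time.perf_counter() - start

contiguous = timed(lambda: a.copy())   # sequential reads and writes
strided = timed(lambda: a.T.copy())    # .copy() defaults to C order, so
                                       # reads skip 5,000 elements at a time
print(f"contiguous: {contiguous:.3f}s, strided: {strided:.3f}s")
```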