I kept going back to find where things were discussed, so I created this timeline starting from the point where I realized I needed one. I have the videos downloaded, so these are plain timings rather than YouTube links.
Video 2
0:39:50 explanation of decomposition diagram in terms of document-words
0:40:35 NMF start
0:43:00 Applications of NMF
0:46:08 NMF in sklearn
0:48:00 TF-IDF
0:49:45 NMF summary
0:50:45 NMF from scratch in NumPy, using SGD (gradient descent notebook walkthrough; see the sketch after this list)
0:56:50 Stochastic gradient descent
0:57:55 SGD Excel spreadsheet
1:02:00 Applying SGD to NMF
1:12:10 PyTorch
1:23:00 PyTorch: Autograd
1:35:00 Truncated SVD
1:36:15 Shortcomings of classical algorithms for decomposition
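
For reference, here is a minimal sketch of the idea behind the 0:50:45 segment: NMF by plain gradient descent on the Frobenius reconstruction loss in NumPy. The sizes, learning rate, and the clipping used to keep the factors nonnegative are my assumptions for illustration, not necessarily what the notebook does.

```python
import numpy as np

# Sketch: factor a nonnegative matrix V ~ W @ H by gradient descent.
# All shapes and hyperparameters below are illustrative assumptions.
rng = np.random.default_rng(0)
m, n, k = 100, 80, 5
V = np.abs(rng.normal(size=(m, n)))   # stand-in for a document-term matrix

W = np.abs(rng.normal(size=(m, k)))
H = np.abs(rng.normal(size=(k, n)))
lr = 1e-4

for step in range(500):
    R = W @ H - V                      # residual of the reconstruction
    gW = R @ H.T                       # gradient of 0.5 * ||WH - V||_F^2 w.r.t. W
    gH = W.T @ R                       # gradient w.r.t. H
    W = np.clip(W - lr * gW, 0, None)  # projected step keeps W nonnegative
    H = np.clip(H - lr * gH, 0, None)

print(np.linalg.norm(W @ H - V))       # error should shrink over the loop
```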
Video 3
0:00:30 Review matrix vector product
0:02:15 Review matrix matrix product
0:05:03 Jeremy: linear combinations from a data science perspective
0:06:30 Matrix multiplication
0:07:15 Four considerations for algorithms
0:09:50 Considerations: parallelization
0:17:30 Return to NMF, SVD
0:18:45 Count matrix, TF-IDF in Excel
0:21:18 SVD
0:31:00 Block matrix
0:36:32 Perspective of SVD within Excel
0:37:00 NMF
0:48:30 Review of notebook
0:50:45 NumPy tweet
0:52:00 PyTorch revisit
0:55:15 PyTorch basics (see the PyTorch sketch after this list)
1:00:15 Comparing Approaches in notebook 2
1:05:00 Randomized SVD
1:11:18 3Blue1Brown video - Video 3 (Chapter 2)
1:21:33 End of 3B1B video
1:22:40 Return to notebook
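
As a companion to the PyTorch segments, here is a hedged sketch of the same factorization written with autograd, so the gradients from the NumPy version above are computed automatically. The soft penalty on negative entries is my assumption for keeping the factors nonnegative; the course notebook may handle this differently.

```python
import torch

# Sketch: the same NMF-style loss, with gradients from autograd.
# Penalty weight, learning rate, and sizes are illustrative assumptions.
torch.manual_seed(0)
m, n, k = 100, 80, 5
V = torch.rand(m, n)

W = torch.rand(m, k, requires_grad=True)
H = torch.rand(k, n, requires_grad=True)
opt = torch.optim.SGD([W, H], lr=1e-2)

for step in range(500):
    opt.zero_grad()
    loss = (W @ H - V).pow(2).mean()
    # Soft nonnegativity: penalize only the negative parts of W and H.
    loss = loss + 0.1 * (W.clamp(max=0).pow(2).sum() + H.clamp(max=0).pow(2).sum())
    loss.backward()                    # autograd fills W.grad and H.grad
    opt.step()                         # SGD update on both factors
```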
Video 4
0:00:15 What SVD is
0:01:07 Randomized SVD
0:03:45 Complexity of SVD
0:09:40 Why randomized SVD is OK
0:12:40 Implementing Randomized SVD (see the sketch after this list)
0:15:41 Randomized SVD exercise loop
0:23:20 Johnson-Lindenstrauss lemma
0:24:42 Notebook 3 start
0:28:30 Picture of the whole video
0:33:00 SVD
0:49:10 Background removal on a rank-1 matrix
0:54:00 PCA
0:57:25 Applications of Robust PCA
1:02:00 L1 induces sparsity
1:08:10 Robust PCA as Optimization problem
1:10:20 Implementing an algorithm from a paper
1:15:00 Details of algorithm
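
To make the randomized SVD segments concrete, here is a minimal sketch of the standard range-finder construction (in the style of Halko et al.): project onto a random subspace, orthonormalize, and take the SVD of the much smaller projected matrix. The rank, oversampling amount, and the synthetic low-rank test matrix are illustrative assumptions, not the course's exact code.

```python
import numpy as np

def randomized_svd(A, k, n_oversamples=10):
    """Sketch of randomized SVD; k and n_oversamples are illustrative."""
    rng = np.random.default_rng(0)
    # Random projection: with high probability the range of A @ Omega
    # captures the top-k subspace (the Johnson-Lindenstrauss intuition).
    Omega = rng.normal(size=(A.shape[1], k + n_oversamples))
    Q, _ = np.linalg.qr(A @ Omega)     # orthonormal basis for the sample
    B = Q.T @ A                        # small (k + p) x n matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :k], s[:k], Vt[:k]  # lift U back to m dimensions

# Synthetic low-rank-plus-noise matrix to sanity-check the approximation.
rng = np.random.default_rng(1)
A = rng.normal(size=(500, 10)) @ rng.normal(size=(10, 300))
A += 0.01 * rng.normal(size=A.shape)

U, s, Vt = randomized_svd(A, k=10)
print(np.linalg.norm(A - U @ (s[:, None] * Vt)) / np.linalg.norm(A))  # small
```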