Computational Linear Algebra
About the Computational Linear Algebra category (6)
Gradient Formula used in NMF (1)
Word Embeddings (2)
Doubt on Floating Point Arithmetic in Lesson 1 (1)
The data set for lesson 2 cannot be downloaded (3)
Compressed Sensing (3)
[Video 4] What do the diagonals on the graphs of L1 and L2 norms represent? (9)
Lectures for computational linalg v2 (1)
Singular Value Decomposition - Blog (4)
CP Decomposition (1)
Numpy: replace specific rows and columns of one array with specific rows and columns of another array (3)
Another set of lecture notes (3)
Linear Algebra only for NLP? (2)
Python code for Video 2 lectures (1)
Any suggestions for understanding orthonormality, orthogonality? (5)
What is pool8? (1)
Summarizing the Essence of linear algebra video series (1)
Back Propagation Math Simplified (3)
Custom dot product (1)
SVD Sign Ambiguity (for PCA "determinism") (1)
Here's all the Matrix Calculus You Need For Deep Learning (2)
Timeline for Videos (2)
My take on what I got from the videos and how I am managing my study (5)
[Video 2] "MemoryError:" when doing the "Confirm that U, Vh are orthonormal" exercise (2)
How much of the randomized SVD implementation in Notebook 2 should be understood before Rachael moves on to Notebook 3? (1)
Video 4 - How does taking powers of A help us get a better approximation? (1)
Any suggestions for brushing up on Calculus? (14)
Are all matrix decompositions slow, or is it just SVD? (2)
Minimizing the Frobenius norm is basically trying to make all elements of the matrix as small as possible (2)
What does the plot of components represent in video 2/notebook 2? (2)