18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning
18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning (Spring 2018, MIT OCW). Instructor: Prof. Gilbert Strang. Linear algebra concepts are key to understanding and creating machine learning algorithms, especially as applied to deep learning and neural networks. This course reviews linear algebra with applications to probability and statistics and to optimization, and above all gives a full explanation of deep learning. (from ocw.mit.edu)
Lecture 15 - Matrices A(t) Depending on t, Derivative = dA/dt
This lecture is about changes in eigenvalues and changes in singular values. When matrices move, their inverses, their eigenvalues, and their singular values change. Professor Strang explores the resulting formulas.
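As a minimal illustration of the kind of formula this lecture concerns, the sketch below numerically checks the standard identity d/dt [A(t)^(-1)] = -A^(-1) (dA/dt) A^(-1) for a hypothetical smooth matrix family A(t) chosen only for this example; it is not taken from the lecture itself.

```python
# Numerical check (illustrative only) of d/dt [A(t)^{-1}] = -A^{-1} (dA/dt) A^{-1}.
import numpy as np

def A(t):
    # A hypothetical smooth, invertible matrix family A(t), chosen only for illustration.
    return np.array([[2.0 + t, 1.0],
                     [0.5,     3.0 + t**2]])

def dA_dt(t):
    # Exact entrywise derivative of the A(t) defined above.
    return np.array([[1.0, 0.0],
                     [0.0, 2.0 * t]])

t, h = 0.7, 1e-6
Ainv = np.linalg.inv(A(t))

# Finite-difference derivative of A(t)^{-1} ...
numeric = (np.linalg.inv(A(t + h)) - np.linalg.inv(A(t - h))) / (2 * h)
# ... compared against the closed-form  -A^{-1} (dA/dt) A^{-1}.
formula = -Ainv @ dA_dt(t) @ Ainv

print(np.allclose(numeric, formula, atol=1e-5))  # expected: True
```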