18.S096 Matrix Calculus for Machine Learning and Beyond
IAP 2023, MIT OCW. Instructors: Prof. Alan Edelman and Prof. Steven G. Johnson. This class covers a coherent approach to matrix calculus, showing techniques that allow you to think of a matrix holistically (not just as an array of scalars), generalize and compute derivatives of important matrix factorizations and many other complicated-looking operations, and understand how differentiation formulas must be reimagined in large-scale computing. We will discuss reverse/adjoint/backpropagation differentiation, custom vector-Jacobian products, and how modern automatic differentiation is more computer science than calculus (it is neither symbolic formulas nor finite differences). (from ocw.mit.edu)
Lecture 05 - Part 2: Forward Automatic Differentiation via Dual Numbers
One simple way to automatically differentiate (AD) computer programs in "forward mode" is to define a new kind of number that carries both value and derivative information: a dual number. A few lines of Julia let us build a working AD system!
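To make this concrete, here is a minimal sketch of such a system in Julia, written in the spirit of the lecture rather than reproducing its exact code; the type name `Dual`, its field names, and the `derivative` helper are assumptions for illustration.

```julia
# Dual numbers: carry a value and a derivative through a computation.
struct Dual <: Number
    val::Float64   # f(x)
    der::Float64   # f'(x)
end

import Base: +, -, *, /, sin, cos, convert, promote_rule

# Each arithmetic rule is just the corresponding differentiation rule:
+(a::Dual, b::Dual) = Dual(a.val + b.val, a.der + b.der)
-(a::Dual, b::Dual) = Dual(a.val - b.val, a.der - b.der)
*(a::Dual, b::Dual) = Dual(a.val * b.val, a.der * b.val + a.val * b.der)                # product rule
/(a::Dual, b::Dual) = Dual(a.val / b.val, (a.der * b.val - a.val * b.der) / b.val^2)    # quotient rule

# Chain rule for a couple of elementary functions:
sin(a::Dual) = Dual(sin(a.val), cos(a.val) * a.der)
cos(a::Dual) = Dual(cos(a.val), -sin(a.val) * a.der)

# Treat ordinary numbers as constants (derivative zero), so mixed
# expressions like x + 1 promote automatically:
convert(::Type{Dual}, x::Real) = Dual(x, 0.0)
promote_rule(::Type{Dual}, ::Type{<:Real}) = Dual

# Differentiate any function built from these operations by seeding der = 1:
derivative(f, x) = f(Dual(x, 1.0)).der

derivative(x -> x*x + 2x, 3.0)   # == 8.0, since d/dx (x^2 + 2x) = 2x + 2
derivative(sin, 0.0)             # == 1.0
```

Because `Dual <: Number` and the arithmetic operators are overloaded, existing Julia code written for generic numbers differentiates itself with no modification; the derivative is computed exactly (to floating-point roundoff), with neither symbolic manipulation nor finite differences.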