18.S096 Matrix Calculus for Machine Learning and Beyond
18.S096 Matrix Calculus for Machine Learning and Beyond (IAP 2023, MIT OCW). Instructors: Prof. Alan Edelman and Prof. Steven G. Johnson. This class covers a coherent approach to matrix calculus, showing techniques that allow you to think of a matrix holistically (not just as an array of scalars), generalize and compute derivatives of important matrix factorizations and many other complicated-looking operations, and understand how differentiation formulas must be reimagined in large-scale computing. We will discuss reverse/adjoint/backpropagation differentiation, custom vector-Jacobian products, and how modern automatic differentiation is more computer science than calculus (it is neither symbolic formulas nor finite differences). (from ocw.mit.edu)
Lecture 08 - Part 2: Automatic Differentiation on Computational Graphs
Instructors: Prof. Alan Edelman and Prof. Steven G. Johnson. Complicated computational processes can be expressed as "graphs" of computational steps that flow from inputs to outputs. Forward- and reverse-mode automatic differentiation (AD) traverse such graphs in opposite directions, giving very different algorithms: forward mode propagates derivatives along with values from inputs to outputs, while reverse mode records the graph and then sweeps sensitivities from the output back to the inputs.
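To make the contrast concrete, here is a minimal, illustrative sketch (not from the lecture) of both modes on the toy function f(x, y) = x*y + sin(x). The class names Dual and Var and the helper functions are hypothetical; real AD systems are far more elaborate.

```python
import math

# --- Forward mode: propagate (value, tangent) pairs from inputs to outputs.
class Dual:
    def __init__(self, val, dot):
        self.val, self.dot = val, dot  # primal value and its derivative

    def __add__(self, other):
        return Dual(self.val + other.val, self.dot + other.dot)

    def __mul__(self, other):
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)

def dsin(d):
    return Dual(math.sin(d.val), math.cos(d.val) * d.dot)

def f_forward(x, y, dx, dy):
    """One forward pass gives the directional derivative along (dx, dy)."""
    X, Y = Dual(x, dx), Dual(y, dy)
    out = X * Y + dsin(X)
    return out.val, out.dot

# --- Reverse mode: build the graph during the forward pass, then sweep
# backward, accumulating adjoints (output sensitivities) at every node.
class Var:
    def __init__(self, val, parents=()):
        self.val, self.parents, self.grad = val, parents, 0.0

    def __add__(self, other):
        return Var(self.val + other.val, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.val * other.val,
                   [(self, other.val), (other, self.val)])

def vsin(v):
    return Var(math.sin(v.val), [(v, math.cos(v.val))])

def backward(out):
    """Reverse sweep in topological order: d(out)/d(every node) in one pass."""
    order, seen = [], set()
    def visit(node):
        if id(node) in seen:
            return
        seen.add(id(node))
        for parent, _ in node.parents:
            visit(parent)
        order.append(node)
    visit(out)
    out.grad = 1.0
    for node in reversed(order):  # children before parents
        for parent, local_grad in node.parents:
            parent.grad += local_grad * node.grad  # chain rule accumulation

x, y = Var(2.0), Var(3.0)
z = x * y + vsin(x)
backward(z)
print(f_forward(2.0, 3.0, 1.0, 0.0))  # df/dx needs one pass per input direction
print(x.grad, y.grad)                 # one reverse pass gives all gradients
```

Note the asymmetry the lecture emphasizes: forward mode costs one pass per input direction, while a single reverse pass yields the gradient with respect to all inputs at once, which is why reverse mode (backpropagation) dominates in machine learning, where inputs vastly outnumber outputs.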