18.S096 Matrix Calculus for Machine Learning and Beyond
18.S096 Matrix Calculus for Machine Learning and Beyond (IAP 2023, MIT OCW). Instructors: Prof. Alan Edelman and Prof. Steven G. Johnson. This class covers a coherent approach to matrix calculus, showing techniques that allow you to think of a matrix holistically (not just as an array of scalars), generalize and compute derivatives of important matrix factorizations and many other complicated-looking operations, and understand how differentiation formulas must be reimagined in large-scale computing. We will discuss reverse/adjoint/backpropagation differentiation, custom vector-Jacobian products, and how modern automatic differentiation is more computer science than calculus (it is neither symbolic formulas nor finite differences). (from ocw.mit.edu)
Lecture 06 - Part 1: Adjoint Differentiation of ODE Solutions
Many systems are modeled by ordinary differential equations (ODEs), and often you want the derivative of the ODE solution with respect to the parameters of the equations. An efficient way to compute this is often to solve a second, "adjoint" ODE backward in time.
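To make the idea concrete, here is a minimal sketch of the adjoint method on a hypothetical scalar problem (the toy ODE du/dt = -p·u, initial condition u0, and loss g = u(T)²/2 are illustrative choices, not taken from the lecture), using SciPy's solve_ivp. A forward solve stores a dense interpolant of u(t); a backward solve of the adjoint ODE then accumulates the gradient dg/dp, which can be checked against the analytic answer -T·u(T)².

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical toy problem (not from the lecture):
#   du/dt = -p*u,  u(0) = u0,  loss g = u(T)^2 / 2
# Analytically, u(T) = u0*exp(-p*T) and dg/dp = -T * u(T)^2.
p, u0, T = 0.7, 1.5, 2.0

# Forward solve, keeping a dense interpolant of u(t) for the backward pass.
fwd = solve_ivp(lambda t, u: -p * u, (0.0, T), [u0],
                dense_output=True, rtol=1e-10, atol=1e-12)
uT = fwd.y[0, -1]

# Adjoint system, integrated backward from t = T to t = 0:
#   dlam/dt = -(df/du)^T lam = p*lam,        lam(T) = dg/du(T) = u(T)
#   dmu/dt  = -lam * (df/dp) = lam * u(t),   mu(T)  = 0
# so that mu(0) = integral_0^T lam * (df/dp) dt = dg/dp.
def adjoint_rhs(t, y):
    lam, mu = y
    u = fwd.sol(t)[0]          # reuse the stored forward solution
    return [p * lam, lam * u]

bwd = solve_ivp(adjoint_rhs, (T, 0.0), [uT, 0.0], rtol=1e-10, atol=1e-12)
grad_adjoint = bwd.y[1, -1]    # mu at t = 0

print(grad_adjoint, -T * uT**2)  # adjoint and analytic gradients agree
```

The payoff of this approach is that the cost of the backward solve does not grow with the number of parameters: one adjoint solve yields the gradient with respect to all of them at once, which is the reverse-mode/backpropagation idea applied to ODE solutions.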