18.S096 Matrix Calculus for Machine Learning and Beyond
18.S096 Matrix Calculus for Machine Learning and Beyond (IAP 2023, MIT OCW). Instructors: Prof. Alan Edelman and Prof. Steven G. Johnson. This class covers a coherent approach to matrix calculus, showing techniques that allow you to think of a matrix holistically (not just as an array of scalars), generalize and compute derivatives of important matrix factorizations and many other complicated-looking operations, and understand how differentiation formulas must be reimagined in large-scale computing. We will discuss reverse/adjoint/backpropagation differentiation, custom vector-Jacobian products, and how modern automatic differentiation is more computer science than calculus (it is neither symbolic formulas nor finite differences). (from ocw.mit.edu)
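As a concrete illustration of the "custom vector-Jacobian products" mentioned above, here is a minimal sketch in JAX (my own example, not taken from the course materials): we override reverse-mode differentiation for f(x) = sin(x) with a hand-written VJP rule, which is exactly what autodiff frameworks let you do for operations they cannot trace through efficiently.

```python
# A minimal sketch of a custom vector-Jacobian product in JAX (illustrative,
# not from the course): reverse-mode autodiff uses our hand-written rule for
# f(x) = sin(x) instead of tracing through jnp.sin.
import jax
import jax.numpy as jnp

@jax.custom_vjp
def f(x):
    return jnp.sin(x)

def f_fwd(x):
    # Forward pass: return the primal output plus residuals saved for backward.
    return jnp.sin(x), x

def f_bwd(residual, v):
    # Backward pass: given the output cotangent v, return v * f'(x) = v * cos(x).
    x = residual
    return (v * jnp.cos(x),)

f.defvjp(f_fwd, f_bwd)

print(jax.grad(f)(1.0))  # ~0.5403, i.e. cos(1.0), computed via our custom VJP
```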
Lecture 06 - Part 2: Calculus of Variations and Gradients of Functionals
Instructors: Prof. Alan Edelman and Prof. Steven G. Johnson. A "functional" F[u] takes a function u(x) and returns a number (e.g. by integrating), and since spaces of functions are perfectly good normed vector spaces, we can define derivatives F' with respect to functions u! This is called the "calculus of variations."
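To make the idea of a gradient of a functional concrete, here is a minimal numerical sketch (my own illustration, not from the lecture): we discretize F[u] = ∫₀¹ u'(x)² dx on a uniform grid and let reverse-mode autodiff compute ∂F/∂uᵢ, which is a discrete stand-in for the functional derivative δF/δu (analytically −2u'' at interior points, for fixed boundary values).

```python
# A minimal sketch (illustrative, not from the course): discretize the
# functional F[u] = ∫₀¹ u'(x)² dx and differentiate it with respect to the
# grid values u_i. The resulting gradient approximates dx * (δF/δu)(x_i).
import jax
import jax.numpy as jnp

N = 100
dx = 1.0 / N

def F(u):
    # Finite-difference approximation of ∫ u'(x)^2 dx via forward differences.
    du = jnp.diff(u) / dx
    return jnp.sum(du**2) * dx

x = jnp.linspace(0.0, 1.0, N + 1)
u = jnp.sin(jnp.pi * x)  # a sample function u(x) with u(0) = u(1) = 0

g = jax.grad(F)(u)  # dF/du_i, approximately dx * (-2 u''(x_i)) in the interior

# Check against the analytic functional derivative:
# delta F / delta u = -2 u'' = 2 * pi^2 * sin(pi x) for this choice of u.
print(g[1:5] / dx)                          # numerical interior values
print(2 * jnp.pi**2 * jnp.sin(jnp.pi * x[1:5]))  # analytic values, should agree
```

Note the factor of dx: the discrete gradient ∂F/∂uᵢ scales with the grid spacing, because the functional derivative δF/δu is defined against the integral inner product ∫ v(x) (δF/δu)(x) dx rather than the plain dot product on grid values.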