Information Theory
Information Theory. Instructor: Prof. Himanshu Tyagi, Department of Electrical Engineering, IISc Bangalore. This is a graduate-level introductory course in Information Theory in which we introduce the mathematical notion of information and justify it through various operational meanings. This basic theory builds on probability theory and allows us to quantitatively measure the uncertainty and randomness in a random variable, as well as the information revealed on observing its value. We will encounter quantities such as entropy, mutual information, total variation distance, and KL divergence, and explain how they play a role in important problems in communication, statistics, and computer science. Information theory was originally invented as a mathematical theory of communication, but has since found applications in many areas ranging from physics to biology. In fact, information theory can help in any field where one wants to evaluate how much information about an unknown quantity is revealed by a particular experiment. In this course, we will lay down the foundations of this fundamental field. (from nptel.ac.in)
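For reference, the quantities named above have standard definitions. For discrete distributions P and Q on a common alphabet, and jointly distributed random variables X and Y, they are (the notation below is ours; the course may use different conventions):

H(X) = -\sum_{x} P_X(x) \log P_X(x), \qquad I(X;Y) = H(X) + H(Y) - H(X,Y),

D(P \| Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}, \qquad d_{\mathrm{TV}}(P,Q) = \frac{1}{2} \sum_{x} |P(x) - Q(x)|.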
Lecture 31 - Variational Formulae
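As a rough indication of the topic (a sketch of the standard statements; the exact formulations covered in the lecture may differ), two well-known variational formulae for the divergences above are the Donsker-Varadhan representation of KL divergence and the variational form of total variation distance:

D(P \| Q) = \sup_{f} \left\{ \mathbb{E}_P[f(X)] - \log \mathbb{E}_Q\!\left[e^{f(X)}\right] \right\},

d_{\mathrm{TV}}(P,Q) = \sup_{A \subseteq \mathcal{X}} \big( P(A) - Q(A) \big),

where the first supremum is over bounded functions f on the alphabet and the second is over subsets A of the alphabet.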