An Introduction to Information Theory
Instructor: Prof. Adrish Banerjee, Department of Electrical Engineering, IIT Kanpur. Information theory answers two fundamental questions: what is the maximum rate at which we can transmit data over a communication link, and what is the fundamental limit of data compression? In this course we will explore answers to these two questions. We will study some practical source compression algorithms, and we will also study how to compute the channel capacity of simple channels. (from nptel.ac.in)
Lecture 18 - Differential Entropy
In this lecture, we first define entropy for a continuous random variable (differential entropy). We then discuss properties of differential entropy, such as the asymptotic equipartition property, the entropy of an n-bit quantization of a continuous random variable, and the effect of translation and scaling on differential entropy. We also define joint differential entropy, conditional differential entropy, mutual information, and divergence, as well as the chain rule for differential entropy.
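As a small illustration of the translation and scaling properties covered in the lecture, the sketch below uses the standard closed form for the differential entropy of a Gaussian, h(X) = ½ log₂(2πeσ²) bits for X ~ N(μ, σ²). The function name is our own choice for this sketch, not notation from the lecture:

```python
import math

def gaussian_diff_entropy_bits(sigma):
    """Differential entropy h(X) in bits for X ~ N(mu, sigma^2).

    Closed form: h(X) = 0.5 * log2(2 * pi * e * sigma^2).
    Note h(X) does not depend on the mean mu, which reflects
    translation invariance: h(X + c) = h(X).
    """
    return 0.5 * math.log2(2 * math.pi * math.e * sigma ** 2)

h1 = gaussian_diff_entropy_bits(1.0)  # h(X) for X ~ N(0, 1), about 2.047 bits
h2 = gaussian_diff_entropy_bits(2.0)  # h(aX) with a = 2, since aX ~ N(0, 4)

# Scaling property: h(aX) = h(X) + log2|a|, so the gap is exactly 1 bit here.
print(h1)
print(h2 - h1)  # log2(2) = 1.0
```

The same closed form also connects to n-bit quantization: quantizing X with bin width Δ = 2⁻ⁿ gives a discrete entropy of approximately h(X) + n bits, so a continuous source needs ever more bits as the quantizer gets finer.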