Entropy and Mutual Information
1. $\textbf{Entropy}$ definition: Let $X$ be a continuous random variable, defined on the probability space $(\Omega,\mathcal{F},\mathcal{P})$ and taking values in $\mathcal{X}\subseteq\mathbb{R}$, with cumulative distribution function (CDF) \[F(x)=\mathcal{P}\{X\leq x\},\] and probability density function (pdf) \[f(x)=\dfrac{dF(x)}{dx},\] both assumed to be continuous. The entropy of the continuous random variable $X$ is then defined as \[h(X)=-\int_\mathcal{X} f(x)\log f(x)\,dx,\] where the integration is carried out over the support $\mathcal{X}$ of the random variable.

Example 1.1 (Entropy of a normal distribution): For $X\sim\mathcal{N}(0,\sigma^2)$, with pdf $f(x)=\dfrac{1}{\sqrt{2\pi\sigma^2}}\exp{\left(-\dfrac{x^2}{2\sigma^2}\right)},$ the entropy is
\begin{align*}
h(X)&=-\int f(x)\log f(x)\,dx\\
&=-\int f(x)\left[-\dfrac{1}{2}\log{\left(2\pi\sigma^2\right)}-\dfrac{x^2}{2\sigma^2}\log e\right]dx\\
&=\dfrac{1}{2}\log{\left(2\pi\sigma^2\right)}\int f(x)\,dx+\dfrac{\log e}{2\sigma^2}\int x^2 f(x)\,dx\\
&=\dfrac{1}{2}\log{\left(2\pi\sigma^2\right)}+\dfrac{\log e}{2\sigma^2}\,\sigma^2\\
&=\dfrac{1}{2}\log{\left(2\pi e\sigma^2\right)},
\end{align*}
where the last integral equals the variance $\sigma^2$ of $X$.
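The closed form above can be sanity-checked numerically. The sketch below is not part of the notes; it assumes NumPy and SciPy are available and works in nats (natural logarithm), whereas the derivation above leaves the log base unspecified. It integrates $-f(x)\log f(x)$ for a zero-mean Gaussian pdf and compares the result with $\tfrac{1}{2}\log(2\pi e\sigma^2)$.

```python
import numpy as np
from scipy.integrate import quad


def gaussian_pdf(x, sigma):
    """pdf of N(0, sigma^2)."""
    return np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)


def differential_entropy(pdf, support=(-np.inf, np.inf)):
    """Numerically evaluate h(X) = -integral of f(x) log f(x) dx, in nats."""
    def integrand(x):
        fx = pdf(x)
        # Guard against numerical underflow of the pdf far in the tails,
        # where f(x) log f(x) -> 0.
        return -fx * np.log(fx) if fx > 0.0 else 0.0

    value, _ = quad(integrand, *support)
    return value


sigma = 1.5  # example value, chosen arbitrarily for the check
numeric = differential_entropy(lambda x: gaussian_pdf(x, sigma))
closed_form = 0.5 * np.log(2 * np.pi * np.e * sigma**2)  # h(X) = (1/2) log(2*pi*e*sigma^2)
print(f"numerical: {numeric:.6f} nats, closed form: {closed_form:.6f} nats")
```

The two values agree to the tolerance of the quadrature, which is a quick way to catch sign or constant errors when reproducing derivations like the one above.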