muchen 牧辰

Tutorial 9

Updated 2017-12-11

Among all distributions over \(M\) outcomes, the uniform distribution has the largest entropy. The entropy of the uniform distribution \(U\) is:

\[H(U)=-\sum_{i=1}^M\frac{1}{M}\log_2\frac{1}{M}=-\log_2\frac{1}{M}=\log_2 M\]
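As a quick numerical check (a minimal sketch using NumPy; the variable names and the choice \(M=8\) are just illustrative), the direct sum from the definition agrees with the closed form \(\log_2 M\):

```python
import numpy as np

M = 8
p_uniform = np.full(M, 1.0 / M)

# Entropy computed directly from the definition H(U) = -sum_i p_i log2 p_i
H_direct = -np.sum(p_uniform * np.log2(p_uniform))

print(H_direct, np.log2(M))  # both equal 3.0 bits for M = 8
```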

To show that the entropy of the uniform distribution is at least the entropy of any other distribution \(X=(p_1,\dots,p_M)\) over the same \(M\) outcomes, it suffices to show that \(\log_2 M-H(X)\geq 0\):

\[
\begin{aligned}
\log_2 M-H(X)&=\log_2 M+\sum_{i=1}^M p_i\log_2 p_i\\
&=\log_2 M\sum_{i=1}^M p_i+\sum_{i=1}^M p_i\log_2 p_i\\
&=\sum_{i=1}^M p_i\left(\log_2 M+\log_2 p_i\right)\\
&=\sum_{i=1}^M p_i\log_2(Mp_i)\\
&=\frac{1}{\ln 2}\sum_{i=1}^M p_i\ln(Mp_i)\\
&\geq\frac{1}{\ln 2}\sum_{i=1}^M p_i\left(1-\frac{1}{Mp_i}\right)=\frac{1}{\ln 2}\left(\sum_{i=1}^M p_i-\sum_{i=1}^M\frac{1}{M}\right)=0
\end{aligned}
\]
The inequality uses \(\ln y\geq 1-\frac{1}{y}\) for \(y>0\) with \(y=Mp_i\), so \(H(X)\leq\log_2 M\), with equality exactly when \(Mp_i=1\) for all \(i\), i.e. when \(X\) is uniform.
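The bound can also be spot-checked numerically (a sketch assuming NumPy; drawing random probability vectors from a Dirichlet distribution is just one convenient choice, not part of the proof):

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy in bits; zero-probability terms are dropped."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

M = 8
rng = np.random.default_rng(0)
bound = np.log2(M)

# Sample random distributions over M outcomes and verify H(X) <= log2 M.
for _ in range(1000):
    p = rng.dirichlet(np.ones(M))
    assert entropy_bits(p) <= bound + 1e-12

print("All sampled distributions satisfy H(X) <= log2 M =", bound)
```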

Source Coding Theorem