Basic Notions
- Entropy
- Differential entropy
- Graph entropy
- Conditional entropy
- Mutual information
Entropy
Definitions
Let \(X\) be a discrete random variable defined on a finite alphabet \(\mathcal{X}\) with probability mass function \(p_X\). The entropy of \(X\) is the random variable \(H(X)\) defined by
\[ H(X) = \log\frac{1}{p_X(X)}. \]
The base of the logarithm defines the unit of entropy. If the logarithm is to base 2, the unit of entropy is the bit; if the logarithm is to base \(e\), the unit of entropy is the nat.
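As a minimal numerical sketch of this definition (assuming Python with NumPy, and a hypothetical biased coin as the example distribution), the realizations of the random variable \(H(X)\) and its expectation, the familiar Shannon entropy, can be computed as follows:

```python
import numpy as np

# Hypothetical example: a biased coin with p_X = (0.25, 0.75).
p_X = np.array([0.25, 0.75])

# H(X) = log(1 / p_X(X)) is itself a random variable: it takes the
# value log(1 / p_X(x)) whenever X takes the value x.
h_values = np.log2(1.0 / p_X)  # realizations, in bits (base-2 log)

# Its expectation is the usual Shannon entropy of X.
shannon_entropy = np.sum(p_X * h_values)

print(h_values)         # [2.0, 0.415...]
print(shannon_entropy)  # 0.811... bits
```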
Mutual Information
Definitions
Let \(X\) and \(Y\) be discrete random variables defined on finite alphabets \(\mathcal{X}\) and \(\mathcal{Y}\), respectively, with joint probability mass function \(p_{X,Y}\). The mutual information of \(X\) and \(Y\) is the random variable \(I(X,Y)\) defined by
\[ I(X,Y) = \log\frac{p_{X,Y}(X,Y)}{p_X(X)p_Y(Y)}.\]
As with entropy, the base of the logarithm defines the unit of mutual information. If the logarithm is to base \(e\), the unit of mutual information is the nat.
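As a minimal numerical sketch of this definition (again assuming Python with NumPy, and a hypothetical \(2 \times 2\) joint distribution), the realizations of the random variable \(I(X,Y)\) and their expectation, the usual average mutual information, can be computed as follows:

```python
import numpy as np

# Hypothetical joint pmf p_{X,Y} on 2 x 2 alphabets.
p_XY = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

# Marginal pmfs p_X and p_Y.
p_X = p_XY.sum(axis=1)
p_Y = p_XY.sum(axis=0)

# I(X,Y) = log( p_{X,Y}(X,Y) / (p_X(X) p_Y(Y)) ) is a random variable:
# one realization per pair (x, y), here in bits (base-2 log).
i_values = np.log2(p_XY / np.outer(p_X, p_Y))

# Its expectation is the usual (average) mutual information.
avg_mutual_information = np.sum(p_XY * i_values)

print(avg_mutual_information)  # ~0.278 bits
```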