The mutual information between two discrete random variables $X$ and $Y$ is defined to be

$$I(X;Y) \equiv \sum_x \sum_y P(x,y)\,\log_2\frac{P(x,y)}{P(x)\,P(y)}$$   (1)

bits. Additional properties are

$$I(X;Y) = I(Y;X)$$   (2)

$$I(X;Y) \ge 0,$$   (3)

and

$$I(X;Y) = H(X) + H(Y) - H(X,Y),$$   (4)

where $H(X)$ is the entropy of the random variable $X$ and $H(X,Y)$ is the joint entropy of these variables.
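As a minimal sketch, the definition and property (4) can be checked numerically from a joint distribution table. The distributions below are illustrative examples, not from the original entry: an independent pair (mutual information 0) and a perfectly correlated pair (mutual information equal to the 1-bit entropy of either variable).

```python
import numpy as np

def mutual_information(P):
    """Mutual information in bits of a joint distribution P(x, y), per eq. (1)."""
    Px = P.sum(axis=1, keepdims=True)   # marginal P(x)
    Py = P.sum(axis=0, keepdims=True)   # marginal P(y)
    mask = P > 0                        # 0 * log 0 = 0 by convention
    return float((P[mask] * np.log2(P[mask] / (Px * Py)[mask])).sum())

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Independent binary variables: I(X;Y) = 0
P = np.array([[0.25, 0.25],
              [0.25, 0.25]])
print(mutual_information(P))  # 0.0

# Perfectly correlated binary variables: I(X;Y) = H(X) = 1 bit
Q = np.array([[0.5, 0.0],
              [0.0, 0.5]])
print(mutual_information(Q))  # 1.0

# Property (4): I(X;Y) = H(X) + H(Y) - H(X,Y)
Hx, Hy, Hxy = entropy(Q.sum(axis=1)), entropy(Q.sum(axis=0)), entropy(Q.ravel())
print(mutual_information(Q) - (Hx + Hy - Hxy))  # 0.0 (to floating-point precision)
```

The symmetry property (2) is immediate from the code as well, since transposing `P` swaps the roles of the two marginals without changing the sum.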
This entry contributed by Erik G. Miller
Miller, Erik G. "Mutual Information." From MathWorld--A Wolfram Web Resource, created by Eric W. Weisstein. https://mathworld.wolfram.com/MutualInformation.html