Mutual Information I(X ; Y) Properties

Property 1: Mutual Information is Non-Negative

Mutual Information is given by

I(X ; Y) =\sum_{i=1}^{m}\sum_{j=1}^{n}P(x_{i}, y_{j})\log _{2} \frac{P(x_{i} \mid y_{j})}{P(x_{i})} ----- Equation (I)

We know that

P(x_{i} \mid y_{j}) =\frac{P(x_{i}, y_{j})}{P(y_{j})} ----- Equation (II)

Substituting Equation (II) into Equation (I):

I(X ; Y) =\sum_{i=1}^{m}\sum_{j=1}^{n}P(x_{i}, y_{j})\log _{2}\frac{P(x_{i}, y_{j})}{P(x_{i})P(y_{j})}
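As a quick numeric illustration of this form, here is a minimal Python sketch that evaluates the double sum directly. The 2 x 2 joint distribution is a made-up example, not taken from the article:

import math

def mutual_information(P_xy):
    # Marginals P(x_i) and P(y_j) of the joint distribution P(x_i, y_j)
    P_x = [sum(row) for row in P_xy]
    P_y = [sum(col) for col in zip(*P_xy)]
    # Double sum of Equation (I) in its substituted form; terms with
    # P(x_i, y_j) = 0 contribute nothing, so they are skipped.
    return sum(
        P_xy[i][j] * math.log2(P_xy[i][j] / (P_x[i] * P_y[j]))
        for i in range(len(P_x))
        for j in range(len(P_y))
        if P_xy[i][j] > 0
    )

# Hypothetical joint distribution of a binary X and a binary Y
P_xy = [[0.30, 0.10],
        [0.20, 0.40]]
print(mutual_information(P_xy))   # ~0.1245 bits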

Since \log _{2}\frac{a}{b} = -\log _{2}\frac{b}{a}, the above equation can be written as

I(X ; Y) =-\sum_{i=1}^{m}\sum_{j=1}^{n}P(x_{i}, y_{j})\log _{2}\frac{P(x_{i})P(y_{j})}{P(x_{i}, y_{j})}

Multiplying both sides by -1,

-I(X ; Y) =\sum_{i=1}^{m}\sum_{j=1}^{n}P(x_{i}, y_{j})\log _{2}\frac{P(x_{i})P(y_{j})}{P(x_{i}, y_{j})} ----- Equation (III)

We know that, for any two probability distributions \{p_{k}\} and \{q_{k}\} defined on the same alphabet,

\sum_{k=1}^{m} p_{k}\log _{2}\frac{q_{k}}{p_{k}}\leq 0 ----- Equation (IV)
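Equation (IV) follows in one line from the elementary bound \ln x \leq x - 1, valid for all x > 0; the short derivation below is added here for completeness:

\sum_{k=1}^{m} p_{k}\log _{2}\frac{q_{k}}{p_{k}} = \frac{1}{\ln 2}\sum_{k=1}^{m} p_{k}\ln\frac{q_{k}}{p_{k}} \leq \frac{1}{\ln 2}\sum_{k=1}^{m} p_{k}\left(\frac{q_{k}}{p_{k}} - 1\right) = \frac{1}{\ln 2}\left(\sum_{k=1}^{m} q_{k} - \sum_{k=1}^{m} p_{k}\right) = \frac{1}{\ln 2}(1 - 1) = 0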

This result can be applied to Mutual Information I(X ; Y) by taking p_{k} = P(x_{i}, y_{j}) and q_{k} = P(x_{i})P(y_{j}). Both are valid probability distributions on the same alphabet of pairs (x_{i}, y_{j}): the first is the joint distribution, and the second is the product of the marginals, whose entries also sum to 1. Equation (III) then becomes

-I(X ; Y) \leq 0

i.e., I(X ; Y) \geq 0, which shows that Mutual Information is always non-negative, with equality if and only if X and Y are independent, i.e. P(x_{i}, y_{j}) = P(x_{i})P(y_{j}) for all i, j.
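Reusing the mutual_information sketch above, the equality case is easy to check numerically: for an independent pair the joint factorizes as P(x_{i})P(y_{j}), every logarithm in the sum is \log _{2} 1 = 0, and the function returns exactly 0 (the example distribution is again hypothetical):

# Independent pair: each joint entry equals the product of its marginals
print(mutual_information([[0.20, 0.20],
                          [0.30, 0.30]]))   # 0.0 bits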


