# Mutual Information I(X ; Y) Properties

Property 1: Mutual Information is Non-Negative

Mutual information is defined by $I(X;Y)=\sum_{i=1}^{m}\sum_{j=1}^{n}P(x_{i},y_{j})\log_{2}\frac{P(x_{i}\mid y_{j})}{P(x_{i})} \qquad \text{Equation (I)}$

We know that $P(x_{i}\mid y_{j})=\frac{P(x_{i},y_{j})}{P(y_{j})} \qquad \text{Equation (II)}$

Substituting Equation (II) into Equation (I):

$I(X;Y)=\sum_{i=1}^{m}\sum_{j=1}^{n}P(x_{i},y_{j})\log_{2}\frac{P(x_{i},y_{j})}{P(x_{i})P(y_{j})}$
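As a quick numeric sketch of this symmetric form, the double sum can be evaluated directly from a joint distribution table. The function name `mutual_information` and the example distributions below are my own illustration, not part of the original text:

```python
import math

def mutual_information(joint):
    """Compute I(X; Y) in bits from a joint distribution P(x_i, y_j),
    given as a list of rows (one row per x_i). Zero-probability cells
    are skipped, using the convention 0 * log 0 = 0."""
    px = [sum(row) for row in joint]        # marginal P(x_i)
    py = [sum(col) for col in zip(*joint)]  # marginal P(y_j)
    total = 0.0
    for i, row in enumerate(joint):
        for j, p_xy in enumerate(row):
            if p_xy > 0:
                total += p_xy * math.log2(p_xy / (px[i] * py[j]))
    return total

# X and Y perfectly correlated: they share one full bit of information.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
# X and Y independent: the joint factors, so I(X; Y) = 0.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```

Note how independence makes every logarithm's argument equal 1, so each term vanishes, matching the intuition that independent variables carry no information about each other.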

Pulling out a minus sign and inverting the argument of the logarithm, the above equation can be written as

$I(X;Y)=-\sum_{i=1}^{m}\sum_{j=1}^{n}P(x_{i},y_{j})\log_{2}\frac{P(x_{i})P(y_{j})}{P(x_{i},y_{j})}$

$-I(X;Y)=\sum_{i=1}^{m}\sum_{j=1}^{n}P(x_{i},y_{j})\log_{2}\frac{P(x_{i})P(y_{j})}{P(x_{i},y_{j})} \qquad \text{Equation (III)}$

We know (Gibbs' inequality) that for any two probability distributions $\{p_{k}\}$ and $\{q_{k}\}$ on the same alphabet, $\sum_{k=1}^{m}p_{k}\log_{2}\frac{q_{k}}{p_{k}}\leq 0 \qquad \text{Equation (IV)}$

This result can be applied to Equation (III) by taking $p_{k}=P(x_{i},y_{j})$ and $q_{k}=P(x_{i})P(y_{j})$; both are probability distributions over the same alphabet of pairs $(x_{i},y_{j})$, so Equation (III) becomes

$-I(X;Y)\leq 0$

i.e., $I(X;Y)\geq 0$, which shows that mutual information is always non-negative (it equals zero exactly when $X$ and $Y$ are independent).
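The property just proved can also be stress-tested empirically: for any randomly generated joint distribution, the computed mutual information should never dip below zero (up to floating-point rounding). The helper names and the choice of a 3x4 alphabet here are my own:

```python
import math
import random

def mutual_information(joint):
    """I(X; Y) in bits from a joint distribution given as a list of rows."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    return sum(p * math.log2(p / (px[i] * py[j]))
               for i, row in enumerate(joint)
               for j, p in enumerate(row) if p > 0)

random.seed(0)
for _ in range(1000):
    # Draw a random 3x4 joint distribution by normalizing random weights.
    w = [[random.random() for _ in range(4)] for _ in range(3)]
    s = sum(map(sum, w))
    joint = [[v / s for v in row] for row in w]
    # Non-negative up to floating-point rounding error.
    assert mutual_information(joint) >= -1e-12

print("I(X; Y) >= 0 held for 1000 random joint distributions")
```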
