Kullback-Leibler divergence and Gibbs' inequality

The relative entropy or Kullback-Leibler divergence from one distribution $$P(x)$$ to another $$Q(x)$$, both defined over the same alphabet $$A_x$$, is:

$D_{KL}(P||Q) = \sum_x P(x) \log \frac{P(x)}{Q(x)}$
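
As a concrete illustration (not part of the original card), here is a minimal Python sketch of the definition; the function name and the two example distributions over the alphabet {a, b} are hypothetical choices for demonstration:

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) = sum over x of P(x) * log2(P(x) / Q(x)), in bits.

    p, q: dicts mapping each symbol of the alphabet A_x to its probability.
    Terms with P(x) = 0 contribute nothing, by the convention 0 log 0 = 0.
    """
    return sum(px * math.log2(px / q[x]) for x, px in p.items() if px > 0)

# Hypothetical example distributions over the alphabet {a, b}
P = {'a': 0.9, 'b': 0.1}
Q = {'a': 0.5, 'b': 0.5}

print(kl_divergence(P, Q))  # ~0.531 bits
```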

The relative entropy satisfies Gibbs' inequality:

$D_{KL}(P||Q) \ge 0,$

with equality if and only if $$P = Q$$.
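
A brief sketch of why this holds (this standard derivation via Jensen's inequality is not part of the original card): since the logarithm is concave,

$-D_{KL}(P||Q) = \sum_x P(x) \log \frac{Q(x)}{P(x)} \le \log \sum_x P(x)\,\frac{Q(x)}{P(x)} = \log \sum_x Q(x) = \log 1 = 0,$

and strict concavity of the logarithm makes the inequality strict unless $$Q(x)/P(x)$$ is constant, which forces $$P = Q$$.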

The relative entropy is not symmetric under interchange of the distributions $$P$$ and $$Q$$, i.e. $$D_{KL}(P||Q) \ne D_{KL}(Q||P)$$ in general, so $$D_{KL}$$ is not strictly a distance, despite sometimes being called the 'KL distance'.
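
Continuing the hypothetical sketch above, swapping the arguments makes the asymmetry visible:

```python
print(kl_divergence(P, Q))  # ~0.531 bits
print(kl_divergence(Q, P))  # ~0.737 bits: a different value, so D_KL is not symmetric
```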