Kullback-Leibler divergence and Gibbs' inequality

The relative entropy, or Kullback-Leibler divergence, from one distribution \( P(x) \) to another \( Q(x) \), both defined over the same alphabet \( A_x \), is:


\[ D_{KL}(P||Q) = \sum_x P(x) \log \frac{P(x)}{Q(x)} \]
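
As a concrete illustration, here is a minimal sketch in Python for computing this sum when the two distributions are given as probability vectors over the same alphabet (the function name and example distributions are my own choices, not from the source):

```python
import numpy as np

def kl_divergence(p, q):
    """D_KL(P||Q) for two discrete distributions over the same alphabet,
    given as probability vectors. Uses the natural log, so the result is
    in nats; swap in np.log2 for bits."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Terms with P(x) = 0 contribute nothing (convention: 0 log 0 = 0).
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

p = [0.5, 0.25, 0.25]
q = [0.25, 0.25, 0.5]
print(kl_divergence(p, q))  # ~0.173 nats, non-negative as Gibbs' inequality requires
```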

The relative entropy satisfies Gibbs' inequality:

\[ D_{KL}(P||Q) \ge 0 \]

with equality if and only if \( P = Q \).
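
A standard one-line argument (a sketch, using Jensen's inequality for the concave logarithm) shows why:

\[ -D_{KL}(P||Q) = \sum_x P(x) \log \frac{Q(x)}{P(x)} \le \log \sum_x P(x) \frac{Q(x)}{P(x)} = \log \sum_x Q(x) \le \log 1 = 0, \]

and equality in Jensen's inequality requires \( Q(x)/P(x) \) to be constant wherever \( P(x) > 0 \), which together with normalisation forces \( P = Q \).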

The relative entropy is not symmetric under interchange of the distributions \( P \) and \( Q \): in general \( D_{KL}(P||Q) \ne D_{KL}(Q||P) \). Hence \( D_{KL} \) is not strictly a distance, despite sometimes being called the 'KL distance'.
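
A quick numerical check of the asymmetry, continuing the kl_divergence sketch above (the distributions are arbitrary choices for illustration):

```python
p = [0.9, 0.1]
q = [0.5, 0.5]
print(kl_divergence(p, q))  # ~0.368 nats
print(kl_divergence(q, p))  # ~0.511 nats -- a different value
```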