Entropy and information divergence

The Kullback–Leibler (KL) divergence is a non-symmetric measure of the difference between two probability distributions P and Q:

KL divergence: D_{KL}(P \| Q) = \sum_{i} P(i) \log \frac{P(i)}{Q(i)}
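As a quick sketch of the formula above, here is a minimal discrete KL divergence in Python (the distributions and the helper name `kl_divergence` are illustrative, not from the source; terms where P(i) = 0 contribute nothing and are skipped, following the usual convention):

```python
import math

def kl_divergence(p, q):
    """Discrete D_KL(P || Q) for two probability vectors p and q.

    Terms with p_i == 0 are skipped (0 * log 0 is taken as 0).
    Assumes q_i > 0 wherever p_i > 0, otherwise the divergence is infinite.
    """
    return sum(p_i * math.log(p_i / q_i) for p_i, q_i in zip(p, q) if p_i > 0)

# Hypothetical example: a biased coin P against a fair coin Q
p = [0.9, 0.1]
q = [0.5, 0.5]
print(kl_divergence(p, q))  # strictly positive, since P differs from Q
print(kl_divergence(q, p))  # a different value: the measure is non-symmetric
```

Note that the two printed values differ, which is exactly the non-symmetry mentioned above: D_{KL}(P || Q) ≠ D_{KL}(Q || P) in general.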

In this piece [page 133], a neat example is given showing how this measure can be used to quantify the 'truthlikeness' of a probability measure.