The Kullback–Leibler (KL) divergence is a non-symmetric measure of the difference between two probability distributions P and Q:
KL divergence:
$$D_{KL}(P \,\|\, Q) = \sum_{i} P(O_i) \log \frac{P(O_i)}{Q(O_i)}$$
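
As a quick sanity check on the formula, here is a minimal Python sketch that computes the sum in both directions; the distributions `p` and `q` are made-up illustrative values, and the asymmetry of the two results shows the non-symmetric nature of the measure:

```python
import numpy as np

# Two discrete distributions over the same outcomes (illustrative values)
p = np.array([0.5, 0.3, 0.2])  # "true" distribution P
q = np.array([0.4, 0.4, 0.2])  # approximating distribution Q

# D_KL(P || Q) = sum_i P(O_i) * log(P(O_i) / Q(O_i))
kl_pq = np.sum(p * np.log(p / q))
kl_qp = np.sum(q * np.log(q / p))

print(f"D_KL(P||Q) = {kl_pq:.4f}")  # ~0.0253
print(f"D_KL(Q||P) = {kl_qp:.4f}")  # ~0.0258 -- differs: KL is non-symmetric
```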
This piece [page 133] gives a neat example showing how this measure can be used to quantify the 'truthlikeness' of a probability distribution.