The Kullback–Leibler (KL) divergence is a non-symmetric measure of the difference between two probability distributions P and Q:
KL divergence:

$$D_{\mathrm{KL}}(P \,\|\, Q) = \sum_{x} P(x)\,\log\frac{P(x)}{Q(x)}$$
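As a minimal sketch of the definition above (the helper name and the example coin distributions are my own, not from the original), the sum can be computed directly for discrete distributions:

```python
import numpy as np

def kl_divergence(p, q):
    """Compute D_KL(P || Q) for discrete distributions given as arrays.

    Assumes p and q are valid probability vectors over the same support,
    with q[i] > 0 wherever p[i] > 0.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Terms with p[i] == 0 contribute nothing, by the convention 0 * log(0/q) = 0.
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Example: a fair coin P versus a biased coin Q.
p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))  # positive, and generally != kl_divergence(q, p)
```

Swapping the arguments in the last line makes the non-symmetry visible: $D_{\mathrm{KL}}(P \,\|\, Q) \neq D_{\mathrm{KL}}(Q \,\|\, P)$ in general.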
This piece [page 133] gives a neat example showing how this measure can be used to quantify the 'truthlikeness' of a probability measure.