Archive for September, 2011

Entropy and information divergence

Thursday, September 29th, 2011

The Kullback–Leibler (KL) divergence is a non-symmetric measure of the difference between two probability distributions P and Q:


KL divergence: D_{\mathrm{KL}}(P \,\|\, Q) = \sum_{i} P(O_{i}) \log \frac{P(O_{i})}{Q(O_{i})}

In this piece [page 133] there is a neat example showing how this measure can be used to quantify the 'truthlikeness' of a probability measure.
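For concreteness, here is a minimal sketch (not taken from the cited piece) of how the discrete KL divergence above might be computed; the function name and the coin example are illustrative assumptions.

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete KL divergence D_KL(P || Q) in nats.

    p, q: probability vectors over the same outcomes O_i.
    Assumes q[i] > 0 wherever p[i] > 0; terms with p[i] == 0 contribute 0.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Example: a fair coin P against a biased coin Q.
p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))  # positive, and differs from kl_divergence(q, p): the measure is non-symmetric
```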

Information Flow: Logics for the (r)age of information

Thursday, September 22nd, 2011

http://www.iam.unibe.ch/til/publications/pubitems/kel02

Information is what information does

Wednesday, September 14th, 2011
