Currently making my way through my copy of James Gleick’s “The Information: A History, a Theory, a Flood”. In chapter 9 (Entropy and its Demons) there is a brief explanatory discussion of a distinction between the use of ‘information’ in Claude Shannon’s sense and the use of ‘information’ in Norbert Wiener’s sense. This is a distinction I had to work out for myself when reading Wiener’s cybernetics for a reading group I was participating in. As Gleick writes:
… a particular message reduces the entropy in the ensemble of possible messages – in terms of dynamical systems, a phase space.
That was how Shannon saw it. Wiener’s version was slightly different. It was fitting – for a word that began by meaning the opposite of itself – that these colleagues and rivals placed opposite signs on their formulations of entropy. Where Shannon identified information with entropy, Wiener said it was negative entropy. Wiener was saying that information meant order, but an orderly thing does not necessarily embody much information. Shannon himself pointed out their difference and minimized it, calling it a sort of “mathematical pun.” They get the same numerical answers, he noted:

I consider how much information is produced when a choice is made from a set – the larger the set the more information. You consider the larger uncertainty in the case of a larger set to mean less knowledge of the situation and hence less information.
So there is ‘information’ as a measure of entropy (Shannon) and there is ‘information’ in the sense of order (Wiener). Given the idea of entropy as the extent to which a system is disorganized, the two conceptions agree in magnitude and differ only in sign: the former conception of information (entropy) is inversely related to the latter conception of information (negentropy).
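To see why Shannon could call the disagreement a “mathematical pun,” it may help to compute the quantity both men are talking about. Here is a minimal sketch in Python (the function name and the set sizes are my own, chosen for illustration): for a uniform choice among N equally likely messages, the Shannon entropy works out to log2(N) bits, and the same number is simply read with opposite signs by the two camps.

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# For a uniform choice among N equally likely messages, H = log2(N).
for n in (2, 8, 1024):
    h = shannon_entropy([1 / n] * n)
    print(f"N = {n:5d}: H = {h:6.2f} bits")
    # Shannon's reading: a choice from the larger set *produces* more
    # information, because more uncertainty is resolved by the message.
    # Wiener's reading: the larger set is *less ordered* beforehand,
    # so he flips the sign and identifies information with the order
    # (the negentropy) rather than with the entropy itself.
```

Running this gives 1, 3, and 10 bits for the three set sizes. The number is the same either way; what changes is whether you read it as information produced by the choice (Shannon) or as a deficit of order in the situation (Wiener).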