Entropy, Negentropy, and Information

Currently making my way through my copy of James Gleick’s “The Information: A History, a Theory, a Flood”. In chapter 9 (Entropy and Its Demons) there is a brief explanatory discussion of the distinction between ‘information’ in Claude Shannon’s sense and ‘information’ in Norbert Wiener’s sense – a distinction I had to work out for myself when reading Wiener’s cybernetics for a reading group I was participating in. As Gleick writes:

… a particular message reduces the entropy in the ensemble of possible messages – in terms of dynamical systems, a phase space.
That was how Shannon saw it. Wiener’s version was slightly different. It was fitting – for a word that began by meaning the opposite of itself – that these colleagues and rivals placed opposite signs on their formulations of entropy. Where Shannon identified information with entropy, Wiener said it was negative entropy. Wiener was saying that information meant order, but an orderly thing does not necessarily embody much information. Shannon himself pointed out their difference and minimized it, calling it a sort of “mathematical pun.” They get the same numerical answers, he noted:

I consider how much information is produced when a choice is made from a set – the larger the set the more information. You consider the larger uncertainty in the case of a larger set to mean less knowledge of the situation and hence less information.

So there is ‘information’ as a measure of entropy and there is ‘information’ in the sense of order. Given the idea of entropy as the extent to which a system is disorganized, the former conception of information (entropy) is inversely related to the latter conception of information (negentropy).
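
To make the sign difference concrete, here is a minimal worked sketch in standard notation (my own illustration, not Gleick’s or Shannon’s exact formulation). Shannon’s entropy of a source X with message probabilities p(x) is

\[
H(X) = -\sum_{x} p(x)\,\log_2 p(x),
\]

which, for a choice among n equally likely messages, comes to \(\log_2 n\) bits: a choice from 8 messages carries 3 bits, a choice from 2 only 1 bit – “the larger the set the more information”. On the loose gloss of the passage above, Wiener’s quantity is the same magnitude with the opposite sign,

\[
I_{\mathrm{Wiener}} = -H(X),
\]

so that greater order (lower entropy) corresponds to more information in Wiener’s sense, which is just the inverse relation noted above.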

Thesis Completion Seminar

Details of my upcoming thesis completion seminar:

Title: A Framework for Semantic Information

Date, Time: 24/7/2012, 5:15pm

Location: Old Quad Common Room, Melbourne University

Abstract: In this talk I present a summary of my thesis research, which develops an account of semantic information. After an introductory overview of information, I shall outline the main points and results of the following chapters:

– Quantifying Semantic Information
– Agent-Relative Informativeness
– Environmental Information and Information Flow
– Information and Knowledge

Problems with An Objective Counterfactual Theory of Information

Jonathan Cohen and Aaron Meskin (C&M) published a paper several years ago titled ‘An Objective Counterfactual Theory of Information’. Here is its abstract:

Philosophers have appealed to information (as understood by [Shannon, 1948] and introduced to philosophers largely by [Dretske, 1981]) in a wide variety of contexts; information has been proffered in the service of understanding knowledge, justification, and mental content, inter alia. While information has been put to diverse philosophical uses, there has been much less diversity in the understanding of information itself. In this paper we’ll offer a novel theory of information that differs from traditional accounts in two main (and orthogonal) respects: (i) it explains information in terms of counterfactuals rather than conditional probabilities, and (ii) it does not make essential reference to doxastic states of subjects, and consequently allows for the sort of objective, reductive explanations of notions in epistemology and philosophy of mind that many have wanted from an account of information.

We’ll first present our counterfactual account of information (1), and show how it sidesteps a problem that has been raised for its traditional, probabilistic competitors (2). Next we’ll compare the counterfactual account against that proposed by Dretske (3), highlighting the differences between the two. After that, we’ll turn to questions about objectivity: we’ll bring out a conflict between the essentially doxastic character of traditional theories of information and the reductive purposes philosophers have had in mind in appealing to information (4), and we’ll show how the account of (1) can be formulated in non-doxastic terms. Finally, we’ll consider objections against the proposed account (5). Ultimately, we’ll suggest, the objective counterfactual account of information should be taken as a serious contender to more traditional rivals.

The central definition of information that they provide is:


(S*) … x’s being F carries information about y’s being G if and only if the counterfactual conditional “if y were not G, then x would not have been F” is non-vacuously true.

Also, in a footnote C&M mention that x’s being F and y’s being G are construed as actual events, so one event carries information about a second only if they are actual.

As outlined in this document, some exploration reveals that, using the standard logic of counterfactuals, C&M’s definition yields results that conflict with what seem to be fairly straightforward properties of information flow.
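
For notation’s sake, and as one illustration of the kind of worry at issue (my own sketch; the linked document may well focus on different properties), (S*) can be put in terms of the standard counterfactual conditional \(\Box\!\!\rightarrow\):

\[
Fx \text{ carries information about } Gy \;\iff\; (\neg Gy \,\Box\!\!\rightarrow\, \neg Fx) \text{ is non-vacuously true.}
\]

On the standard Lewis–Stalnaker semantics the counterfactual conditional is not transitive:

\[
(\neg Hz \,\Box\!\!\rightarrow\, \neg Gy) \wedge (\neg Gy \,\Box\!\!\rightarrow\, \neg Fx) \;\not\models\; (\neg Hz \,\Box\!\!\rightarrow\, \neg Fx).
\]

So if x’s being F carries information about y’s being G, and y’s being G carries information about z’s being H, it does not follow on (S*) that x’s being F carries information about z’s being H; chained information flow of the sort Dretske’s “xerox principle” demands is not automatically secured.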

Symposium: Luciano Floridi, The Philosophy of Information

The symposium appears in the journal Etica & Politica. Contributions:

Gustavo Cevolani
Strongly semantic information and verisimilitude

Massimo Durante
Normativity, Constructionism, and Constraining Affordances

Don Fallis
Floridi on Disinformation

David Gamez
Information and Consciousness

Jakob Krebs
Philosophy of Information and Pragmatistic Understanding of Information

Marty J. Wolf
Analysis, Clarification and Extension of the Theory of Strongly Semantic Information

Anthony F. Beavers
Historicizing Floridi