The Difference That Makes a Difference series of workshops is a forum for interdisciplinary sharing of insights on the nature of information.
CFP 5th Workshop on Philosophy of Information
27th-28th March 2013
University of Hertfordshire, UK
Submissions are invited for the Fifth Workshop on the Philosophy of Information, which will take place at the University of Hertfordshire on 27th-28th March 2013.
The topic this year will be the intersections between qualitative and quantitative views of information.
There is no registration fee, and no fee for the refreshments, lunches, and the workshop dinner.
Bursaries covering participation expenses will be awarded on the basis of need and scientific merit.
Please send abstracts of approximately 1000 words to Mrs Penny Driscoll, <email@example.com>, no later than 1 February 2013.
A selection of the best papers will be submitted for publication in a peer-reviewed journal, TBA. Papers from the 4th workshop are forthcoming in Minds and Machines.
The Workshop is organised by the UNESCO Chair in Information and Computer Ethics
http://www.philosophyofinformation.net, in collaboration with the AHRC project ‘Understanding Information Quality Standards and their Challenges’ (2011-2013).
For more information about the format and previous participants, see the previous workshops in the series.
The Stanford Encyclopedia of Philosophy has just published an entry on information: http://plato.stanford.edu/entries/information/
Currently making my way through my copy of James Gleick’s “The Information: A History, a Theory, a Flood”. In chapter 9 (Entropy and its Demons) there is a brief explanatory discussion of a distinction between the use of ‘information’ in Claude Shannon’s sense and the use of ‘information’ in Norbert Wiener’s sense. This is a distinction that I had to ascertain myself when looking at Wiener’s cybernetics for a reading group I was participating in. As Gleick writes:
… a particular message reduces the entropy in the ensemble of possible messages – in terms of dynamical systems, a phase space.
That was how Shannon saw it. Wiener’s version was slightly different. It was fitting – for a word that began by meaning the opposite of itself – that these colleagues and rivals placed opposite signs on their formulation of entropy. Where Shannon identified information with entropy, Wiener said it was negative entropy. Wiener was saying that information meant order, but an orderly thing does not necessarily embody much information. Shannon himself pointed out their difference and minimized it, calling it a sort of “mathematical pun.” They get the same numerical answers, he noted:
I consider how much information is produced when a choice is made from a set – the larger the set the more information. You consider the larger uncertainty in the case of a larger set to mean less knowledge of the situation and hence less information.
So there is ‘information’ as a measure of entropy and there is ‘information’ in the sense of order. Given the idea of entropy as the extent to which a system is disorganized, the former conception of information (entropy) is inversely related to the latter conception of information (negentropy).
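Shannon's remark about set size can be made concrete with a short sketch (the code and function names below are my own illustration, not from Gleick's book): a choice from a uniform ensemble of N messages produces log2(N) bits of Shannon information, so the larger the set, the more information, while Wiener's negentropy is the same quantity with the sign flipped.

```python
import math

def shannon_entropy(probabilities):
    """Entropy H = -sum(p * log2 p), in bits; zero-probability terms contribute nothing."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A uniform ensemble of N messages has entropy log2(N): the larger the set
# of possible messages, the more Shannon information a choice produces.
for n in (2, 8, 32):
    h = shannon_entropy([1 / n] * n)
    print(f"{n} equiprobable messages -> {h:.0f} bits")

# Wiener's 'information' is the negative of this quantity (negentropy):
# high entropy means low order, and hence low information in Wiener's sense.
```

As Shannon's "mathematical pun" remark suggests, the two conceptions agree numerically and differ only in sign and interpretation.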
Much to my contentment, I have finished my thesis and submitted it for examination.
I have just revised my CV: view CV.
Interesting fact about the word ‘information’: according to this list, ‘information’ is the 315th most used word in the 450-million-word Corpus of Contemporary American English, appearing 127,331 times.
Details of my upcoming thesis completion seminar:
Title: A Framework for Semantic Information
Date, Time: 24/7/2012, 5:15pm
Location: Old Quad Common Room, Melbourne University
Abstract: In this talk I present a summary of my thesis research, which develops an account of semantic information. After an introductory overview of information, I shall outline the main points and results of the following chapters:
- Quantifying Semantic Information
- Agent-Relative Informativeness
- Environmental Information and Information Flow
- Information and Knowledge
Jonathan Cohen and Aaron Meskin (C&M) published a paper several years ago titled ‘An Objective Counterfactual Theory of Information’. Here is its abstract:
Philosophers have appealed to information (as understood by [Shannon, 1948] and introduced to philosophers largely by [Dretske, 1981]) in a wide variety of contexts; information has been proffered in the service of understanding knowledge, justification, and mental content, inter alia. While information has been put to diverse philosophical uses, there has been much less diversity in the understanding of information itself. In this paper we’ll offer a novel theory of information that differs from traditional accounts in two main (and orthogonal) respects: (i) it explains information in terms of counterfactuals rather than conditional probabilities, and (ii) it does not make essential reference to doxastic states of subjects, and consequently allows for the sort of objective, reductive explanations of notions in epistemology and philosophy of mind that many have wanted from an account of information.
We’ll first present our counterfactual account of information (1), and show how it sidesteps a problem that has been raised for its traditional, probabilistic competitors (2). Next we’ll compare the counterfactual account against that proposed by Dretske (3), highlighting the differences between the two. After that, we’ll turn to questions about objectivity: we’ll bring out a conflict between the essentially doxastic character of traditional theories of information and the reductive purposes philosophers have had in mind in appealing to information (4), and we’ll show how the account of (1) can be formulated in non-doxastic terms. Finally, we’ll consider objections against the proposed account (5). Ultimately, we’ll suggest, the objective counterfactual account of information should be taken as a serious contender to more traditional rivals.
The central definition of information that they provide is:
(S*) … x’s being F carries information about y’s being G if and only if the counterfactual conditional “if y were not G, then x would not have been F” is non-vacuously true.
Also, in a footnote C&M mention that x’s being F and y’s being G are construed as actual events, so one event carries information about a second only if they are actual.
As outlined in this document, some exploration reveals that, using the standard logic of counterfactuals, C&M’s definition gives results that disagree with what seem to be fairly straightforward properties of information flow.
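To make this kind of exploration tangible, here is a toy sketch of evaluating (S*) under a simple Lewis-style semantics for counterfactuals. Everything here is my own illustrative construction (the world model, the similarity ordering, and the doorbell example), not anything from C&M's paper or the linked document: worlds assign truth values to the two events, similarity is given by position in a list (index 0 is the actual world, earlier is more similar), and a counterfactual is non-vacuously true when some antecedent-world exists and the closest such worlds all satisfy the consequent.

```python
# Toy model: a world is a dict assigning truth values to 'Fx' (x's being F)
# and 'Gy' (y's being G). worlds[0] is the actual world; lower index means
# more similar to actuality. This similarity ordering is an assumption made
# for illustration only.

def closest(worlds, antecedent):
    """Return the most similar world(s) where the antecedent holds (simplest
    case: a unique closest antecedent-world), or [] if there is none."""
    for w in worlds:
        if antecedent(w):
            return [w]
    return []

def counterfactual(worlds, antecedent, consequent):
    """Non-vacuously true iff some antecedent-world exists and the closest
    antecedent-world(s) all satisfy the consequent."""
    sel = closest(worlds, antecedent)
    return bool(sel) and all(consequent(w) for w in sel)

def carries_information(worlds, x_is_F, y_is_G):
    """C&M's (S*): x's being F carries information about y's being G iff
    'if y were not G, then x would not have been F' is non-vacuously true.
    Per C&M's footnote, both events must be actual (true at worlds[0])."""
    actual = worlds[0]
    if not (x_is_F(actual) and y_is_G(actual)):
        return False
    return counterfactual(worlds,
                          lambda w: not y_is_G(w),
                          lambda w: not x_is_F(w))

# Example: Fx = 'the bell rings', Gy = 'the button is pressed'.
worlds = [
    {"Fx": True,  "Gy": True},   # actual: pressed and ringing
    {"Fx": False, "Gy": False},  # closest non-press world: no ring
    {"Fx": True,  "Gy": False},  # remoter world: rings anyway (faulty wiring)
]
print(carries_information(worlds, lambda w: w["Fx"], lambda w: w["Gy"]))  # True
```

On this model the ringing carries information about the press, because at the closest world where the button is not pressed the bell does not ring; if the closest non-press world were instead one where the bell rings anyway, (S*) would fail.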
What is a suitable name for the third property here?