Grasping Semantic Information

Here is a recent talk given by Floridi on Semantic Information. Following is the talk’s abstract:

Semantic information is a very slippery topic. If we know the relevant codes, we patently have no problems in understanding sentences in natural languages, maps, formulae, road signs or other similar instances of well-formed and meaningful data. And yet, information scientists and philosophers have struggled to determine what exactly semantic information is and what it means for an agent to become informed semantically. In this paper, I shall discuss the “three pillars” of the philosophy of information: Shannon’s communication model, the covariance model and the inverse relationship principle. I shall argue that none of them captures our intuitions about semantic information and “becoming informed”. I shall then outline what I hope might be a more promising strategy.

Natural Information Without Truth

In Information Without Truth, the Veridicality Thesis for both natural (environmental) and non-natural (semantic) information is rejected.

The Veridicality Thesis for Natural Information (VTN) can be stated:


(VTN) If a signal s being F carries natural information about an object o being G, then o is G.

This resembles Dretske’s definition of information flow, and it entails that if an agent A receives a signal of s being F that carries natural information about an object o being G, then o is G.

Contrary to this, in Information Without Truth a Probability Raising Thesis for Natural Information (PRTN) is given, where the transmission of natural information involves nothing more than the truth of the following probabilistic claim:


(PRTN) If a signal s being F carries natural information about an object o being G, then P(o is G | s is F) > P(o is G | ¬(s is F))

It should be noted that they footnote this definition with the following:

One of us has argued that signal s being F can also carry (negative) natural information about o being G by lowering the probability that o is G.

So what would a general definition about carrying natural information look like? It seems to me that something like this would do the job:


If a signal s being F carries natural information about an object o being G, then P(o is G) ≠ P(o is G | s is F)
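To make the two conditions concrete, here is a minimal sketch (my own illustration, not from the post) that checks both the probability-raising condition (PRTN) and the more general inequality P(o is G) ≠ P(o is G | s is F) against a toy joint distribution; the probability values are made up purely for illustration.

```python
# Toy joint distribution over (F: signal s is F, G: object o is G).
# The numbers are invented for illustration only.
joint = {
    (True, True): 0.30,   # s is F and o is G
    (True, False): 0.10,  # s is F and o is not G
    (False, True): 0.15,  # s is not F and o is G
    (False, False): 0.45, # s is not F and o is not G
}

def p(pred):
    """Probability of the event picked out by pred over (F, G) pairs."""
    return sum(v for k, v in joint.items() if pred(k))

p_G = p(lambda k: k[1])                              # P(o is G) = 0.45
p_F = p(lambda k: k[0])                              # P(s is F) = 0.40
p_G_given_F = p(lambda k: k[0] and k[1]) / p_F       # P(G | F) = 0.75
p_G_given_notF = p(lambda k: not k[0] and k[1]) / (1 - p_F)  # P(G | not F) = 0.25

# PRTN: the signal raises the probability that o is G.
print(p_G_given_F > p_G_given_notF)   # True: 0.75 > 0.25

# General (raising-or-lowering) condition: P(G) != P(G | F).
print(p_G != p_G_given_F)             # True: 0.45 vs 0.75
```

On this toy distribution the signal raises the probability of o being G, so both conditions hold; swapping the joint probabilities so that P(G | F) dropped below P(G) would violate PRTN while still satisfying the general inequality, which is the point of the footnoted “negative natural information” case.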

New Book on The Philosophy of Information

http://ukcatalogue.oup.com/product/9780199232383.do

Luciano Floridi presents a book that will set the agenda for the philosophy of information. PI is the philosophical field concerned with (1) the critical investigation of the conceptual nature and basic principles of information, including its dynamics, utilisation, and sciences, and (2) the elaboration and application of information-theoretic and computational methodologies to philosophical problems. This book lays down, for the first time, the conceptual foundations for this new area of research. It does so systematically, by pursuing three goals. Its metatheoretical goal is to describe what the philosophy of information is, its problems, approaches, and methods. Its introductory goal is to help the reader to gain a better grasp of the complex and multifarious nature of the various concepts and phenomena related to information. Its analytic goal is to answer several key theoretical questions of great philosophical interest, arising from the investigation of semantic information.

Some New Links

Have just gone on a Google search and found some notable links.

I picked up Keith Devlin’s Logic and Information last year but have yet to read it; I’m planning to at least skim through it in the near future. In the meantime, I found this course on Logic and Information given by Devlin. Course material is available for download and seems to be quite good.

Also, this link on Information drew my attention. Going by the Stanford Encyclopedia of Philosophy archives, I’m not sure whether this article ever made it in as a published entry, but it seems to have been a candidate for inclusion in the SEP quite some time ago.

‘On Quantifying Semantic Information’ draft

Download: On Quantifying Semantic Information

Abstract: The purpose of this paper is to examine existing methods of semantic information quantification and suggest some alternatives. It begins with an outline of Bar-Hillel and Carnap’s seminal account of semantic information before going on to look at Floridi’s quantitative theory of strongly semantic information. The latter then serves to initiate an in-depth investigation into the idea of utilising the notion of truthlikeness to quantify semantic information. Firstly, a couple of approaches to measuring truthlikeness are drawn from the literature and explored, with a focus on their applicability to semantic information quantification. Secondly, a similar but new approach to measure truthlikeness/information is presented.

False Information

Luciano Floridi holds a truth-based definition of semantic information (the veridicality thesis): for X to qualify as an instance of semantic information, it must be true, meaningful, well-formed data. False information (i.e. misinformation) is not actually information. For more on this see

So what to make of the term ‘false information’? Well, consider the following two terms:

  1. false proposition
  2. false policeman
