A Critical Analysis of Floridi’s Theory of Semantic Information

I have recently been reading ‘A Critical Analysis of Floridi’s Theory of Semantic Information’, Pieter Adriaans’ contribution to the special issue of Knowledge, Technology & Policy titled ‘Luciano Floridi’s Philosophy of Technology: Critical Reflections’. This article is indeed quite critical of Floridi’s Theory of Semantic Information.

Floridi responded in turn, and his replies to the contributions can be found in the piece ‘The Philosophy of Information as a Conceptual Framework’.

One point Floridi makes is that Adriaans’ proposal “belongs to an old-fashioned, perfectly respectable but also bankrupted tradition of attempting to squeeze semantics out of syntax”. Whilst there are connections between syntax and semantics worth investigating, I’m also not sure about the prospects of completely reducing semantics to syntax. One example given by Adriaans particularly caught my attention.

Modern information theory has much more powerful techniques to analyze meaning. Conditional probabilities can perfectly be used to analyze the validity of definitions for instance: if P(John is a bachelor | John is an unmarried male) = P(John is an unmarried male | John is a bachelor) = 1 this indicates that the messages ‘John is a bachelor’ and ‘John is an unmarried male’ in the context of this specific set of messages not only carry the same amount of information, but have the same meaning. Thus a formal treatment of semantic information in the context of classical information theory seems possible.

This just seems wrong. Although such a conditional probability requirement is a necessary condition for meaning equivalence, it surely is not a sufficient one. Any two messages that are true in exactly the same circumstances pass the test: ‘2 + 2 = 4’ and ‘all bachelors are unmarried’ each have conditional probability 1 given the other, yet they plainly differ in meaning. Such an approach would deliver a ridiculously coarse-grained account of meaning.
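To make the coarseness concrete, here is a small sketch of Adriaans’ criterion, not taken from either article: the “context” is an invented uniform distribution over possible situations, and the message names are purely illustrative. The criterion counts two messages as synonymous whenever each has conditional probability 1 given the other.

```python
from fractions import Fraction

# Toy context: a uniform distribution over possible situations (rows),
# each recording which messages hold there. All names are invented
# for illustration; nothing here comes from Adriaans' article.
worlds = [
    {"bachelor": True,  "unmarried_male": True,  "likes_golf": True},
    {"bachelor": True,  "unmarried_male": True,  "likes_golf": False},
    {"bachelor": False, "unmarried_male": False, "likes_golf": True},
    {"bachelor": False, "unmarried_male": False, "likes_golf": False},
]

def p(a, given=None):
    """P(a) or P(a | given) under the uniform distribution over worlds."""
    pool = [w for w in worlds if given is None or w[given]]
    return Fraction(sum(w[a] for w in pool), len(pool))

def equivalent(a, b):
    """Adriaans' criterion: P(a|b) = P(b|a) = 1."""
    return p(a, given=b) == 1 and p(b, given=a) == 1

print(equivalent("bachelor", "unmarried_male"))  # True: co-extensive here
print(equivalent("bachelor", "likes_golf"))      # False
```

The trouble is visible in the definition of `equivalent`: it only inspects which situations the messages are true in, so any two co-extensive messages come out “synonymous”, however different their intuitive meanings. The criterion tracks extension, not meaning.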

Anyway, I don’t see why theories of semantic information and an entropy-based notion of information need to ‘compete’. Adriaans concludes his article with the following point. Whilst information in the ordinary sense can be defined as meaningful and truthful, well-formed data, the rich toolbox of modern information theory can well be used to explain things like truthfulness and the notion of meaning. Consequently,

This makes the notion of semantic information from the perspective of information theory somewhat superfluous. One cannot forbid philosophers to study it, but it would be somewhat of a caricature to present such research under the ambitious label of ‘*the* philosophy of information’. The risk is that the real questions that are in the heart of philosophy of information get less attention than they deserve. Instead of studying semantic information, I urge students of philosophy of information to direct their endeavours to [he lists some open problems in information theory].

There are a few points to make about this, first amongst them Adriaans’ italicisation of ‘the’ in the above quote. Perhaps the tension would be alleviated if the research programme were renamed ‘A Philosophy of Information’. But anyway, whilst semantic information is a key component of the philosophy of information (PI) framework, the framework accommodates information in general. Whilst the sufficiency of information theory (i.e. the mathematical theory of communication) for PI has been argued against, its necessity has been argued for; information theory provides the scientific constraints for an interesting philosophy of information.

Also, consider the definition of PI: philosophy of information is the philosophical field concerned with

  1. the critical investigation of the conceptual nature and basic principles of information, including its dynamics, utilisation and sciences
  2. the elaboration and application of information-theoretic and computational methodologies to philosophical problems.

Perhaps Adriaans is concerned that semantic information is stealing limelight away from entropy-based notions of information under the philosophy of information banner. But as he mentions, these notions of information are known from fields such as physics, information theory and computer science. This fits in with point (1) in the above definition: such conceptions of information within other disciplines can be, and rightly have been, of philosophical interest.

Regarding point (2), surely philosophers have a right to apply relevant semantic conceptions of information to philosophical problems they are tackling. Bar-Hillel, Carnap and Hintikka were interested in developing a logical approach to semantic information theory. Hintikka and others are interested in logic and information. Dretske and other epistemologists are interested in developing information-based accounts of knowledge.

Therefore, despite Adriaans’ urgings, I would say that students of philosophy of information can legitimately pursue endeavours concerning philosophy-related semantic conceptions of information.

3 Responses to “A Critical Analysis of Floridi’s Theory of Semantic Information”

  1. Nick Byrd says:

    Thanks for this post. There is not enough about PI on the blogosphere (at least, not that I know of). I enjoy finding some here. If you know of other PI blogs, I hope you will point me in their direction. Thanks again!

  2. “Such an approach would deliver a ridiculously coarse-grained account of meaning.” I fear the author fundamentally misunderstands the representational power of probability distributions. Read the seminal papers by Ray Solomonoff, who describes the progress of human knowledge as updates on the universal distribution. Please try to familiarise yourself with the last 50 years of computer science research before you write these comments. I object to the ‘straw man’ created by Floridi and authors like the ones of this post that I want to reduce semantics to syntax. This is simply not true: my views on the matter are much more subtle and can be checked here: http://plato.stanford.edu/entries/information/