Can Misinformation Be Informative?

Can misinformation be informative, and if so, how? Here are some thoughts on the matter, outlining three ways in which misinformation can be considered informative. The first two are simply standard facts about truth, falsity and logic restated in informational parlance. The third is a novel point.


Information and Information Flow

For a decent introduction to theories of information and information flow, with a focus on semantic information and approaches from within philosophy, I recommend Information and Information Flow: An Introduction. Authored by Manuel Bremer and Daniel Cohnitz, it is based on a series of lectures they gave some years ago.

When I first started looking at information, I found this book very helpful and informative. One issue I have with it, though, is that it seems to have been put together a little too hastily, leaving it somewhere between a collection of lecture notes and a polished, complete book. In particular, it suffers from some inadequate explanations and some awkward ordering of material.

Paraconsistent Semantic Information

One problem with Bar-Hillel and Carnap’s account of semantic information is that it assigns maximal informativeness to contradictions, a result known as the Bar-Hillel-Carnap Paradox. What happens if we replace the underlying classical logic and probability with the paraconsistent LP (Logic of Paradox)? Does this resolve the paradox? Here is an investigation into the matter:

Paraconsistent Semantic Information
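To see why the choice of logic matters here, consider a small sketch of LP’s three-valued semantics (a toy illustration; the numeric encoding of the truth values is my own assumption):

```python
# Toy sketch of the Logic of Paradox (LP).
# Truth values: 1 = true, 0.5 = both true and false (a glut), 0 = false.
# The designated values (those counting as "true enough") are 1 and 0.5.
def neg(a):
    return 1 - a          # negation swaps true and false, fixes the glut

def conj(a, b):
    return min(a, b)      # conjunction takes the minimum value

def designated(a):
    return a >= 0.5

# Classically, a contradiction A & not-A is satisfied by no valuation,
# so its logical probability is 0 and its informativeness is maximal.
# In LP, assigning A the glut value makes the contradiction designated:
print(designated(conj(0.5, neg(0.5))))  # True: satisfiable in LP
print(designated(conj(1, neg(1))))      # False: classical valuations still rule it out
```

Since contradictions are satisfiable in LP, they need not receive probability zero, which is what opens the door to a different verdict on their informativeness.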

Information: A Very Short Introduction is Available

Information: A Very Short Introduction, Luciano Floridi’s contribution to the ‘Very Short Introductions’ series, is now available for purchase:

Bar-Hillel and Carnap’s Account of Semantic Information

Around the middle of the 20th century, Rudolf Carnap and Yehoshua Bar-Hillel gave a seminal account of semantic information which falls under the probabilistic approach. Their idea was to measure the semantic information of a statement within a given language in terms of an a priori logical probability space. The account is based on the Inverse Relationship Principle, according to which the information value of a proposition is inversely related to the probability of that proposition.

Here is a brief overview of their work: Bar-Hillel and Carnap’s Account of Semantic Information
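Their two standard measures can be sketched numerically (a minimal illustration; the probability values below are made up for the example):

```python
import math

# Bar-Hillel and Carnap define two measures of the semantic information
# of a statement s with a priori logical probability p(s):
def cont(p):
    """Content measure: cont(s) = 1 - p(s)."""
    return 1 - p

def inf(p):
    """Information measure: inf(s) = -log2 p(s)."""
    return -math.log2(p) if p > 0 else math.inf

# Inverse Relationship Principle: the less probable, the more informative.
print(inf(0.5))    # 1.0 bit for a statement with probability 1/2
print(inf(0.25))   # 2.0 bits for a less probable statement
print(cont(0.0))   # 1.0 -- a contradiction (p = 0) gets maximal content
```

Note how a contradiction, having probability zero, tops both measures; this is precisely the Bar-Hillel-Carnap Paradox mentioned above.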

Properties of Information Flow

What are the properties of information flow? To fix the notation with which I will state the properties under consideration, I start with the most basic property of all: if A carries the information that B, then A carries the information that B.

$A \rightarrow B \vdash A \rightarrow B$

Here are two other straightforward properties of information:

$((A \rightarrow B) \wedge (A \rightarrow C)) \vdash (A \rightarrow (B \wedge C))$

$((A \rightarrow C) \wedge (B \rightarrow C)) \vdash ((A \vee B) \rightarrow C)$
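As a sanity check, both properties hold in a simple possible-states model, where "A carries the information that B" is read as inclusion of A's state set in B's (the model, including the particular sets below, is just an illustrative assumption, not a settled account of information flow):

```python
# Each event is modeled as the set of states compatible with it.
def carries(a, b):
    return a <= b          # "a carries the information that b" as subset inclusion

def implies(p, q):
    return (not p) or q    # material implication on booleans

A = frozenset({1, 2})
B = frozenset({1, 2, 3})
C = frozenset({1, 2, 4})

# Conjunction of consequents: (A -> B) and (A -> C) entail A -> (B and C),
# with the conjunction of B and C modeled as intersection:
print(implies(carries(A, B) and carries(A, C), carries(A, B & C)))  # True

# Disjunction of antecedents: (A -> C) and (B -> C) entail (A or B) -> C,
# with the disjunction of A and B modeled as union:
print(implies(carries(A, C) and carries(B, C), carries(A | B, C)))  # True
```

In this model both entailments hold for any choice of state sets, since intersection is the greatest lower bound and union the least upper bound under inclusion.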

What further properties are there to consider, and which should be accepted or rejected in developing an account of information flow?


Review of The Logic of Being Informed

I have put together a review of Luciano Floridi’s The Logic of Being Informed.

It can be downloaded here.

Also, Patrick Allo has an upcoming paper titled The Logic of ‘Being Informed’ Revisited and Revised. This paper takes a closer look at Floridi’s formal analysis, provides a pure and an applied semantics for the logic of being informed, and examines to what extent the formal analysis can contribute to an information-based epistemology.