One issue with Bar-Hillel and Carnap’s account of semantic information is that it assigns maximal informativeness to contradictions, an issue that has been termed the Bar-Hillel-Carnap Paradox. What happens if we replace the underlying classical logic and probability with the paraconsistent LP (Logic of Paradox)? Does it resolve the Bar-Hillel-Carnap Paradox? Here is an investigation into the matter:
Paraconsistent Semantic Information
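To see what is at stake, here is a minimal sketch of LP valuations, not taken from the post itself: the encoding of the three values and the function names are my own. In classical logic a contradiction holds in no model, so it gets probability 0 and hence maximal informativeness under the Inverse Relationship Principle; in LP the valuation assigning the "both" value designates the contradiction, so its probability need not be 0.

```python
# A sketch of LP (Logic of Paradox): three values ordered F < B < T,
# with designated values {T, B}. Encoding and names are mine.

F, B, T = 0, 1, 2          # false, both, true
DESIGNATED = {B, T}

def neg(a):
    return 2 - a           # swaps T and F, fixes B

def conj(a, b):
    return min(a, b)       # conjunction is meet in the order F < B < T

# Classically, p & ~p is unsatisfiable; in LP, the valuation p = B
# gives the contradiction a designated value:
for p in (F, B, T):
    value = conj(p, neg(p))
    print(p, value, value in DESIGNATED)
# p = B yields B, which is designated
```

Since an LP-based probability measure can assign contradictions non-zero probability, the inverse relationship need no longer force them to maximal informativeness; whether that genuinely dissolves the paradox is what the linked post examines.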
In his contribution on partial logic to the Handbook of Philosophical Logic, Stephen Blamey adds to the standard 3-valued partial logic, Strong Kleene logic, a 'value gap introducing' connective named 'transplication'. Blamey suggests that the transplication connective can be read as a type of conditional. I was interested to see how it fares as a conditional, so I tested it against a list of inferences concerning conditionals.
Here is the investigation: Transplication as Implication
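For readers unfamiliar with the connective, here is a small sketch over the Strong Kleene values, with the gap encoded as `None`. I am reading the transplication of B by A as "B given A": it takes the value of B when A is true and is undefined (gap) otherwise. That reading, and the notation, are my gloss on Blamey, so treat the table as illustrative rather than authoritative.

```python
# Strong Kleene values with None as the truth-value gap.
TRUE, FALSE, GAP = True, False, None

def transplicate(a, b):
    """Transplication read as 'b given a': value of b if a is true,
    otherwise a value gap (this is the gap-introducing behaviour)."""
    return b if a is TRUE else GAP

# For comparison, the Strong Kleene material conditional ~A v B:
def sk_neg(a):
    return GAP if a is GAP else not a

def sk_disj(a, b):
    if a is TRUE or b is TRUE:
        return TRUE
    if a is FALSE and b is FALSE:
        return FALSE
    return GAP

for a in (TRUE, FALSE, GAP):
    for b in (TRUE, FALSE, GAP):
        print(a, b, transplicate(a, b), sk_disj(sk_neg(a), b))
```

The comparison column shows where transplication diverges from the material reading: a false antecedent gives a gap rather than a truth.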
I have not come across much material concerning transplication. Does anyone else have any other references or ideas?
Around the middle of the 20th century Rudolf Carnap and Yehoshua Bar-Hillel gave a seminal account of semantic information which falls under the probabilistic approach. Their idea was to measure the semantic information of a statement within a given language in terms of an a priori logical probability space. The general idea is based on the Inverse Relationship Principle, according to which the informational value of a proposition is inversely related to its probability: the less likely a statement is, the more informative it is.
Here is a brief overview of their work: Bar-Hillel and Carnap’s Account of Semantic Information
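Their two measures can be reconstructed in a toy two-atom language. Assuming a uniform measure m over the four state descriptions (their simplest choice), content is cont(s) = 1 − m(s) and informativeness is inf(s) = −log₂ m(s); the function names below are mine.

```python
from itertools import product
from math import log2

# State descriptions for a language with two atoms, p and q.
states = list(product([True, False], repeat=2))

def m(prop):
    """Logical probability: fraction of state descriptions where prop holds."""
    return sum(1 for s in states if prop(*s)) / len(states)

def cont(prop):
    return 1 - m(prop)          # Bar-Hillel/Carnap content measure

def inf(prop):
    prob = m(prop)
    return float('inf') if prob == 0 else -log2(prob)

p_or_q = lambda p, q: p or q
p_and_q = lambda p, q: p and q
contradiction = lambda p, q: p and not p

print(cont(p_or_q), inf(p_or_q))      # 0.25 and about 0.415
print(cont(p_and_q), inf(p_and_q))    # 0.75 and 2.0
print(cont(contradiction), inf(contradiction))
# cont = 1.0, inf = infinity: the Bar-Hillel-Carnap Paradox in miniature
```

The last line makes the paradox vivid: a contradiction, true in no state description, comes out as maximally contentful and infinitely informative.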
What are the properties of information flow? To establish the terminology with which I will pose properties to consider, I start off with the most basic of properties. If A carries the information that B, then A carries the information that B.
Here are two other straightforward properties of information:
What further properties are there to consider and which should be accepted and which should be rejected in developing an account of information flow?
Continue reading “Properties of Information Flow”
An interesting article I found in a copy of New Scientist.
We are storing our knowledge in ever more fragile and ephemeral forms. If anything goes wrong, we could lose much of it.
I have put together a review of Luciano Floridi’s Logic of Being Informed.
It can be downloaded here.
Also, Patrick Allo has an upcoming paper titled The Logic of ‘Being Informed’ Revisited and Revised. This paper takes a closer look at Floridi’s formal analysis, provides a pure and an applied semantics for the logic of being informed, and tries to find out to what extent the formal analysis can contribute to an information-based epistemology.
Knowledge and luck do not mix. Our intuitions and definitions of knowledge suggest and require the absence of luck in cases of knowledge. Edmund Gettier’s landmark 1963 paper ‘Is Justified True Belief Knowledge?’ not only prompted a revision in epistemological theorising, but gave us the terms Gettier-examples and the related Gettier-luck. Gettier provided his examples in order to refute the account of knowledge which defines it as justified true belief (JTB). Here is one of the two examples he provided. It is supposed that Smith has strong evidence for the following proposition:
Jones owns a Ford (A)
Smith has another friend, Brown, of whose whereabouts he is totally ignorant. Smith randomly selects the names of three cities and uses them to construct the following three propositions:
- Either Jones owns a Ford, or Brown is in Boston. (B)
- Either Jones owns a Ford, or Brown is in Barcelona. (C)
- Either Jones owns a Ford, or Brown is in Brest-Litovsk. (D)
Now, each of B, C and D is entailed by A, so Smith comes to accept them. Smith has therefore correctly inferred B, C and D from a proposition A for which he has strong justification, and hence is justified in believing each of B, C and D. Now suppose that, first, Jones does not own a Ford but is at present driving a rented car; and second, by sheer coincidence and unknown to Smith, Brown happens to be in Barcelona. So even though Smith clearly does not know that C is true, C is true, he believes it, and he is justified in believing it.
This example, along with the other example in the paper, sufficed to show that truth, belief and justification are not jointly sufficient for knowledge. In both of Gettier’s actual examples, the justified true belief came about as the result of entailment from a justified false belief; in the given example, the justified false belief that “Jones owns a Ford”. This led some early responses to Gettier to conclude that the definition of knowledge could easily be adjusted, so that knowledge is justified true belief that depends on no false premises. This “no false premises” solution did not settle the matter, however, as more general Gettier-style problems were then constructed in which the justified true belief does not result from a chain of reasoning through a justified false belief.
Continue reading “Dretske’s Account of Knowledge Against Some Epistemological Cases”
The alethic nature of semantic information has been, and continues to be, a point of contention. At the very least, semantic information is understood as semantic content; that is, meaningful, well-formed data. Defenders of the alethic neutrality of semantic information argue that semantic content already qualifies as information, regardless of whether it is true, false or has no alethic value at all. Opponents hold that not just any semantic content qualifies as information: to qualify, semantic content must also be true; false information or misinformation is not genuinely a kind of information.
Continue reading “Information and Truth”