Archive for February, 2010

Transplication as Implication

Wednesday, February 24th, 2010

In his contribution on partial logic to the Handbook of Philosophical Logic, Stephen Blamey adds to the standard three-valued partial logic, Strong Kleene logic, a ‘value gap introducing’ connective named ‘transplication’. Blamey suggests the possibility of reading the transplication connective as a type of conditional. I was interested to see how transplication fares as a conditional by testing it against a list of inferences concerning conditionals.
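A quick sketch in Python may help fix ideas. On the reading I am working with here (an assumption on my part: the transplication A/B takes the value of A when B is true, and is otherwise undefined), the connective's full truth table can be generated as follows:

```python
# Three truth values of Strong Kleene logic:
# True, False, and None standing in for the value gap.
T, F, N = True, False, None

def transplication(a, b):
    """Transplication A/B on the reading assumed here:
    the value of A when B is true; undefined otherwise."""
    return a if b is T else N

# Print the full 3x3 truth table.
for a in (T, F, N):
    for b in (T, F, N):
        print(f"A={a!s:5} B={b!s:5} A/B={transplication(a, b)}")
```

Note how the connective introduces gaps rather than merely tolerating them: even with both arguments defined, a false B leaves A/B undefined.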

Here is the investigation: Transplication as Implication

I have not come across much material concerning transplication. Does anyone else have any other references or ideas?

Bar-Hillel and Carnap’s Account of Semantic Information

Friday, February 19th, 2010

Around the middle of the 20th century, Rudolf Carnap and Yehoshua Bar-Hillel gave a seminal account of semantic information which falls under the probabilistic approach. Their idea was to measure the semantic information of a statement within a given language in terms of an a priori logical probability space. The general idea is based on the Inverse Relationship Principle, according to which the informational value of a proposition is inversely related to the probability of that proposition.
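A minimal sketch of the idea, using the standard measures cont(s) = 1 − p(s) and inf(s) = −log₂ p(s) over a toy two-atom language, with a uniform distribution over state descriptions assumed purely for illustration:

```python
import math
from itertools import product

# Toy language with two atomic sentences; each of the 4 state
# descriptions gets equal a priori probability (an assumption
# made here for illustration).
atoms = ('p', 'q')
states = list(product([True, False], repeat=len(atoms)))

def prob(sentence):
    """A priori probability: the fraction of state descriptions in which
    the sentence (a function from valuations to bool) is true."""
    return sum(sentence(dict(zip(atoms, s))) for s in states) / len(states)

def cont(sentence):
    """Content measure: cont(s) = 1 - p(s)."""
    return 1 - prob(sentence)

def inf(sentence):
    """Information measure: inf(s) = -log2 p(s)."""
    return -math.log2(prob(sentence))

p_and_q = lambda v: v['p'] and v['q']
p_or_q = lambda v: v['p'] or v['q']

# The less probable conjunction carries more information than the disjunction.
print(cont(p_and_q), inf(p_and_q))  # 0.75 2.0
print(cont(p_or_q), inf(p_or_q))    # 0.25 ~0.415
```

This makes the Inverse Relationship Principle vivid: the conjunction rules out more state descriptions than the disjunction, so it is less probable and more informative.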

Here is a brief overview of their work: Bar-Hillel and Carnap’s Account of Semantic Information

Properties of Information Flow

Monday, February 15th, 2010

What are the properties of information flow? To establish the terminology with which I will pose the properties to consider, I start with the most basic of properties: if A carries the information that B, then A carries the information that B.


A \rightarrow B \vdash A \rightarrow B

Here are two other straightforward properties of information:


((A \rightarrow B) \wedge (A \rightarrow C)) \vdash (A \rightarrow (B \wedge C))


((A \rightarrow C) \wedge (B \rightarrow C)) \vdash ((A \vee B) \rightarrow C)
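As a sanity check, suppose for the moment that we read → as plain material implication (only a crude proxy for informational carrying, and an assumption of this sketch, not a claim about what the arrow should mean). Both sequents can then be verified by brute force over classical assignments:

```python
from itertools import product

def implies(x, y):
    # Material implication, standing in for "carries the information that".
    return (not x) or y

def valid(claim):
    """Check a premise-to-conclusion claim over all assignments to A, B, C."""
    return all(claim(a, b, c) for a, b, c in product([True, False], repeat=3))

# ((A -> B) and (A -> C))  entails  (A -> (B and C))
conjunction_of_consequents = valid(
    lambda a, b, c: not (implies(a, b) and implies(a, c))
                    or implies(a, b and c))

# ((A -> C) and (B -> C))  entails  ((A or B) -> C)
disjunction_of_antecedents = valid(
    lambda a, b, c: not (implies(a, c) and implies(b, c))
                    or implies(a or b, c))

print(conjunction_of_consequents, disjunction_of_antecedents)  # True True
```

Of course, this only shows the properties hold on the material reading; whether a properly informational arrow should validate them is exactly the question at issue.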

What further properties are there to consider and which should be accepted and which should be rejected in developing an account of information flow?

(more…)

Digital doomsday: the end of knowledge

Sunday, February 14th, 2010

An interesting article I found in a copy of New Scientist.

We are storing our knowledge in ever more fragile and ephemeral forms. If anything goes wrong, we could lose much of it.

http://www.newscientist.com/article/mg20527451.300-digital-doomsday-the-end-of-knowledge.html

LaTeX, Logic and WordPress

Thursday, February 11th, 2010

I am currently writing up some notes in \LaTeX, including jotting down a sequent system. I have typed up a sequent system in \LaTeX before, but could not remember how I did it and cannot find the .tex source for it. So I headed down to the excellent \LaTeX for Logicians to refresh my memory.

How am I typesetting the \LaTeX logo in this post, you ask? Well, whilst searching for \LaTeX and logic, I also discovered the possibility of including \LaTeX in WordPress posts. Here is an article discussing this: http://www.logicnest.com/using-latex-in-wordpress. Great stuff!

Review of The Logic of Being Informed

Tuesday, February 9th, 2010

I have put together a review of Luciano Floridi’s The Logic of Being Informed.

It can be downloaded here.

Also, Patrick Allo has an upcoming paper titled The Logic of ‘Being Informed’ Revisited and Revised. This paper takes a closer look at Floridi’s formal analysis, provides a pure and an applied semantics for the logic of being informed, and tries to find out to what extent the formal analysis can contribute to an information-based epistemology.

Dretske’s Account of Knowledge Against Some Epistemological Cases

Tuesday, February 2nd, 2010

Knowledge and luck do not mix. Our intuitions and definitions of knowledge suggest and require the absence of luck in cases of knowledge. Edmund Gettier’s landmark 1963 paper ‘Is Justified True Belief Knowledge?’ not only prompted a revision in epistemological theorising, but gave us the terms Gettier-examples and the related Gettier-luck. Gettier provided his examples in order to refute the account of knowledge which defines it as justified true belief (JTB). Here is one of the two examples he provided. It is supposed that Smith has strong evidence for the following proposition:


Jones owns a Ford (A)

Smith has another friend, Brown, of whose whereabouts he is totally ignorant. Smith randomly selects the names of three cities and uses them to construct the following three propositions:

  • Either Jones owns a Ford, or Brown is in Boston. (B)
  • Either Jones owns a Ford, or Brown is in Barcelona. (C)
  • Either Jones owns a Ford, or Brown is in Brest-Litovsk. (D)

Now, each of B, C and D is entailed by A, so Smith comes to accept them. Smith has therefore correctly inferred B, C and D from a proposition A for which he has strong justification. Hence Smith is justified in holding the true beliefs B, C and D. Now imagine that, firstly, Jones does not own a Ford but is at present driving a rented car, and secondly, by sheer coincidence and unknown to Smith, Barcelona happens to be where Brown is. So even though Smith clearly does not know that C is true, it is true, he believes it, and he is justified in believing it.
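For what it is worth, the inference Smith relies on is just disjunction introduction: A entails (A or X) for any X whatsoever. A brute-force check over classical truth assignments confirms it:

```python
from itertools import product

def entails_disjunction():
    """A entails (A or X): whenever A is true, so is (A or X), for any X."""
    return all((not a) or (a or x)
               for a, x in product([True, False], repeat=2))

print(entails_disjunction())  # True
```

The logic is impeccable; the trouble Gettier exposes lies not in the inference but in what justification it transmits.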

This example, along with the other example in the paper, sufficed to show that truth, belief and justification are not jointly sufficient for knowledge. In both of Gettier’s examples, the justified true belief comes about as the result of entailment from a justified false belief; in the example given, the justified false belief that “Jones owns a Ford”. This led some early responses to Gettier to conclude that the definition of knowledge could easily be adjusted, so that knowledge is justified true belief that depends on no false premises. This “no false premises” solution did not settle the matter, however, as more general Gettier-style problems were then constructed in which the justified true belief does not result from a chain of reasoning from a justified false belief.

(more…)