Quantifying Information to Quantifying Beliefs

In the previous post, I showed how a truthlikeness-based measure of semantic information can be applied to give a basic way of quantitatively measuring knowledge. This advantage sets the truthlikeness approach apart from inverse probabilistic approaches to quantifying semantic information.

In looking for a way to do something similar for quantitatively measuring beliefs, it occurred to me that things run the other way around: here it is the inverse probabilistic approach that is to be applied.


Quantifying Information to Quantifying Knowledge

Part of my current research has focused on the quantification of semantic information, that is, on ways to measure the semantic information yield of logical statements. Instead of the somewhat standard Bar-Hillel/Carnap/Hintikka inverse probabilistic approach, I have opted to quantify semantic information using the notion of truthlikeness. The former is associated with a Theory of Weakly Semantic Information (TWSI), 'weakly' because truth values play no role in it. The latter is associated with a Theory of Strongly Semantic Information (TSSI), according to which information encapsulates truth. TSSI is associated more generally with the veridicality thesis: that semantic information is meaningful, well-formed data that is also true. See On Quantifying Semantic Information for more on this.
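
For orientation, the contrast can be put roughly as follows (a sketch only; the notation is mine, not a fixed convention). On the TWSI side, the classical Bar-Hillel/Carnap measures are inverse probabilistic and insensitive to truth values; on the TSSI side, a Tichy/Oddie-style truthlikeness measure depends on the actual world:

```latex
% TWSI (inverse probabilistic): truth values play no role
\mathrm{cont}(A) = 1 - P(A), \qquad \mathrm{inf}(A) = \log_2 \frac{1}{P(A)}

% TSSI (truthlikeness-based), Tichy/Oddie style: [A] is the set of worlds
% satisfying A and \delta(w, w^{*}) the normalised distance of world w
% from the actual world w^{*}
\mathrm{Tr}(A) = 1 - \frac{1}{|[A]|} \sum_{w \in [A]} \delta(w, w^{*})
```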


Agent-Relative Informativeness: Combining truthlikeness semantic information measures with belief revision

Here is the extended abstract for a paper I am trying to finish:

In recent work [1], a quantitative account of semantic information based on truthlikeness measures was proposed: statement A yields more information than statement B when A contains more truth, or is closer to the whole truth, than B. Given a way to measure the information yield of a statement A, an important distinction to make is that between the information yield of A and its informativeness relative to a particular agent. As the simplest of examples, take three true propositions p1, p2 and p3. Although the statement p1 & p2 has a greater quantitative information measure than p3, given an agent that already has p1 & p2 in their database, the statement p3 is going to be more informative than p1 & p2 for that agent.
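
To make the example concrete, here is a minimal sketch in Python (my own illustration, not code from the paper). It assumes a Tichy/Oddie-style average-distance measure over a language with three atoms, all true in the actual world, and proxies agent-relative informativeness by the gain in measure when the agent's database is expanded by the input:

```python
from itertools import product

ATOMS = 3                      # atomic propositions p1, p2, p3
ACTUAL = (True, True, True)    # all three propositions are true

def worlds(statement):
    """All valuations (w1, w2, w3) that satisfy the statement."""
    return [w for w in product([True, False], repeat=ATOMS) if statement(w)]

def distance(w):
    """Normalised Hamming distance of a world from the actual world."""
    return sum(a != b for a, b in zip(w, ACTUAL)) / ATOMS

def truthlikeness(statement):
    """Tichy/Oddie-style measure: 1 minus the average distance of the
    statement's worlds from the actual world."""
    ws = worlds(statement)
    return 1 - sum(distance(w) for w in ws) / len(ws)

p1_and_p2 = lambda w: w[0] and w[1]
p3        = lambda w: w[2]

print(truthlikeness(p1_and_p2))   # 0.833...  (the greater information measure)
print(truthlikeness(p3))          # 0.666...

# Agent-relative informativeness, here proxied as the gain in measure when
# the agent's database K (= p1 & p2) is expanded by the input statement.
K = p1_and_p2
def gain(statement):
    expanded = lambda w: K(w) and statement(w)
    return truthlikeness(expanded) - truthlikeness(K)

print(gain(p1_and_p2))   # 0.0      -- already in the database
print(gain(p3))          # 0.166... -- more informative for this agent
```

On this toy measure p1 & p2 scores 5/6 and p3 scores 2/3, yet relative to the agent's database the gain from p3 is 1/6 while the gain from p1 & p2 is zero.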

In this paper the idea of agent-relative informativeness is explored. How informative some input statement is for an agent will depend not only on (1) its information measure, but also on (2) what content the agent already has and what they do with the input if they accept it. In order to deal with (2), some framework for belief revision is needed. Thus agent-relative informativeness here involves a combination of truthlikeness/information measures and belief revision. As it happens, within the last few years there has been some interest in investigating the relationship between the truthlikeness (verisimilitude) and belief revision programs [2, 3]. Continuing on from [1], the approaches to truthlikeness/information focused on in this paper are the Tichy/Oddie and Value Aggregate methods. The belief revision framework employed is predominantly the AGM one, though some alternatives are considered.
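
As a rough sketch of the kind of combination intended (again my own illustration, under simplifying assumptions): represent a belief state by the set of worlds it allows, use a distance-based (Dalal-style) operator as one concrete stand-in for an AGM revision, and track the truthlikeness measure of the belief state before and after the operation:

```python
from itertools import product

ATOMS = 3                      # atomic propositions p1, p2, p3
ACTUAL = (True, True, True)    # the actual world: all three are true
ALL_WORLDS = list(product([True, False], repeat=ATOMS))

def models(statement):
    """The set of worlds satisfying the statement."""
    return {w for w in ALL_WORLDS if statement(w)}

def hamming(u, v):
    return sum(a != b for a, b in zip(u, v))

def truthlikeness(state):
    """Measure of a belief state given as a non-empty set of worlds:
    1 minus the average normalised distance from the actual world."""
    return 1 - sum(hamming(w, ACTUAL) for w in state) / (len(state) * ATOMS)

def revise(state, statement):
    """Distance-based (Dalal-style) revision: if the input is consistent
    with the state this is just expansion; otherwise keep the input's
    worlds that are closest to the state."""
    inp = models(statement)
    if state & inp:
        return state & inp
    closest = lambda w: min(hamming(w, k) for k in state)
    best = min(closest(w) for w in inp)
    return {w for w in inp if closest(w) == best}

# The agent believes p1 & ~p2 (half right); the input is the true statement p2.
K = models(lambda w: w[0] and not w[1])
print(truthlikeness(K))                          # 0.5
print(truthlikeness(revise(K, lambda w: w[1])))  # 0.833... -- revision moves
                                                 # the agent closer to the truth
```

Any other revision operator or measure could be plugged in; the point is only that the informativeness of an input for an agent is read off the change it induces in that agent's belief state.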

Apart from a general outline of the ideas associated with agent-relative informativeness, two further contributions of this paper are:

  • Some results on the behaviour of the belief revision operations of expansion, revision and contraction with regard to truthlikeness/information measures.
  • Suggestion of some formal frameworks for dealing with conflicting sources of input, at least some of which, by definition, are going to be providing misinformation. This includes (1) combining screened belief revision with estimated information measurement (a rough sketch of this follows the list) and (2) the development of a paraconsistent approach to belief revision.
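
For (1), here is a very rough, purely hypothetical sketch of the sort of screen that might be used: estimate the information measure of an input as its expected truthlikeness under the agent's probability distribution over candidate actual worlds, and admit the input (for brevity, by simple expansion rather than full revision) only if the estimate clears a threshold. The distribution, the threshold and the acceptance rule below are all illustrative assumptions, not proposals from the paper:

```python
from itertools import product

ATOMS = 3
ALL_WORLDS = list(product([True, False], repeat=ATOMS))

def models(statement):
    return [w for w in ALL_WORLDS if statement(w)]

def tr(statement, actual):
    """Tichy/Oddie-style truthlikeness, computed as if `actual` were the actual world."""
    ws = models(statement)
    dist = lambda w: sum(a != b for a, b in zip(w, actual)) / ATOMS
    return 1 - sum(dist(w) for w in ws) / len(ws)

def estimated_measure(statement, prob):
    """Expected truthlikeness under a probability distribution over candidate worlds."""
    return sum(p * tr(statement, w) for w, p in prob.items())

def screened_expand(state, statement, prob, threshold=0.5):
    """Admit the input (by expansion) only if its estimated measure clears the
    threshold; otherwise treat it as likely misinformation and leave the state alone."""
    if estimated_measure(statement, prob) >= threshold:
        return state & set(models(statement))
    return state

# Two conflicting sources: one reports p1, the other reports ~p1.
# The agent's evidence makes each p1-world three times as likely as a ~p1-world.
prob = {w: (3 if w[0] else 1) / 16 for w in ALL_WORLDS}
K = set(ALL_WORLDS)                                   # initially agnostic
K = screened_expand(K, lambda w: w[0], prob)          # estimate ~0.58: accepted
K = screened_expand(K, lambda w: not w[0], prob)      # estimate ~0.42: screened out
print(len(K))                                         # 4 -- only the p1-worlds remain
```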

References
[1] S. D’Alfonso. On quantifying semantic information. Information, 2(1):61–101, 2011. URL = <http://www.mdpi.com/2078-2489/2/1/61/>.
[2] G. Cevolani and F. Calandra. Approaching the truth via belief change in propositional languages. In EPSA Epistemology and Methodology of Science: Launch of the European Philosophy of Science Association, pages 47–62, 2010.
[3] I. Niiniluoto. Theory change, truthlikeness, and belief revision. In EPSA Epistemology and Methodology of Science: Launch of the European Philosophy of Science Association, pages 189–199, 2010.