Semantic Conceptions of Information [REVISED: January 7, 2015]
This review by Searle of Floridi’s The Fourth Revolution is available to subscribers only at the New York Review of Books, but a copy can be found as a reading here:
Floridi’s response to Searle and Searle’s response to Floridi’s response can be read here:
Interesting: a subreddit has been created to host a discussion group for reading Floridi’s The Philosophy of Information: http://redd.it/2r7nvx.
Here are some reports generated from the Philosophy of Information section at philpapers.org:
DTMD 2015 will focus on “Information and values: ethics, spirituality and religion”.
Conceptual challenges of data in science and technology: http://socphilinfo.org/news/cfp/485-seventh-workshop-philosophy-information-conceptual-challenges-data-science-and
In October I will be heading up to Wollongong to present at CaféDSL, a weekly research seminar hosted by the Decision Systems Lab at the University of Wollongong’s School of Computer Science and Software Engineering.
Date and time: Tuesday 21 October 2014, 4pm.
Venue: 6.105 – Smart Building
Title: Revising beliefs towards the truth
Abstract: Traditionally, the field of belief revision has been concerned mainly with the relations between sentences (pieces of data) and the logical coherence of revision operations, without as much concern for whether the dataset resulting from a belief revision operation has epistemically valuable properties such as truth and relevance. Gärdenfors, for example, who developed the predominant AGM framework for belief revision, argues that the concepts of truth and falsity become irrelevant for the analysis of belief change, since “many epistemological problems can be attacked without using the notions of truth and falsity”. However that may be, given that agents process incoming data with the goal of using it for successful action, this lacuna between belief revision and epistemic utilities such as truth and relevance merits attention.
In this talk I address this issue by presenting some preliminary results concerning the combination of formal truthlikeness/verisimilitude measures with belief revision/merging.
Info-Metrics Institute | Workshop, Spring 2015: http://www.american.edu/cas/economics/info-metrics/workshop/workshop-2015-spring.cfm.
As has been established in the literature, given some truthlikeness/verisimilitude measure Tr(), theory T and evidence E, we can measure the estimated truthlikeness of T given E with:

$EstTr(T|E) = \sum_{w} P(w|E) \cdot Tr(T, w)$

where the sum runs over each state $w$ in the logical space.
Now, using a Bayesian confirmation measure such as the difference measure:

$C(T, E) = P(T|E) - P(T)$

we can combine it with the estimated truthlikeness measure to get a measure of truthlikeness confirmation:

$CTr(T, E) = EstTr(T|E) - EstTr(T)$
So what can be done with this measure? In A Verisimilitudinarian Analysis of the Linda Paradox, the authors suggest this measure for what they term a ‘verisimilitudinarian confirmation account’ of the Linda paradox (they do so in response to a problem with an earlier proposal of theirs, which gives an account of the paradox based on estimated truthlikeness alone). But it seems that this approach does nothing that an account of the Linda paradox in terms of confirmation alone isn’t already doing.
Thus it would be interesting to think about this idea of truthlikeness confirmation some more. For starters, confirmation and truthlikeness confirmation clearly do not always increase or decrease together. Take a logical space generated by three propositions p1, p2 and p3, with a uniform a priori probability distribution over the eight possible states:
- Evidence can confirm a theory (raise its probability) whilst resulting in a negative truthlikeness confirmation.
- Evidence can disconfirm a theory whilst resulting in a positive truthlikeness confirmation.
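This divergence can be checked with a small Python sketch. Everything here beyond the eight-state uniform space is an illustrative assumption: Tr is a simple similarity-based stand-in measure (average closeness of a theory’s models to a state), confirmation is the difference measure, and the particular theory and evidence sets are my own examples.

```python
from itertools import product

# The eight states of a logical space generated by p1, p2 and p3,
# with a uniform a priori distribution (as in the example above).
STATES = list(product([0, 1], repeat=3))

def sim(u, w):
    """Closeness of two states: the fraction of atoms on which they agree.
    (An assumed similarity measure, chosen for illustration.)"""
    return sum(a == b for a, b in zip(u, w)) / len(u)

def tr(theory, w):
    """A simple stand-in truthlikeness measure Tr(T, w): the average
    closeness of the theory's models to the state w."""
    return sum(sim(m, w) for m in theory) / len(theory)

def prob(theory, evidence=STATES):
    """P(T|E) under the uniform prior: fraction of E's states that are models of T."""
    return sum(1 for w in evidence if w in theory) / len(evidence)

def est_tr(theory, evidence=STATES):
    """Estimated truthlikeness EstTr(T|E): expected Tr(T, w) under P(w|E)."""
    return sum(tr(theory, w) for w in evidence) / len(evidence)

T = [(1, 1, 1)]  # the theory p1 & p2 & p3; P(T) = 1/8

# Case 1: E1 confirms T yet yields negative truthlikeness confirmation.
E1 = [(1, 1, 1), (0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
c1 = prob(T, E1) - prob(T)       # +0.075: probability of T goes up
tc1 = est_tr(T, E1) - est_tr(T)  # -0.1: estimated truthlikeness goes down

# Case 2: E2 disconfirms T yet yields positive truthlikeness confirmation.
E2 = [(1, 1, 0), (1, 0, 1), (0, 1, 1)]
c2 = prob(T, E2) - prob(T)       # -0.125: probability of T drops to 0
tc2 = est_tr(T, E2) - est_tr(T)  # about +0.167: estimated truthlikeness rises

print(c1, tc1, c2, tc2)
```

In the first case the evidence makes T more probable but shifts the posterior mass towards states far from T’s single model; in the second, T is ruled out entirely but every remaining state is close to it. The two measures genuinely come apart.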