In Information Without Truth, the authors reject the Veridicality Thesis for both natural (environmental) and non-natural (semantic) information.
The Veridicality Thesis for Natural Information (VTN) can be stated:
(VTN) If a signal s being F carries natural information about an object o being G, then o is G.
This resembles Dretske’s definition of information flow, and it entails that if an agent A, on receiving a signal s being F, thereby has natural information about an object o being G, then o is G.
Against this, Information Without Truth puts forward a Probability Raising Thesis for Natural Information (PRTN), on which the transmission of natural information requires nothing more than the truth of the following probabilistic claim:
(PRTN) If a signal s being F carries natural information about an object o being G, then P(o is G | s is F) > P(o is G | ~(s is F))
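To make PRTN concrete, here is a minimal sketch in Python using a toy smoke/fire example; the joint distribution and its numbers are hypothetical, chosen only to illustrate the inequality, and are not from the paper:

```python
# Toy joint distribution over (signal: smoke?, object: fire?).
# All numbers are hypothetical, picked only to illustrate PRTN.
joint = {
    ("smoke", "fire"): 0.08,
    ("smoke", "no fire"): 0.02,
    ("no smoke", "fire"): 0.01,
    ("no smoke", "no fire"): 0.89,
}

def p(event):
    """Probability of a set of (signal, object) outcomes."""
    return sum(joint[outcome] for outcome in event)

def p_cond(a, b):
    """Conditional probability P(a | b), read off the joint distribution."""
    return p(a & b) / p(b)

smoke = {("smoke", "fire"), ("smoke", "no fire")}
no_smoke = {("no smoke", "fire"), ("no smoke", "no fire")}
fire = {("smoke", "fire"), ("no smoke", "fire")}

# PRTN: smoke carries natural information about fire iff
# P(fire | smoke) > P(fire | ~smoke).
print(p_cond(fire, smoke))                            # 0.8
print(p_cond(fire, no_smoke))                         # ~0.011
print(p_cond(fire, smoke) > p_cond(fire, no_smoke))   # True
```

Note that P(fire | smoke) is 0.8 here, not 1: on PRTN the smoke signal carries natural information about fire even though smoke sometimes occurs without fire, which is exactly where PRTN departs from VTN.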
It should be noted that the authors footnote the PRTN definition with the following:
One of us has argued that signal s being F can also carry (negative) natural information about o being G by lowering the probability that o is G.
So what would a general definition of carrying natural information look like? It seems to me that something like this would do the job:
If a signal s being F carries natural information about an object o being G, then P(o is G) ≠ P(o is G | s is F)
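As a quick check that the ≠ formulation covers the negative case from the footnote as well, here is the same hypothetical smoke/fire distribution again (self-contained, with the same made-up numbers):

```python
# Same hypothetical joint distribution as in the PRTN sketch above.
joint = {
    ("smoke", "fire"): 0.08,
    ("smoke", "no fire"): 0.02,
    ("no smoke", "fire"): 0.01,
    ("no smoke", "no fire"): 0.89,
}

p_fire = joint[("smoke", "fire")] + joint[("no smoke", "fire")]      # 0.09
p_smoke = joint[("smoke", "fire")] + joint[("smoke", "no fire")]     # 0.10
p_fire_given_smoke = joint[("smoke", "fire")] / p_smoke              # 0.80
p_fire_given_no_smoke = joint[("no smoke", "fire")] / (1 - p_smoke)  # ~0.011

# Positive case: smoke raises the probability of fire.
print(p_fire_given_smoke != p_fire)      # True (0.80 vs 0.09)
# Negative case: the absence of smoke lowers the probability of fire,
# and the general (≠) definition counts this as carrying (negative)
# natural information too.
print(p_fire_given_no_smoke != p_fire)   # True (~0.011 vs 0.09)
```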
Great blog.
Your suggested general definition looks reasonable to me. I am not sure why the authors did not include it.
I find their proposal quite appealing, but where does that leave a theory of information outside of everyday probability theory? I am not familiar enough with the issues to say, so that is a sincere question.
As for the disjunction problem, by which I mean the problem of misrepresentation in informational semantics, the fact that s can be F while o is not G seems to make that problem less urgent. And it would seem that signals about analytic truths and nomic necessities would still carry no information. That is not necessarily a problem.
Barwise warned us that utterances like ‘Londres is London’, ‘Morning Star = Evening Star’, or even ‘p v ~p’ each have more than one informational content, and that these might be sources of conflicting intuitions. In the first case, isn’t the information being carried about a relationship between the vehicles themselves, rather than the object they designate? What is the probability that one string of characters is an alias for the object designated by another string of characters? Surely not unity…
Thanks for the comments. Some replies below:
1. You write “I find their proposal quite appealing, but where does that leave a theory of information outside of everyday probability theory?”
The authors distinguish between ‘information that p’ and ‘information about p’, with their theory dealing with the ‘information about’ type. Perhaps a theory of information outside of everyday probability theory would deal with the ‘information that’ type.
2. Yes, it would be interesting to see how the proposal in this paper could be applied to the problem of misrepresentation in informational semantics.
3. Signals about analytic truths and nomic necessities would still carry no information. But unlike, say, Dretske’s definition, which needs the variable k (the agent’s background knowledge) to deliver an account on which analytic truths and nomic necessities carry no information, the new definition given in this paper makes no reference to such an extra factor.
This is simply because it is always the case that P(p v ~p | s is F) = P(p v ~p | ~(s is F)); both equal 1, since p v ~p is a tautology, so one can never be greater than the other.
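For what it’s worth, this can be checked by brute force over a toy probability space (the two-atom setup below is hypothetical, just to make the point vivid): conditioning on any event leaves the probability of p v ~p at 1, so the strict inequality in the definition can never hold.

```python
from itertools import product

# Toy probability space: worlds assign truth values to the atoms
# p and "s is F"; each world is equally likely.
worlds = list(product([True, False], repeat=2))  # (p, s_is_F)

def prob(event, given):
    """P(event | given), counting equiprobable worlds."""
    given_worlds = [w for w in worlds if given(w)]
    return sum(1 for w in given_worlds if event(w)) / len(given_worlds)

tautology = lambda w: w[0] or not w[0]   # p v ~p holds in every world

# P(p v ~p | s is F) = P(p v ~p | ~(s is F)) = 1, so neither side
# of the PRTN inequality can exceed the other.
print(prob(tautology, lambda w: w[1]))       # 1.0
print(prob(tautology, lambda w: not w[1]))   # 1.0
```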
4. Analytic truths and identities about nomic necessities can certainly have more than one informational content. There can be different levels of information: on one level something might carry no information, while on another level it might.