Here is a recent talk given by Floridi on semantic information. Below is the talk’s abstract:
Semantic information is a very slippery topic. If we know the relevant codes, we patently have no problem understanding sentences in natural languages, maps, formulae, road signs, or other similar instances of well-formed and meaningful data. And yet, information scientists and philosophers have struggled to determine what exactly semantic information is and what it means for an agent to become informed semantically. In this paper, I shall discuss the “three pillars” of the philosophy of information: Shannon’s communication model, the covariance model, and the inverse relationship principle. I shall argue that none of them captures our intuitions about semantic information and “becoming informed”. I shall then outline what I hope might be a more promising strategy.