Abstract
The lore is that standard information theory provides an analysis of information quantity, but not of information content. I argue that this lore is incorrect and that an adequate informational semantics is latent in standard theory. The roots of this notion of content can be traced to the secret parallel development, by Turing at Bletchley Park, of an information theory equivalent to Shannon’s, and it has been suggested independently in recent work by Skyrms and by Bullinaria and Levy. This paper explicitly articulates the semantics latent in information theory and defends it as an adequate theory of information content, or natural meaning. I argue that this theory suggests a new perspective on the classic misrepresentation worry for correlation-based semantics.