Abstract
Many of us consider it uncontroversial that information processing is a natural function of the brain. Since functions in biology are established only through empirical investigation, there should be a significant body of unambiguous evidence supporting this functional claim. Before we can interpret the evidence, however, we must ask what it means for a biological system to process information. Although a concept of information is generally accepted without critique in the neurosciences, applications of information in other biological sciences remain controversial despite careful analysis. In this work I review classical stimulus-response studies in neuroscience and use Claude Shannon’s mathematical theory of communication as a starting point for interpreting information processing as a function of the brain. I illustrate a disanalogy between Shannon’s communication model (source, encoder, channel, receiver, decoder) and neural systems, and argue that the neural code is not very code-like in comparison to genetic and engineered codes. I suggest that we have conflated the act of representing neuroscientific facts—which we do to summarize and communicate our findings to others—with taking experimental facts to be representations.