Detecting Introspective Errors in Consciousness Science

Ergo: An Open Access Journal of Philosophy (forthcoming)

Abstract

Detecting introspective errors about consciousness presents challenges that are widely supposed to be difficult, if not impossible, to overcome. This is a problem for consciousness science because many central questions turn on when and to what extent we should trust subjects’ introspective reports. This has led some authors to suggest that we should abandon introspection as a source of evidence when constructing a science of consciousness. Others have concluded that central questions in consciousness science cannot be answered via empirical investigation. I argue that on closer inspection, the challenges associated with detecting introspective errors can be overcome. I demonstrate how natural kind reasoning—the iterative application of inference to the best explanation to home in on and leverage regularities in nature—can allow us to detect introspective errors even in difficult cases such as judgments about mental imagery, and I conclude that worries about intractable methodological challenges in consciousness science are misguided.

Links

PhilArchive

Analytics

Added to PP
2024-12-19

Downloads
50 (#431,619)

6 months
50 (#100,371)

Author's Profile

Andy Mckilliam
National Taiwan University
