The concept of information and the unity of science

Philosophy of Science 28 (4): 406-413 (1961)

Abstract

An attempt is made in this paper to analyze the purely formal nature of information-theoretic concepts. It is then suggested that such concepts, used to supplement the logical and mathematical structure of the language of science, represent an addition to this language of a sort that allows a unitary language for the description of phenomena. (The alternative to this approach would be certain multi-linguistic and mutually untranslatable descriptions of related phenomena, as with the various versions of Complementarity.) This conception is tested for the specific case of Heisenberg's Uncertainty Principle, in order to show that, given a suitable and intuitively satisfactory definition of the quantity of information contained in a measurement, the Heisenberg Principle becomes an informational restriction arising from the formal properties of the symbols of a given language rather than a "law" of nature.

Citations of this work

Derek Partridge (1981). Information theory and redundancy. Philosophy of Science 48 (2): 308-316.
