Entropy and information: Suggestions for common language

Philosophy of Science 54 (2):176-193 (1987)

Abstract

Entropy and information are both emerging as currencies of interdisciplinary dialogue, most recently in evolutionary theory. If this dialogue is to be fruitful, there must be general agreement about the meaning of these terms. That this is not presently the case owes principally to the supposition of many information theorists that information theory has succeeded in generalizing the entropy concept. The present paper considers the merits of the generalization thesis and makes some suggestions for restricting both entropy and information to specific arenas of discourse.

