Results for 'Entropy (Information theory)'

190 found
  1. New Foundations for Information Theory: Logical Entropy and Shannon Entropy. David Ellerman - 2021 - Springer Verlag.
    This monograph offers a new foundation for information theory that is based on the notion of information-as-distinctions, being directly measured by logical entropy, and on the re-quantification as Shannon entropy, which is the fundamental concept for the theory of coding and communications. Information is based on distinctions, differences, distinguishability, and diversity. Information sets are defined that express the distinctions made by a partition, e.g., the inverse-image of a random variable so they represent (...)
  2. Logical Entropy: Introduction to Classical and Quantum Logical Information theory. David Ellerman - 2018 - Entropy 20 (9):679.
    Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions of a partition. All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the (...)
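    For readers new to the distinction drawn in this entry, a minimal Python sketch of the two measures for a finite probability distribution; the function names are illustrative, not from the paper:

        from math import log2

        def shannon_entropy(p):
            # H(p) = -sum p_i * log2(p_i), in bits
            return -sum(pi * log2(pi) for pi in p if pi > 0)

        def logical_entropy(p):
            # h(p) = 1 - sum p_i^2: the probability that two
            # independent draws land in different blocks
            return 1 - sum(pi * pi for pi in p)

        dist = [0.5, 0.25, 0.25]
        print(shannon_entropy(dist))  # 1.5 bits
        print(logical_entropy(dist))  # 0.625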
  3. Mathematical foundations of information theory. Aleksandr Yakovlevich Khinchin - 1957 - New York: Dover Publications.
    First comprehensive introduction to information theory explores the work of Shannon, McMillan, Feinstein, and Khinchin. Topics include the entropy concept in probability theory, fundamental theorems, and other subjects. 1957 edition.
  4. Further on informational quanta, interactions, and entropy under the granular view of value formation. Quan-Hoang Vuong & Minh-Hoang Nguyen - 2024 - SSRN.
    A recent study suggests that value and quantum states seem to be governed by the same underlying mechanisms. In our recent book titled "Better economics for the Earth: A lesson from quantum and information theories," specifically Chapter 5, we have proposed an informational entropy-based notion of value, grounded in Granular Interaction Thinking Theory (GITT), which integrates granular worldview and primary features of quantum mechanics, Shannon’s information theory, and the mindsponge theory. Specifically, the notion suggests (...)
  5. Generalized Information Theory Meets Human Cognition: Introducing a Unified Framework to Model Uncertainty and Information Search. Vincenzo Crupi, Jonathan D. Nelson, Björn Meder, Gustavo Cevolani & Katya Tentori - 2018 - Cognitive Science 42 (5):1410-1456.
    Searching for information is critical in many situations. In medicine, for instance, careful choice of a diagnostic test can help narrow down the range of plausible diseases that the patient might have. In a probabilistic framework, test selection is often modeled by assuming that people's goal is to reduce uncertainty about possible states of the world. In cognitive science, psychology, and medical decision making, Shannon entropy is the most prominent and most widely used model to formalize probabilistic uncertainty (...)
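    As a concrete version of the uncertainty-reduction model of test selection described in this entry, a small Python sketch of expected information gain for a binary diagnostic test; the priors and likelihoods are invented for illustration:

        from math import log2

        def H(p):
            # Shannon entropy of a distribution, in bits
            return -sum(x * log2(x) for x in p if x > 0)

        prior = [0.8, 0.2]                # P(disease d)
        like = [[0.9, 0.1], [0.3, 0.7]]   # P(test result r | disease d)

        # P(result) and posteriors P(disease | result) via Bayes' rule
        p_res = [sum(prior[d] * like[d][r] for d in range(2)) for r in range(2)]
        post = [[prior[d] * like[d][r] / p_res[r] for d in range(2)]
                for r in range(2)]

        # expected information gain = prior uncertainty minus
        # expected posterior uncertainty after seeing the result
        gain = H(prior) - sum(p_res[r] * H(post[r]) for r in range(2))
        print(round(gain, 3))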
  6. Logical information theory: new logical foundations for information theory. David Ellerman - 2017 - Logic Journal of the IGPL 25 (5):806-835.
    There is a new theory of information based on logic. The definition of Shannon entropy as well as the notions of joint, conditional, and mutual entropy as defined by Shannon can all be derived by a uniform transformation from the corresponding formulas of logical information theory. Information is first defined in terms of sets of distinctions without using any probability measure. When a probability measure is introduced, the logical entropies are simply the values (...)
  7. Characterizing Entropy in Statistical Physics and in Quantum Information Theory. Bernhard Baumgartner - 2014 - Foundations of Physics 44 (10):1107-1123.
    A new axiomatic characterization with a minimum of conditions for entropy as a function on the set of states in quantum mechanics is presented. Traditionally unspoken assumptions are unveiled and replaced by proven consequences of the axioms. First the Boltzmann–Planck formula is derived. Building on this formula, using the Law of Large Numbers—a basic theorem of probability theory—the von Neumann formula is deduced. Axioms used in older theories on the foundations are now derived facts.
  8. Informational entropy-based value formation: A new paradigm for a deeper understanding of value. Quan-Hoang Vuong, Viet-Phuong La & Minh-Hoang Nguyen - 2025 - I.E.V.25.
    The major global challenges of our time, like climate and environmental crises, rising inequality, the emergence of disruptive technologies, etc., demand interdisciplinary research for effective solutions. A clear understanding of value is essential for guiding socio-cultural and economic transitions to address these issues. Despite numerous attempts to define value, existing approaches remain inconsistent across disciplines and lack a comprehensive framework. This paper introduces a novel perspective on value through the lens of granular interaction thinking theory, proposing an informational entropy-based notion of value. Grounded in quantum mechanics, Shannon’s information theory, and the mindsponge theory, this framework integrates both subjective and objective considerations and is highly compatible with interdisciplinary research. The informational entropy-based notion of value effectively bridges diverse concepts of value, including use and exchange value in economics, personal values in psychology, and cultural, moral, ethical, and aesthetic values in society. By offering a unifying perspective, granular interaction thinking theory provides a valuable framework for translating insights from quantum mechanics into socio-cultural, economic, and psychological contexts, enriching theoretical discourse and enhancing analytical effectiveness.
  9. Information Theories with Adversaries, Intrinsic Information, and Entanglement. Karol Horodecki, Michał Horodecki, Pawel Horodecki & Jonathan Oppenheim - 2005 - Foundations of Physics 35 (12):2027-2040.
    There are aspects of privacy theory that are analogous to quantum theory. In particular one can define distillable key and key cost in parallel to distillable entanglement and entanglement cost. We present here classical privacy theory as a particular case of information theory with adversaries, where similar general laws hold as in entanglement theory. We place the result of Renner and Wolf—that intrinsic information is a lower bound for key cost—into this general formalism. Then (...)
  10. The use of information theory in epistemology. William F. Harms - 1998 - Philosophy of Science 65 (3):472-501.
    Information theory offers a measure of "mutual information" which provides an appropriate measure of tracking efficiency for the naturalistic epistemologist. The statistical entropy on which it is based is arguably the best way of characterizing the uncertainty associated with the behavior of a system, and it is ontologically neutral. Though not appropriate for the naturalization of meaning, mutual information can serve as a measure of epistemic success independent of semantic maps and payoff structures. While not (...)
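    The tracking measure invoked in this entry is standard Shannon mutual information; for reference, in textbook form rather than Harms's own notation:
        $I(X;Y) = H(X) + H(Y) - H(X,Y) = \sum_{x,y} p(x,y)\log_2 \frac{p(x,y)}{p(x)p(y)}$
    It vanishes exactly when X and Y are independent, which is what lets it serve as a payoff-free measure of how well one variable tracks another.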
  11. Information theory, quantum mechanics and ‘linguistic duality’. C. T. K. Chari - 1966 - Dialectica 20 (1):67-88.
    The paper explores first the postulational basis and significance of ‘measures of information’ in current information theory and their possible relations to physical entropy and Brillouin's ‘negentropy’ regarded as the negative of entropy. For some purposes, the same pattern or formal structure may be abstracted from both ‘entropy’ and ‘information’. The paper analyzes, in the second place, the mathematical analogies which have been traced between information theory and quantum mechanics and argues that the analogies have but a (...)
  12. Kolmogorov complexity and information theory. With an interpretation in terms of questions and answers. Peter D. Grünwald & Paul M. B. Vitányi - 2003 - Journal of Logic, Language and Information 12 (4):497-529.
    We compare the elementary theories of Shannon information and Kolmogorov complexity, the extent to which they have a common purpose, and where they are fundamentally different. We discuss and relate the basic notions of both theories: Shannon entropy, Kolmogorov complexity, Shannon mutual information and Kolmogorov (“algorithmic”) mutual information. We explain how universal coding may be viewed as a middle ground between the two theories. We consider Shannon's rate distortion theory, which quantifies useful (in a certain sense) information. We use the communication of (...)
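    The bridge between the two theories can be stated compactly (a standard result recalled from general knowledge, not a quotation from the paper): for a computable probability mass function $P$, expected prefix Kolmogorov complexity matches Shannon entropy up to an additive constant depending only on $P$:
        $0 \le \sum_x P(x)K(x) - H(P) \le K(P) + O(1)$, where $H(P) = -\sum_x P(x)\log_2 P(x)$.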
  13. An algorithmic information theory challenge to intelligent design. Sean Devine - 2014 - Zygon 49 (1):42-65.
    William Dembski claims to have established a decision process to determine when highly unlikely events observed in the natural world are due to Intelligent Design. This article argues that, as no implementable randomness test is superior to a universal Martin-Löf test, this test should be used to replace Dembski's decision process. Furthermore, Dembski's decision process is flawed, as natural explanations are eliminated before chance. Dembski also introduces a fourth law of thermodynamics, his “law of conservation of information,” to argue (...)
  14. The application of algorithmic information theory to noisy patterned strings. Sean Devine - 2006 - Complexity 12 (2):52-58.
    Although algorithmic information theory provides a measure of the information content of a string of characters, problems of noise and noncomputability emerge. However, if a pattern in a noisy string is recognized by reference to a set of similar strings, this article shows that a compressed algorithmic description of a noisy string is possible and illustrates this with some simple examples. The article also shows that algorithmic information theory can quantify the information in complex organized systems (...)
  15. Exploring the role of rejection in scholarly knowledge production: Insights from granular interaction thinking and information theory. Quan-Hoang Vuong & Minh-Hoang Nguyen - 2024 - Learned Publishing 37 (4):e1636.
    Rejection is an essential part of the scholarly publishing process, acting as a filter to distinguish between robust and less credible scientific works. While rejection helps reduce entropy and increase the likelihood of disseminating useful knowledge, the process is not devoid of subjectivity. Providing more informative rejection letters and encouraging humility among editors and reviewers are essential to enhance the efficiency of knowledge production as they help ensure that valuable scientific contributions are not overlooked.
  16. The Fundamental Tension in Integrated Information Theory 4.0’s Realist Idealism. Ignacio Cea - 2023 - Entropy 25 (10).
    Integrated Information Theory (IIT) is currently one of the most influential scientific theories of consciousness. Here, we focus specifically on a metaphysical aspect of the theory’s most recent version (IIT 4.0), what we may call its idealistic ontology, and its tension with a kind of realism about the external world that IIT also endorses. IIT 4.0 openly rejects the mainstream view that consciousness is generated by the brain, positing instead that consciousness is ontologically primary while the physical (...)
  17. Counting distinctions: on the conceptual foundations of Shannon’s information theory. David Ellerman - 2009 - Synthese 168 (1):119-149.
    Categorical logic has shown that modern logic is essentially the logic of subsets (or "subobjects"). Partitions are dual to subsets so there is a dual logic of partitions where a "distinction" [an ordered pair of distinct elements (u,u′) from the universe U ] is dual to an "element". An element being in a subset is analogous to a partition π on U making a distinction, i.e., if u and u′ were in different blocks of π. Subset logic leads to finite (...)
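    A minimal Python sketch of the "distinctions" of a partition described in this entry, with logical entropy as the normalized count of ordered pairs in different blocks; the names are illustrative, not from the paper:

        from itertools import product

        def logical_entropy(partition, universe):
            # h(pi) = |dit(pi)| / |U|^2, where dit(pi) is the set of ordered
            # pairs (u, u') whose elements lie in different blocks of pi
            block_of = {u: i for i, block in enumerate(partition) for u in block}
            dits = [(u, v) for u, v in product(universe, repeat=2)
                    if block_of[u] != block_of[v]]
            return len(dits) / len(universe) ** 2

        U = ['a', 'b', 'c', 'd']
        pi = [{'a', 'b'}, {'c'}, {'d'}]
        print(logical_entropy(pi, U))  # 10/16 = 0.625 = 1 - sum (|B|/|U|)^2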
  18. Entropy and information: Suggestions for common language. Jeffrey S. Wicken - 1987 - Philosophy of Science 54 (2):176-193.
    Entropy and information are both emerging as currencies of interdisciplinary dialogue, most recently in evolutionary theory. If this dialogue is to be fruitful, there must be general agreement about the meaning of these terms. That this is not presently the case owes principally to the supposition of many information theorists that information theory has succeeded in generalizing the entropy concept. The present paper will consider the merits of the generalization thesis, and make some (...)
  19. How to conceptual engineer ‘entropy’ and ‘information’. Javier Anta - 2025 - Erkenntnis:1-25.
    In this paper I discuss how to conceptual engineer ‘entropy’ and ‘information’ as they are used in information theory and statistical mechanics. Initially, I evaluate the extent to which the all-pervasive entangled use of entropy and information notions can be somehow defective in these domains, such as being meaningless or generating confusion. Then, I assess the main ameliorative strategies to improve this defective conceptual practice. The first strategy is to substitute the terms ‘entropy (...)
  20. Information, Genetics and Entropy. Julio Ernesto Rubio Barrios - 2015 - Principia: An International Journal of Epistemology 19 (1):121.
    The consolidation of the informational paradigm in molecular biology research concluded on a system to convert the epistemic object into an operational technological object and a stable epistemic product. However, the acceptance of the informational properties of genetic acids failed to clarify the meaning of the concept of information. The “information” as a property of the genetic molecules remained an informal notion that allows the description of the mechanism of inheritance, but it was not specified in a (...)
  21. Identifying changes in EEG information transfer during drowsy driving by transfer entropy. Chih-Sheng Huang, Nikhil R. Pal, Chun-Hsiang Chuang & Chin-Teng Lin - 2015 - Frontiers in Human Neuroscience 9:160159.
    Drowsy driving is a major cause of automobile accidents. Previous studies used neuroimaging based approaches such as analysis of electroencephalogram (EEG) activities to understand the brain dynamics of different cortical regions during drowsy driving. However, the coupling between brain regions responding to this vigilance change is still unclear. To have a comprehensive understanding of neural mechanisms underlying drowsy driving, in this study we use transfer entropy, a model-free measure of effective connectivity based on information theory. We investigate (...)
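    A minimal plug-in estimator of the transfer entropy used in this study, for discrete sequences with one-step histories; an illustrative sketch, not the authors' pipeline:

        from collections import Counter
        from math import log2

        def transfer_entropy(x, y):
            # TE(X -> Y) = sum p(y1, y0, x0) * log2[p(y1 | y0, x0) / p(y1 | y0)]
            # estimated from empirical counts; y1 = y_{t+1}, y0 = y_t, x0 = x_t
            triples = Counter(zip(y[1:], y[:-1], x[:-1]))
            pairs_yx = Counter(zip(y[:-1], x[:-1]))
            pairs_yy = Counter(zip(y[1:], y[:-1]))
            singles = Counter(y[:-1])
            n = len(y) - 1
            te = 0.0
            for (y1, y0, x0), c in triples.items():
                p_full = c / pairs_yx[(y0, x0)]             # p(y1 | y0, x0)
                p_self = pairs_yy[(y1, y0)] / singles[y0]   # p(y1 | y0)
                te += (c / n) * log2(p_full / p_self)
            return te

        x = [0, 1, 0, 1, 0, 1, 0, 1, 0, 1]
        y = [0, 0, 1, 0, 1, 0, 1, 0, 1, 0]  # y copies x with a one-step lag
        print(round(transfer_entropy(x, y), 3))  # positive: x's past predicts y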
  22. Entropy and information in evolving biological systems. Daniel R. Brooks, John Collier, Brian A. Maurer, Jonathan D. H. Smith & E. O. Wiley - 1989 - Biology and Philosophy 4 (4):407-432.
    Integrating concepts of maintenance and of origins is essential to explaining biological diversity. The unified theory of evolution attempts to find a common theme linking production rules inherent in biological systems, explaining the origin of biological order as a manifestation of the flow of energy and the flow of information on various spatial and temporal scales, with the recognition that natural selection is an evolutionarily relevant process. Biological systems persist in space and time by transforming energy from (...)
  23. Entropy in evolution. John Collier - 1986 - Biology and Philosophy 1 (1):5-24.
    Daniel R. Brooks and E. O. Wiley have proposed a theory of evolution in which fitness is merely a rate determining factor. Evolution is driven by non-equilibrium processes which increase the entropy and information content of species together. Evolution can occur without environmental selection, since increased complexity and organization result from the likely capture at the species level of random variations produced at the chemical level. Speciation can occur as the result of variation within the species which (...)
  24. Entropy - A Guide for the Perplexed. Roman Frigg & Charlotte Werndl - 2011 - In Claus Beisbart & Stephan Hartmann (eds.), Probabilities in Physics. Oxford, GB: Oxford University Press. pp. 115-142.
    Entropy is ubiquitous in physics, and it plays important roles in numerous other disciplines ranging from logic and statistics to biology and economics. However, a closer look reveals a complicated picture: entropy is defined differently in different contexts, and even within the same domain different notions of entropy are at work. Some of these are defined in terms of probabilities, others are not. The aim of this chapter is to arrive at an understanding of some of the (...)
  25. A short note on the logico-conceptual foundations of information theory in partition logic. David Ellerman - 2009 - The Reasoner 3 (7):4-5.
    A new logic of partitions has been developed that is dual to ordinary logic when the latter is interpreted as the logic of subsets of a fixed universe rather than the logic of propositions. For a finite universe, the logic of subsets gave rise to finite probability theory by assigning to each subset its relative size as a probability. The analogous construction for the dual logic of partitions gives rise to a notion of logical entropy that is precisely (...)
  26. An introduction to logical entropy and its relation to Shannon entropy. David Ellerman - 2013 - International Journal of Semantic Computing 7 (2):121-145.
    The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite set--just as the usual logical notion of probability based on (...)
  27. Insufficient Reason and Entropy in Quantum Theory. Ariel Caticha - 2000 - Foundations of Physics 30 (2):227-251.
    The objective of the consistent-amplitude approach to quantum theory has been to justify the mathematical formalism on the basis of three main assumptions: the first defines the subject matter, the second introduces amplitudes as the tools for quantitative reasoning, and the third is an interpretative rule that provides the link to the prediction of experimental outcomes. In this work we introduce a natural and compelling fourth assumption: if there is no reason to prefer one region of the configuration space (...)
  28. Note on Entropies of Quantum Dynamical Systems. Noboru Watanabe - 2011 - Foundations of Physics 41 (3):549-563.
    We review some techniques and notions for quantum information theory. The dynamical entropies are discussed and some numerical computations of these entropies are carried out for several states.
  29. Algorithmic entropy of sets. J. T. Schwartz - unknown
    In a previous paper a theory of program size formally identical to information theory was developed. The entropy of an individual finite object was defined to be the size in bits of the smallest program for calculating it. It was shown that this is − log2 of the probability that the object is obtained by means of a program whose successive bits are chosen by flipping an unbiased coin. Here a theory of the entropy (...)
     
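    In symbols, the relation sketched in this abstract is the coding theorem of algorithmic information theory (standard form, not a quotation): with $H(x)$ the length in bits of the shortest program that computes $x$, and $P(x)$ the probability that a universal prefix machine outputs $x$ when its program bits are chosen by fair coin flips,
        $H(x) = -\log_2 P(x) + O(1)$.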
  30. Information-Theoretic Measures Predict the Human Judgment of Rhythm Complexity. Remi Fleurian, Tim Blackwell, Oded Ben‐Tal & Daniel Müllensiefen - 2017 - Cognitive Science 41 (3):800-813.
    To formalize the human judgment of rhythm complexity, we used five measures from information theory and algorithmic complexity to measure the complexity of 48 artificially generated rhythmic sequences. We compared these measurements to human prediction accuracy and easiness judgments obtained from a listening experiment, in which 32 participants guessed the last beat of each sequence. We also investigated the modulating effects of musical expertise and general pattern identification ability. Entropy rate and Kolmogorov complexity were correlated with prediction (...)
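    Two of the measures named in this entry are easy to approximate; a rough Python sketch using a bigram estimate of entropy rate and compressed length as a proxy for Kolmogorov complexity (coarse stand-ins, not the paper's exact measures):

        import zlib
        from collections import Counter
        from math import log2

        def entropy_rate(seq):
            # conditional entropy H(s_{t+1} | s_t) from bigram counts, in bits
            pairs = Counter(zip(seq[:-1], seq[1:]))
            firsts = Counter(seq[:-1])
            n = len(seq) - 1
            return -sum((c / n) * log2(c / firsts[a])
                        for (a, b), c in pairs.items())

        def compressed_size(seq):
            # zlib-compressed length in bytes: a crude upper-bound
            # proxy for Kolmogorov complexity
            return len(zlib.compress(bytes(seq)))

        regular = [1, 0, 0, 1, 0, 0] * 8
        irregular = [1, 0, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0] * 4
        print(entropy_rate(regular), entropy_rate(irregular))
        print(compressed_size(regular), compressed_size(irregular))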
  31. Information Invariance and Quantum Probabilities. Časlav Brukner & Anton Zeilinger - 2009 - Foundations of Physics 39 (7):677-689.
    We consider probabilistic theories in which the most elementary system, a two-dimensional system, contains one bit of information. The bit is assumed to be contained in any complete set of mutually complementary measurements. The requirement of invariance of the information under a continuous change of the set of mutually complementary measurements uniquely singles out a measure of information, which is quadratic in probabilities. The assumption which gives the same scaling of the number of degrees of freedom with (...)
  32. Degeneration and Entropy. Eugene Y. S. Chua - 2022 - Kriterion - Journal of Philosophy 36 (2):123-155.
    [Accepted for publication in Lakatos's Undone Work: The Practical Turn and the Division of Philosophy of Mathematics and Philosophy of Science, special issue of Kriterion: Journal of Philosophy. Edited by S. Nagler, H. Pilin, and D. Sarikaya.] Lakatos’s analysis of progress and degeneration in the Methodology of Scientific Research Programmes is well-known. Less known, however, are his thoughts on degeneration in Proofs and Refutations. I propose and motivate two new criteria for degeneration based on the discussion in Proofs and Refutations (...)
  33. Maxwell's demon and the entropy cost of information. Paul N. Fahn - 1996 - Foundations of Physics 26 (1):71-93.
    We present an analysis of Szilard's one-molecule Maxwell's demon, including a detailed entropy accounting, that suggests a general theory of the entropy cost of information. It is shown that the entropy of the demon increases during the expansion step, due to the decoupling of the molecule from the measurement information. It is also shown that there is an entropy symmetry between the measurement and erasure steps, whereby the two steps additively share a constant (...)
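    For scale, the textbook accounting behind such analyses (standard Szilard-engine figures, not Fahn's specific bookkeeping): one measured bit lets the engine extract at most $W = k_B T \ln 2$ of work, matched by an entropy increase of $\Delta S = k_B \ln 2$ elsewhere in the cycle, so the Second Law is respected once the measurement and erasure steps are included.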
  34. An Information Ethics Theory in the Context of Information Philosophy. Nesibe Kantar - 2022 - Entelekya Logico-Metaphysical Review 6 (1):37-45.
    Like all other inventions, advances in the field of digital computational technologies, which we will briefly describe as the information world, have also played an essential role in human life. These advances have brought some ethical debates to our individual and social life, as well as the industrial benefit obtained by the digital and analog technological developments that positively or negatively affect and transform all economic and cultural paradigms surrounding human life. The branch of the philosophy of information, (...)
  35. Are black holes about information? Christian Wuthrich - unknown
    Information theory presupposes the notion of an epistemic agent, such as a scientist or an idealized human. Despite that, information theory is increasingly invoked by physicists concerned with fundamental physics, physics at very high energies, or generally with the physics of situations in which even idealized epistemic agents cannot exist. In this paper, I shall try to determine the extent to which the application of information theory in those contexts is legitimate. I will illustrate (...)
  36. Affirming Entropy. Ashley Woodward - 2024 - Technophany 2 (1).
    This paper challenges the frequent demonisation of entropy in discourses which attempt to draw a “naturalised” axiology from thermodynamics, information theory, and related sciences. Such discourses include Wiener’s cybernetics, Stiegler’s neganthropology, and Floridi’s information ethics, in each of which entropy designates the evil which must be fought in the name of life, information, or some other notion of “the good.” The perspective the paper develops is Nietzschean. Nietzsche himself rejected the consequences of the Second (...)
  37. Information Processing: The Language and Analytical Tools for Cognitive Psychology in the Information Age. Aiping Xiong & Robert W. Proctor - 2018 - Frontiers in Psychology 9:362645.
    The information age can be dated to the work of Norbert Wiener and Claude Shannon in the 1940s. Their work on cybernetics and information theory, and many subsequent developments, had a profound influence on reshaping the field of psychology from what it was prior to the 1950s. Contemporaneously, advances also occurred in experimental design and inferential statistical testing stemming from the work of Ronald Fisher, Jerzy Neyman, and Egon Pearson. These interdisciplinary advances from outside of psychology provided (...)
  38. Entropy of eye movement during rapid automatized naming. Hongan Wang, Fulin Liu, Yuhong Dong & Dongchuan Yu - 2022 - Frontiers in Human Neuroscience 16.
    Numerous studies have focused on the understanding of rapid automatized naming (RAN), which can be applied to predict reading abilities and developmental dyslexia in children. Eye tracking, which characterizes the essential ocular activities, may reveal the visual and cognitive features of RAN. However, traditional measures of eye movements ignore many dynamical details about the visual and cognitive processing of RAN, and are usually associated with the duration of time spent on some particular areas of interest, fixation counts, (...)
  39. Entropy and Entropic Differences in the Work of Michel Serres. Lilian Kroth - 2024 - Theory, Culture and Society 41 (2):21-35.
    Michel Serres’s philosophy of entropy takes what he famously calls the ‘Northwest Passage’ between the sciences and the humanities. By contextualizing his approach to entropy and affirming the role of a philosophy of difference, this paper explores Serres’s approach by means of ‘entropic differences’. It claims that entropy – or rather, entropies – provide Serres with a paradigmatic case for critical translations between different domains of knowledge. From his early Hermès series, through to The Birth of Physics (...)
  40. Entropy in Relation to Incomplete Knowledge. Michael J. Zenzen - 1985 - Cambridge University Press.
    This book is about an important issue which has arisen within two of the branches of physical science - namely thermodynamics and statistical mechanics - where the notion of entropy plays an essential role. A number of scientists and information theorists have maintained that entropy is a subjective concept and is a measure of human ignorance. Such a view, if it is valid, would create some profound philosophical problems and would tend to undermine the objectivity of the (...)
  41. A Critical Analysis of Floridi’s Theory of Semantic Information. Pieter Adriaans - 2010 - Knowledge, Technology & Policy 23 (1):41-56.
    In various publications over the past years, Floridi has developed a theory of semantic information as well-formed, meaningful, and truthful data. This theory is more or less orthogonal to the standard entropy-based notions of information known from physics, information theory, and computer science that all define the amount of information in a certain system as a scalar value without any direct semantic implication. In this context the question arises what the exact relation (...)
  42. Bridging Conceptual Gaps: The Kolmogorov-Sinai Entropy. Massimiliano Badino - forthcoming - Isonomía. Revista de Teoría y Filosofía Del Derecho.
    The Kolmogorov-Sinai entropy is a fairly exotic mathematical concept which has recently aroused some interest on the philosophers’ part. The most salient trait of this concept is its working as a junction between such diverse ambits as statistical mechanics, information theory and algorithm theory. In this paper I argue that, in order to understand this very special feature of the Kolmogorov-Sinai entropy, it is essential to reconstruct its genealogy. Somewhat surprisingly, this story takes us as far (...)
  43. On Entropy of Quantum Compound Systems. Noboru Watanabe - 2015 - Foundations of Physics 45 (10):1311-1329.
    We review some notions for general quantum entropies. The entropy of compound systems is discussed and a numerical computation of the quantum dynamical systems is carried out for the noisy optical channel.
  44. On Classical and Quantum Logical Entropy. David Ellerman - manuscript
    The notion of a partition on a set is mathematically dual to the notion of a subset of a set, so there is a logic of partitions dual to Boole's logic of subsets (Boolean logic is usually mis-specified as "propositional" logic). The notion of an element of a subset has as its dual the notion of a distinction of a partition (a pair of elements in different blocks). Boole developed finite logical probability as the normalized counting measure on elements of (...)
  45. Rigorous information-theoretic derivation of quantum-statistical thermodynamics. II. William Band & James L. Park - 1977 - Foundations of Physics 7 (9-10):705-721.
    Part I of the present work outlined the rigorous application of information theory to a quantum mechanical system in a thermodynamic equilibrium state. The general formula developed there for the best-guess density operator $\hat \rho$ was indeterminate because it involved in an essential way an unspecified prior probability distribution over the continuum $D_H$ of strong equilibrium density operators. In Part II mathematical evaluation of $\hat \rho$ is completed after an epistemological analysis which leads first to the discretization of $D_H$ (...)
  46. The concept on entropy in communication. Yeram Sarkis Touloukian - 1956 - Lafayette, Ind.: Purdue University.
  47. Entropy and uncertainty. Teddy Seidenfeld - 1986 - Philosophy of Science 53 (4):467-491.
    This essay is, primarily, a discussion of four results about the principle of maximizing entropy (MAXENT) and its connections with Bayesian theory. Result 1 provides a restricted equivalence between the two: where the Bayesian model for MAXENT inference uses an "a priori" probability that is uniform, and where all MAXENT constraints are limited to 0-1 expectations for simple indicator-variables. The other three results report on an inability to extend the equivalence beyond these specialized constraints. Result 2 established a (...)
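    For reference, the MAXENT rule under discussion in its standard Lagrangian form (a textbook derivation, not Seidenfeld's notation): maximize $H(p) = -\sum_i p_i \ln p_i$ subject to $\sum_i p_i = 1$ and expectation constraints $\sum_i p_i f_k(i) = F_k$. Stationarity gives the Gibbs form
        $p_i = \frac{1}{Z}\exp(-\sum_k \lambda_k f_k(i))$, with $Z = \sum_i \exp(-\sum_k \lambda_k f_k(i))$
    and multipliers $\lambda_k$ fixed by the constraints; with no constraint beyond normalization the maximizer is the uniform distribution, the special case in which the Bayesian equivalence of Result 1 holds.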
  48. From Art to Information System. Miro Brada - 2021 - AGI Laboratory.
    This insight into art came from chess composition, which concentrates art in a very dense form. To identify and mathematically assess uniqueness is the key, applicable to other areas, e.g. computer programming. Maximization of uniqueness is minimization of entropy, which coincides with and goes beyond Information Theory (Shannon, 1948). The reuse of logic as a universal principle to minimize entropy requires simplified architecture and abstraction. Any structures (e.g. plugins) duplicating or dividing functionality increase entropy (...)
  49. Information and Inference. Jaakko Hintikka - 1970 - D. Reidel.
    In the last 25 years, the concept of information has played a crucial role in communication theory, so much so that the terms information theory and communication theory are sometimes used almost interchangeably. It seems to us, however, that the notion of information is also destined to render valuable services to the student of induction and probability, of learning and reinforcement, of semantic meaning and deductive inference, as well as of scientific method in general. The (...)
  50. Limited role of entropy in information economics. Jacob Marschak - 1974 - Theory and Decision 5 (1):1-7.
Showing results 1–50 of 190