Results for 'Entropy'

984 found
  1. Carnap On Entropy.Abner Shimony - 1975 - In Jaakko Hintikka (ed.), Rudolf Carnap, logical empiricist: materials and perspectives. Boston: D. Reidel Pub. Co. p. 381.
     
  2. Entropy in evolution.John Collier - 1986 - Biology and Philosophy 1 (1):5-24.
    Daniel R. Brooks and E. O. Wiley have proposed a theory of evolution in which fitness is merely a rate determining factor. Evolution is driven by non-equilibrium processes which increase the entropy and information content of species together. Evolution can occur without environmental selection, since increased complexity and organization result from the likely capture at the species level of random variations produced at the chemical level. Speciation can occur as the result of variation within the species which decreases the (...)
    24 citations
  3. Entropy and Entropic Differences in the Work of Michel Serres.Lilian Kroth - 2024 - Theory, Culture and Society 41 (2):21-35.
    Michel Serres’s philosophy of entropy takes what he famously calls the ‘Northwest Passage’ between the sciences and the humanities. By contextualizing his approach to entropy and affirming the role of a philosophy of difference, this paper explores Serres’s approach by means of ‘entropic differences’. It claims that entropy – or rather, entropies – provide Serres with a paradigmatic case for critical translations between different domains of knowledge. From his early Hermès series, through to The Birth of Physics (...)
    1 citation
  4. Characterizing Entropy in Statistical Physics and in Quantum Information Theory.Bernhard Baumgartner - 2014 - Foundations of Physics 44 (10):1107-1123.
    A new axiomatic characterization with a minimum of conditions for entropy as a function on the set of states in quantum mechanics is presented. Traditionally unspoken assumptions are unveiled and replaced by proven consequences of the axioms. First the Boltzmann–Planck formula is derived. Building on this formula, using the Law of Large Numbers—a basic theorem of probability theory—the von Neumann formula is deduced. Axioms used in older theories on the foundations are now derived facts.
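The Boltzmann–Planck and von Neumann formulas named in this abstract have standard textbook forms; for orientation (this is background, not a reproduction of the paper's axiomatic derivation):

```latex
% Boltzmann–Planck entropy of a macrostate realised by W equally probable microstates
S = k_B \ln W
% von Neumann entropy of a quantum state with density operator \rho (often quoted with k_B = 1)
S(\rho) = -k_B \,\mathrm{Tr}(\rho \ln \rho)
```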
  5. Informational entropy-based value formation: A new paradigm for a deeper understanding of value.Quan-Hoang Vuong, Viet-Phuong La & Minh-Hoang Nguyen - 2025 - I.E.V.25.
    The major global challenges of our time, like climate and environmental crises, rising inequality, the emergence of disruptive technologies, etc., demand interdisciplinary research for effective solutions. A clear understanding of value is essential for guiding socio-cultural and economic transitions to address these issues. Despite numerous attempts to define value, existing approaches remain inconsistent across disciplines and lack a comprehensive framework. This paper introduces a novel perspective on value through the lens of granular interaction thinking theory, proposing an informational entropy-based (...)
  6. Entropy - A Guide for the Perplexed.Roman Frigg & Charlotte Werndl - 2011 - In Claus Beisbart & Stephan Hartmann (eds.), Probabilities in Physics. Oxford, GB: Oxford University Press. pp. 115-142.
    Entropy is ubiquitous in physics, and it plays important roles in numerous other disciplines ranging from logic and statistics to biology and economics. However, a closer look reveals a complicated picture: entropy is defined differently in different contexts, and even within the same domain different notions of entropy are at work. Some of these are defined in terms of probabilities, others are not. The aim of this chapter is to arrive at an understanding of some of the (...)
    37 citations
  7. Entropy: Into the Greenhouse World.Jeremy Rifkin & Ted Howard - 1989 - Bantam.
    For the first time Entropy has been completely revised and updated to include a new subtitle which reflects the expanded focus on the greenhouse effect, the largest crisis ever to face mankind.
    5 citations
  8. Deformed Entropy and Information Relations for Composite and Noncomposite Systems.Vladimir N. Chernega, Olga V. Man’ko & Vladimir I. Man’ko - 2015 - Foundations of Physics 45 (7):783-798.
    The notion of conditional entropy is extended to noncomposite systems. The deformed entropic inequalities, which usually are associated with correlations of the subsystem degrees of freedom in bipartite systems, are found for the noncomposite systems. New entropic inequalities for quantum tomograms of qudit states including the single qudit states are obtained. The Araki–Lieb inequality is found for systems without subsystems.
  9. Entropy of knowledge.Hillary Jay Kelley - 1969 - Philosophy of Science 36 (2):178-196.
    Entropy is proposed as a concept which in its broader scope can contribute to the study of the General Information System. This paper attempts to identify a few fundamental subconcepts and lemmas which will serve to facilitate further study of system order. The paper discusses: partitioning order into logical and arbitrary kinds; the relationship of order to pattern; and suggested approaches to evaluating and improving the General Information System.
    1 citation
  10. Entropy, Free Energy, and Symbolization: Free Association at the Intersection of Psychoanalysis and Neuroscience.Thomas Rabeyron & Claudie Massicotte - 2020 - Frontiers in Psychology 11:507186.
    Both a method of therapy and an exploration of psychic reality, free association is a fundamental element of psychoanalytical practices that refers to the way a patient is asked to describe what comes spontaneously to mind in the therapeutic setting. This paper examines the role of free association from the point of view of psychoanalysis and neuroscience in order to improve our understanding of therapeutic effects induced by psychoanalytic therapies and psychoanalysis. In this regard, we first propose a global overview (...)
    2 citations
  11. Entropy and Art: An Essay on Disorder and Order.Rudolf Arnheim - 1971 - Berkeley: University of California Press.
    This essay is an attempt to reconcile the disturbing contradiction between the striving for order in nature and in man and the principle of entropy implicit in the second law of thermodynamics - between the tendency toward greater organization and the general trend of the material universe toward death and disorder.
    6 citations
  12. Entropy in Relation to Incomplete Knowledge.Michael J. Zenzen - 1985 - Cambridge University Press.
    This book is about an important issue which has arisen within two of the branches of physical science - namely thermodynamics and statistical mechanics - where the notion of entropy plays an essential role. A number of scientists and information theorists have maintained that entropy is a subjective concept and is a measure of human ignorance. Such a view, if it is valid, would create some profound philosophical problems and would tend to undermine the objectivity of the scientific (...)
    7 citations
  13. Brain Entropy During Aging Through a Free Energy Principle Approach.Filippo Cieri, Xiaowei Zhuang, Jessica Z. K. Caldwell & Dietmar Cordes - 2021 - Frontiers in Human Neuroscience 15.
    Neural complexity and brain entropy (BEN) have gained greater interest in recent years. The dynamics of neural signals and their relations with information processing continue to be investigated through different measures in a variety of noteworthy studies. The BEN of spontaneous neural activity decreases during states of reduced consciousness. This evidence has been shown in primary consciousness states, such as psychedelic states, under the name of “the entropic brain hypothesis.” In this manuscript we propose an extension of this hypothesis to (...)
    2 citations
  14. Entropy of formulas.Vera Koponen - 2009 - Archive for Mathematical Logic 48 (6):515-522.
    A probability distribution can be given to the set of isomorphism classes of models with universe {1, ..., n} of a sentence in first-order logic. We study the entropy of this distribution and derive a result from the 0–1 law for first-order sentences.
  15. Is entropy relevant to the asymmetry between retrodiction and prediction?Martin Barrett & Elliott Sober - 1992 - British Journal for the Philosophy of Science 43 (2):141-160.
    The idea that the changing entropy of a system is relevant to explaining why we know more about the system's past than about its future has been criticized on several fronts. This paper assesses the criticisms and clarifies the epistemology of the inference problem. It deploys a Markov process model to investigate the relationship between entropy and temporally asymmetric inference.
    9 citations
  16. Entropy and uncertainty.Teddy Seidenfeld - 1986 - Philosophy of Science 53 (4):467-491.
    This essay is, primarily, a discussion of four results about the principle of maximizing entropy (MAXENT) and its connections with Bayesian theory. Result 1 provides a restricted equivalence between the two: where the Bayesian model for MAXENT inference uses an "a priori" probability that is uniform, and where all MAXENT constraints are limited to 0-1 expectations for simple indicator-variables. The other three results report on an inability to extend the equivalence beyond these specialized constraints. Result 2 established a sensitivity (...)
    55 citations
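For background to the results summarised above, the generic MAXENT recipe (not specific to this paper) selects, among all distributions meeting given expectation constraints, the one with maximal Shannon entropy; a Lagrange-multiplier sketch:

```latex
\max_{p}\; H(p) = -\sum_i p_i \log p_i
\quad \text{s.t.} \quad \sum_i p_i = 1, \qquad \sum_i p_i\, f_j(i) = c_j \;\; (j = 1, \dots, m)
% stationarity of the Lagrangian gives the exponential (Gibbs) form
p_i \;\propto\; \exp\!\Big( -\sum_{j=1}^{m} \lambda_j f_j(i) \Big)
```

With no constraints beyond normalisation the maximiser is the uniform distribution, which is the point of contact with the uniform prior mentioned in Result 1.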
  17. Understanding entropy.Peter G. Nelson - 2021 - Foundations of Chemistry 24 (1):3-13.
    A new way of understanding entropy as a macroscopic property is presented. This is based on the fact that heat flows from a hot body to a cold one even when the hot one is smaller and has less energy. A quantity that determines the direction of flow is shown to be the increment of heat gained divided by the absolute temperature. The same quantity is shown to determine the direction of other processes taking place in isolated systems provided (...)
    1 citation
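A standard worked illustration of the quantity this abstract describes (the increment of heat gained divided by the absolute temperature); this is the textbook form, not necessarily the paper's exact development:

```latex
dS = \frac{\delta q_{\mathrm{rev}}}{T}
% heat q passing from a hot body at T_h to a cold body at T_c:
\Delta S_{\mathrm{total}} = \frac{q}{T_c} - \frac{q}{T_h} > 0 \quad \text{whenever } T_h > T_c
```

The sign of the total change depends only on the two temperatures, so heat flows from hot to cold even when the hot body is smaller and has less energy.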
  18. Entropy and Art: An Essay on Disorder and Order.Rudolf Arnheim - 1973 - Journal of Aesthetics and Art Criticism 32 (2):280-281.
    This essay is an attempt to reconcile the disturbing contradiction between the striving for order in nature and in man and the principle of entropy implicit in the second law of thermodynamics - between the tendency toward greater organization and the general trend of the material universe toward death and disorder.
    5 citations
  19. Physical entropy and the senses.Kenneth H. Norwich - 2005 - Acta Biotheoretica 53 (3):167-180.
    With reference to two specific modalities of sensation, the taste of saltiness of chloride salts, and the loudness of steady tones, it is shown that the laws of sensation (logarithmic and power laws) are expressions of the entropy per mole of the stimulus. That is, the laws of sensation are linear functions of molar entropy. In partial verification of this hypothesis, we are able to derive an approximate value for the gas constant, a fundamental physical constant, directly from (...)
    3 citations
  20. Information Entropy Based on Propagation Feature of Node for Identifying the Influential Nodes.Linfeng Zhong, Yu Bai, Yan Tian, Chen Luo, Jin Huang & Weijun Pan - 2021 - Complexity 2021:1-8.
    For understanding and controlling spreading in complex networks, identifying the most influential nodes, which can be applied to disease control, viral marketing, air traffic control, and many other fields, is of great importance. By taking the effect of the spreading rate on information entropy into account, we proposed an improved information entropy (IIE) method. Compared to the benchmark methods on six different empirical networks, the IIE method was found to perform better on Kendall’s tau and imprecision (...)
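The abstract does not give the exact IIE formula (which weights information entropy by the spreading rate), so the following is only a minimal generic sketch of entropy-based influence ranking: each node is scored by the Shannon entropy of its neighbours' normalised degrees. The use of networkx and of the karate-club benchmark graph is an assumption for illustration.

```python
import math

import networkx as nx


def entropy_scores(G):
    """Score each node by the Shannon entropy of its neighbours' normalised degrees.

    A generic information-entropy heuristic for influence ranking, not the paper's
    IIE method, which additionally accounts for the spreading rate.
    """
    scores = {}
    for node in G.nodes():
        degs = [G.degree(nbr) for nbr in G.neighbors(node)]
        total = sum(degs)
        if total == 0:  # isolated node: nothing to propagate, zero entropy
            scores[node] = 0.0
            continue
        probs = [d / total for d in degs]
        scores[node] = -sum(p * math.log(p) for p in probs if p > 0)
    return scores


if __name__ == "__main__":
    G = nx.karate_club_graph()  # small, commonly used benchmark network
    ranking = sorted(entropy_scores(G).items(), key=lambda kv: kv[1], reverse=True)
    print(ranking[:5])  # five highest-entropy (putatively most influential) nodes
```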
  21. Entropy and sign conventions.G. M. Anderson - 2023 - Foundations of Chemistry 25 (1):119-125.
    It is a fundamental cornerstone of thermodynamics that entropy (S_{U,V}) increases in spontaneous processes in isolated systems (often called closed or thermally closed systems when the transfer of energy as work is considered to be negligible) and achieves a maximum when the system reaches equilibrium. But with a different sign convention entropy could just as well be said to decrease to a minimum in spontaneous constant U, V processes. It (...)
    1 citation
  22. (1 other version)Entropies and the Anthropocene crisis.Maël Montévil - 2021 - AI and Society:1-21.
    The Anthropocene crisis is frequently described as the rarefaction of resources or resources per capita. However, both energy and minerals correspond to fundamentally conserved quantities from the perspective of physics. A specific concept is required to understand the rarefaction of available resources. This concept, entropy, pertains to energy and matter configurations and not just to their sheer amount. However, the physics concept of entropy is insufficient to understand biological and social organizations. Biological phenomena display both historicity and systemic (...)
    5 citations
  23. Entropy of eye movement during rapid automatized naming.Hongan Wang, Fulin Liu, Yuhong Dong & Dongchuan Yu - 2022 - Frontiers in Human Neuroscience 16.
    Numerous studies have focused on the understanding of rapid automatized naming (RAN), which can be used to predict reading abilities and developmental dyslexia in children. Eye tracking, which characterizes essential ocular activities, may be able to reveal the visual and cognitive features of RAN. However, traditional measures of eye movements ignore many dynamical details about the visual and cognitive processing of RAN, and are usually associated with the duration of time spent on some particular areas of interest, fixation counts, (...)
  24. Entropy and evil.Robert John Russell - 1984 - Zygon 19 (4):449-468.
    This paper explores a possible relationship between entropy and evil in terms of metaphor. After presenting the various meanings of entropy in classical thermodynamics and statistical mechanics, and the Augustinian and Irenaean theodicies, several similarities and dissimilarities between entropy and evil are described. Underlying the concepts of evil and entropy is the assumption that time has a direction. After examining the scientific basis for this assumption, it is hypothesized that, if evil is real in nature, (...) is what one would expect to find at the level of physical processes, and conversely that, if entropy is coupled to a physical arrow of time, one could expect to find dissipative yet catalytic processes in history and religious experience.
    6 citations
  25. Logical Entropy: Introduction to Classical and Quantum Logical Information theory.David Ellerman - 2018 - Entropy 20 (9):679.
    Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions of a partition. All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this (...)
    4 citations
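A small sketch contrasting logical entropy with Shannon entropy for a partition with block probabilities p_i. The formula h(p) = 1 - sum(p_i^2) (the probability that two independent draws land in distinct blocks) follows the definition used in logical information theory; the numerical example is illustrative only.

```python
import math


def logical_entropy(p):
    """Logical entropy h(p) = 1 - sum(p_i^2): the probability that two
    independent draws from p fall in different blocks of the partition."""
    return 1.0 - sum(pi * pi for pi in p)


def shannon_entropy(p):
    """Shannon entropy H(p) = -sum(p_i * log2(p_i)), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)


blocks = [0.5, 0.25, 0.25]          # block probabilities of a partition
print(logical_entropy(blocks))      # 0.625
print(shannon_entropy(blocks))      # 1.5
```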
  26. (1 other version)Entropy: a new world view.Jeremy Rifkin - 1980 - New York: Viking Press. Edited by Ted Howard.
  27. Maximum Shannon Entropy, Minimum Fisher Information, and an Elementary Game.Shunlong Luo - 2002 - Foundations of Physics 32 (11):1757-1772.
    We formulate an elementary statistical game which captures the essence of some fundamental quantum experiments such as photon polarization and spin measurement. We explore and compare the significance of the principle of maximum Shannon entropy and the principle of minimum Fisher information in solving such a game. The solution based on the principle of minimum Fisher information coincides with the solution based on an invariance principle, and provides an informational explanation of Malus' law for photon polarization. There is no (...)
    1 citation
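For readers comparing the two principles discussed in this entry, the standard definitions of the functionals involved (for a probability density p on the real line) are:

```latex
H[p] = -\int p(x)\,\ln p(x)\,dx
\qquad\qquad
I_F[p] = \int \frac{\big(p'(x)\big)^2}{p(x)}\,dx
```

Roughly, maximising H favours the least informative density compatible with the constraints, while minimising I_F favours the smoothest one; the paper compares the two criteria on an elementary game.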
  28. Horizon Entropy.Ted Jacobson & Renaud Parentani - 2003 - Foundations of Physics 33 (2):323-348.
    Although the laws of thermodynamics are well established for black hole horizons, much less has been said in the literature to support the extension of these laws to more general settings such as an asymptotic de Sitter horizon or a Rindler horizon (the event horizon of an asymptotic uniformly accelerated observer). In the present paper we review the results that have been previously established and argue that the laws of black hole thermodynamics, as well as their underlying statistical mechanical content, (...)
    10 citations
  29. Quantum Mutual Entropy Defined by Liftings.Satoshi Iriyama & Masanori Ohya - 2011 - Foundations of Physics 41 (3):406-413.
    A lifting is a map from the state of a system to that of a compound system, which was introduced in Accardi and Ohya (Appl. Math. Optim. 39:33–59, 1999). The lifting can be applied to various physical processes. In this paper, we defined a quantum mutual entropy by the lifting. The usual quantum mutual entropy satisfies the Shannon inequality (Ohya in IEEE Trans. Inf. Theory 29(5):770–774, 1983), but the mutual entropy defined through the lifting does not satisfy this (...)
  30. On Entropy of Quantum Compound Systems.Noboru Watanabe - 2015 - Foundations of Physics 45 (10):1311-1329.
    We review some notions for general quantum entropies. The entropy of compound systems is discussed, and a numerical computation of the quantum dynamical systems is carried out for the noisy optical channel.
  31. Entropy in operational statistics and quantum logic.Carl A. Hein - 1979 - Foundations of Physics 9 (9-10):751-786.
    In a series of recent papers, Randall and Foulis have developed a generalized theory of probability (operational statistics) which is based on the notion of a physical operation. They have shown that the quantum logic description of quantum mechanics can be naturally imbedded into this generalized theory of probability. In this paper we shall investigate the role of entropy (in the sense of Shannon's theory of information) in operational statistics. We shall find that there are several related entropy (...)
    1 citation
  32. Entropy as the main justification for research in medical ethics.Marie-France Mamzer, Christophe Tresallet, Louis Pantel & Alban Zarzavadjian Le Bian - 2022 - Philosophy, Ethics, and Humanities in Medicine 17 (1):1-2.
    Ethics is an unconventional field of research for a surgeon, as ethics in surgery has several specificities and surgery is considered an aggressive specialty. Therefore, the interest of research in medical ethics is sometimes unclear. In this short essay, we discuss the interest of research in medical ethics using a comparison to thermodynamics and, mainly, entropy. During the transformation of a system from one state to another, some energy is released or absorbed; yet, a part of this energy is wasted (...)
  33. Computing the topological entropy of shifts.Christoph Spandl - 2007 - Mathematical Logic Quarterly 53 (4):493-510.
    Different characterizations of classes of shift dynamical systems via labeled digraphs, languages, and sets of forbidden words are investigated. The corresponding naming systems are analyzed according to reducibility and particularly with regard to the computability of the topological entropy relative to the presented naming systems. It turns out that all examined natural representations separate into two equivalence classes and that the topological entropy is not computable in general with respect to the defined natural representations. However, if a specific (...)
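For orientation, the quantity whose computability is at issue here is the topological entropy of a shift space X, standardly defined from its language (the set of admissible finite words):

```latex
h_{\mathrm{top}}(X) \;=\; \lim_{n \to \infty} \frac{1}{n} \log \big| \mathcal{L}_n(X) \big|
```

where \mathcal{L}_n(X) is the set of words of length n occurring in points of X; the limit exists by subadditivity of n \mapsto \log|\mathcal{L}_n(X)|.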
  34. Entropy and Evolution.J. Johnstone - 1932 - Philosophy 7 (27):287-298.
    The conception of entropy came from the theoretical study of the steam-engine, and, neglecting refinements of mathematical treatment, can easily be illustrated. The motive power of the engine comes from the transformation of the heat generated by the combustion of the fuel. The mechanism transforms the kinetic energy of the molecules of heated steam into the kinetic energy of the parts of the engine. Consider a steamship under way: for a brief period of time this can be regarded as (...)
  35. Maximum Entropy Inference with Quantified Knowledge.Owen Barnett & Jeff Paris - 2008 - Logic Journal of the IGPL 16 (1):85-98.
    We investigate uncertain reasoning with quantified sentences of the predicate calculus treated as the limiting case of maximum entropy inference applied to finite domains.
    9 citations
  36. Choosing a Definition of Entropy that Works.Robert H. Swendsen - 2012 - Foundations of Physics 42 (4):582-593.
    Disagreements over the meaning of the thermodynamic entropy and how it should be defined in statistical mechanics have endured for well over a century. In an earlier paper, I showed that there were at least nine essential properties of entropy that are still under dispute among experts. In this paper, I examine the consequences of differing definitions of the thermodynamic entropy of macroscopic systems. Two proposed definitions of entropy in classical statistical mechanics are (1) defining entropy (...)
    35 citations
  37. Information, entropy and inductive logic.S. Pakswer - 1954 - Philosophy of Science 21 (3):254-259.
    It has been shown by several authors that in operations involving information a quantity appears which is the negative of the quantity usually defined as entropy in similar situations. This quantity ℜ = − KI has been termed “negentropy” and it has been shown that the negentropy of information and the physical entropy S are mirrorlike representations of the same train of events. In physical terminology the energy is degraded by an increase in entropy due to an (...)
  38. Economics, Entropy and the Long Term Future: Conceptual Foundations and the Perspective of the Economics of Survival.Charles C. Mueller - 2001 - Environmental Values 10 (3):361-384.
    The present paper is a survey of the economics of survival, a branch of ecological economics that stresses the preservation of the opportunities of future generations over an extended time horizon. It outlines the main analytical foundation of the branch, in which the concept of entropy is a major building block, and its analysis of the interaction between the economic system and the environment. Regarding its outlook on the future, we see that the founders of the branch (...)
    1 citation
  39. How does the Entropy/information Bound Work?Jacob D. Bekenstein - 2005 - Foundations of Physics 35 (11):1805-1823.
    According to the universal entropy bound, the entropy (and hence information capacity) of a complete weakly self-gravitating physical system can be bounded exclusively in terms of its circumscribing radius and total gravitating energy. The bound’s correctness is supported by explicit statistical calculations of entropy, gedanken experiments involving the generalized second law, and Bousso’s covariant holographic bound. On the other hand, it is not always obvious in a particular example how the system avoids having too many states for (...)
    5 citations
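The universal entropy bound referred to in this abstract is usually quoted as

```latex
S \;\le\; \frac{2\pi k_B\, R\, E}{\hbar c}
```

where R is the circumscribing radius and E the total gravitating energy of the weakly self-gravitating system.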
  40. Entropy-Driven Global Best Selection in Particle Swarm Optimization for Many-Objective Software Package Restructuring.Amarjeet Prajapati, Anshu Parashar, Sunita & Alok Mishra - 2021 - Complexity 2021:1-11.
    Many real-world optimization problems require a large number of conflicting objectives to be optimized simultaneously to obtain a solution. It has been observed that these kinds of many-objective optimization problems (MaOPs) often pose several performance challenges to traditional multi-objective optimization algorithms. To address the performance issues caused by the different types of MaOPs, a variety of many-objective particle swarm optimization (MaOPSO) algorithms have recently been proposed. However, external archive maintenance and the selection of leaders when designing MaOPSO for real-world MaOPs are still (...)
  41. Further on informational quanta, interactions, and entropy under the granular view of value formation.Quan-Hoang Vuong & Minh-Hoang Nguyen - 2024 - SSRN.
    A recent study suggests that value and quantum states seem to be governed by the same underlying mechanisms. In our recent book titled "Better economics for the Earth: A lesson from quantum and information theories," specifically Chapter 5, we have proposed an informational entropy-based notion of value, drawing on the granular worldview and primary features of quantum mechanics, Shannon’s information theory, and the mindsponge theory. Specifically, the notion suggests that values are created through the interactions of information. But how (...)
    11 citations
  42. Entropy and Vacuum Radiation.Jean E. Burns - 1998 - Foundations of Physics 28 (7):1191-1207.
    It is shown that entropy increase in thermodynamic systems can plausibly be accounted for by the random action of vacuum radiation. A recent calculation by Rueda using stochastic electrodynamics (SED) shows that vacuum radiation causes a particle to undergo a rapid Brownian motion about its average dynamical trajectory. It is shown that the magnitude of spatial drift calculated by Rueda can also be predicted by assuming that the average magnitudes of random shifts in position and momentum of a particle (...)
    2 citations
  43. Boltzmann entropy for dense fluids not in local equilibrium.Sheldon Goldstein - manuscript
    Using computer simulations, we investigate the time evolution of the (Boltzmann) entropy of a dense fluid not in local equilibrium. The macrovariables M describing the system are the (empirical) particle density f = {f(x,v)} and the total energy E. We find that S(f_t, E) is monotonically increasing in time even when its kinetic part is decreasing. We argue that for isolated Hamiltonian systems monotonicity of S(M_t) = S(M_{X_t}) should hold generally for “typical” (the overwhelming majority of) initial microstates (phase (...)
    1 citation
  44. On Entropy Production in the Madelung Fluid and the Role of Bohm’s Potential in Classical Diffusion.Eyal Heifetz, Roumen Tsekov, Eliahu Cohen & Zohar Nussinov - 2016 - Foundations of Physics 46 (7):815-824.
    The Madelung equations map the non-relativistic time-dependent Schrödinger equation into hydrodynamic equations of a virtual fluid. While the von Neumann entropy remains constant, we demonstrate that an increase of the Shannon entropy, associated with this Madelung fluid, is proportional to the expectation value of its velocity divergence. Hence, the Shannon entropy may grow due to an expansion of the Madelung fluid. These effects result from the interference between solutions of the Schrödinger equation. Growth of the Shannon (...) due to expansion is common in diffusive processes. However, in the latter the process is irreversible while the processes in the Madelung fluid are always reversible. The relations between interference, compressibility and variation of the Shannon entropy are then examined in several simple examples. Furthermore, we demonstrate that for classical diffusive processes, the “force” accelerating diffusion has the form of the positive gradient of the quantum Bohm potential. Expressing then the diffusion coefficient in terms of the Planck constant reveals the lower bound given by the Heisenberg uncertainty principle in terms of the product between the gas mean free path and the Brownian momentum.
    2 citations
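The proportionality stated in the abstract can be made explicit with a short calculation; this is only a sketch, using the continuity equation for the Madelung density \rho and velocity field \vec{u} and assuming boundary terms vanish:

```latex
S = -\int \rho \ln \rho \; d^3x, \qquad \partial_t \rho + \nabla\!\cdot(\rho\,\vec{u}) = 0
\;\;\Longrightarrow\;\;
\frac{dS}{dt} = \int \rho\, \nabla\!\cdot\vec{u}\; d^3x = \langle \nabla\!\cdot\vec{u} \rangle
```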
  45. Striving, entropy, and meaning.J. S. Russell - 2020 - Journal of the Philosophy of Sport 47 (3):419-437.
    This paper argues that striving is a cardinal virtue in sport and life. It is an overlooked virtue that is an important component of human happiness and a source of a sense of dignity. The human psychological capacity for striving emerged as a trait for addressing the entropic features of our existence, but it can be engaged and used for other purposes. Sport is one such example. Sport appears exceptional in being designed specifically to test and display our capacities (...)
    6 citations
  46. Entropy and the Direction of Time.Jerzy Gołosz - 2021 - Entropy 23 (4):388.
    The paper tries to demonstrate that the process of the increase of entropy does not explain the asymmetry of time itself because it is unable to account for its fundamental asymmetries, that is, the asymmetry of traces (we have traces of the past and no traces of the future), the asymmetry of causation (we have an impact on future events with no possibility of having an impact on the past), and the asymmetry between the fixed past and the open (...)
    1 citation
  47. Algorithmic entropy of sets.J. T. Schwartz - unknown
    In a previous paper a theory of program size formally identical to information theory was developed. The entropy of an individual finite object was defined to be the size in bits of the smallest program for calculating it. It was shown that this is − log2 of the probability that the object is obtained by means of a program whose successive bits are chosen by flipping an unbiased coin. Here a theory of the entropy of recursively enumerable sets (...)
     
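In symbols, the two characterisations mentioned in this abstract (program-size entropy and the coin-flipping probability) are related as follows for a universal prefix machine U; the O(1) statement is the usual coding-theorem form:

```latex
H(x) = \min\{\, |p| : U(p) = x \,\}, \qquad
P(x) = \sum_{p \,:\, U(p) = x} 2^{-|p|}, \qquad
H(x) = -\log_2 P(x) + O(1)
```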
  48. Increasing entropy reduces perceived numerosity throughout the lifespan.Chuyan Qu, Nicholas K. DeWind & Elizabeth M. Brannon - 2022 - Cognition 225 (C):105096.
  49. Evolution, Entropie und Reizsamkeit. Naturwissenschaftliche Kategorien im Lamprecht-Streit [Evolution, Entropy and Excitability: Categories from the Natural Sciences in the Lamprecht Controversy].Thomas Mergel - 2003 - In Dagmar Stegmüller, Christian Mehr & Ulrich Muhlack (eds.), Historisierung Und Gesellschaftlicher Wandel in Deutschland Im 19. Jahrhundert. Akademie Verlag. pp. 211-228.
  50. Probability Description and Entropy of Classical and Quantum Systems.Margarita A. Man’ko & Vladimir I. Man’ko - 2011 - Foundations of Physics 41 (3):330-344.
    The tomographic approach to describing both the states in classical statistical mechanics and the states in quantum mechanics using fair probability distributions is reviewed. The entropy associated with the probability distribution (tomographic entropy) for classical and quantum systems is studied. The experimental possibility of checking inequalities like the position–momentum uncertainty relations and entropic uncertainty relations is considered.
    1 citation
1–50 of 984