Results for 'maximum entropy production'

974 found
  1. Maximum power and maximum entropy production: finalities in nature. Stanley Salthe - 2010 - Cosmos and History 6 (1):114-121.
    I begin with the definition of power, and find that it is finalistic inasmuch as work directs energy dissipation in the interests of some system. The maximum power principle of Lotka and Odum implies an optimal energy efficiency for any work; optima are also finalities. I advance a statement of the maximum entropy production principle, suggesting that most work of dissipative structures is carried out at rates entailing energy flows faster than those that would associate with (...)
    6 citations
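    As a point of reference for this and several later entries, a common textbook formulation of entropy production and of the maximum entropy production principle (a standard rendering from non-equilibrium thermodynamics, not necessarily the statement Salthe advances) is:
    \[
    \sigma \;=\; \frac{d_i S}{dt} \;=\; \sum_k J_k X_k \;\ge\; 0,
    \]
    where the \(J_k\) are thermodynamic fluxes (heat flow, diffusion, reaction rates) and the \(X_k\) their conjugate forces. On the usual reading of the principle, among the steady states compatible with the imposed constraints, a far-from-equilibrium system tends toward the one with the largest \(\sigma\).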
  2. Is the maximum entropy production just a heuristic principle? Metaphysics on natural determination. Javier Sánchez-Cañizares - 2023 - Synthese 201 (4):1-15.
    The Maximum Entropy Production Principle (MEPP) stands out as an overarching principle that rules life phenomena in Nature. However, its explanatory power beyond heuristics remains controversial. On the one hand, the MEPP has been successfully applied principally to non-living systems far from thermodynamic equilibrium. On the other hand, the underlying assumptions to lay the MEPP’s theoretical foundations and range of applicability increase the possibilities of conflicting interpretations. More interestingly, from a metaphysical stance, the MEPP’s philosophical status is (...)
  3. Noise induced phase transition between maximum entropy production structures and minimum entropy production structures? Alfred Hubler, Andrey Belkin & Alexey Bezryadin - 2015 - Complexity 20 (3):8-11.
  4. Predictive Statistical Mechanics and Macroscopic Time Evolution: Hydrodynamics and Entropy Production. Domagoj Kuić - 2016 - Foundations of Physics 46 (7):891-914.
    In the previous papers, it was demonstrated that applying the principle of maximum information entropy by maximizing the conditional information entropy, subject to the constraint given by the Liouville equation averaged over the phase space, leads to a definition of the rate of entropy change for closed Hamiltonian systems without any additional assumptions. Here, we generalize this basic model and, with the introduction of the additional constraints which are equivalent to the hydrodynamic continuity equations, show that (...)
  5. Quantum Model of Classical Mechanics: Maximum Entropy Packets. [REVIEW] P. Hájíček - 2009 - Foundations of Physics 39 (9):1072-1096.
    In a previous paper, a statistical method of constructing quantum models of classical properties has been described. The present paper concludes the description by turning to classical mechanics. The quantum states that maximize entropy for given averages and variances of coordinates and momenta are called ME packets. They generalize the Gaussian wave packets. A non-trivial extension of the partition-function method of probability calculus to quantum mechanics is given. Non-commutativity of quantum variables limits its usefulness. Still, the general form of (...)
    4 citations
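    The remark that ME packets generalize Gaussian wave packets parallels a standard classical fact, sketched here for orientation (this is the classical analogue, not Hájíček's quantum construction): among all densities with given mean and variance, the Gaussian maximizes the differential entropy,
    \[
    \max_{\rho}\; -\!\int \rho(x)\ln\rho(x)\,dx
    \quad\text{s.t.}\quad \int\!\rho=1,\;\; \int\! x\,\rho=\mu,\;\; \int\!(x-\mu)^2\rho=\sigma^2
    \;\;\Longrightarrow\;\;
    \rho(x)=\frac{1}{\sqrt{2\pi\sigma^2}}\,e^{-(x-\mu)^2/2\sigma^2}.
    \]
    The quantum version of this maximization, over states with fixed averages and variances of coordinates and momenta, is what the paper develops.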
  6. Beyond fatalism: Gaia, entropy, and the autonomy of anthropogenic life on Earth. Alejandro Merlo & Xabier E. Barandiaran - 2024 - Ethics in Science and Environmental Politics 24:61-75.
    The current disruption of ecosystems and climate systems can be likened to an increase in entropy within our planet. This concept is often linked to the second law of thermodynamics, which predicts a necessary rise in entropy resulting from all material and energy-related processes, including the intricate organisation of living systems. Consequently, discussions surrounding the ongoing crisis commonly carry an underlying sense of fatalism when referencing thermodynamic principles. In this study, we explore how the understanding of life has (...)
  7. The Universe Consists of Charge and Entropy: Mind and Matter Are Negative Entropy. James A. Morris - 2025 - Open Journal of Philosophy 15 (1):64-76.
    Information is the reduction of uncertainty, and uncertainty is equivalent to entropy. Thus, information is negative entropy. Concepts from information theory can be used to analyse physiological and psychological processes and to probe the nature of abstract ideas such as consciousness and vitality. A previously published model of the Universe indicates that it consists of charge and entropy. Since entropy is energy, charge and energy are enough to explain the whole of nature. The building blocks (...)
  8. Selection Is Entailed by Self-Organization and Natural Selection Is a Special Case. Rod Swenson - 2010 - Biological Theory 5 (2):167-181.
    In their book, Darwinism Evolving: Systems Dynamics and the Genealogy of Natural Selection, Depew and Weber argued for the need to address the relationship between self-organization and natural selection in evolutionary theory, and focused on seven “visions” for doing so. Recently, Batten et al. in a paper in this journal, entitled “Visions of evolution: self-organization proposes what natural selection disposes,” picked up the issue with the work of Depew and Weber as a starting point. While the efforts of both sets (...)
    5 citations
  9. On the naturalisation of teleology: self-organisation, autopoiesis and teleodynamics. Miguel Garcia-Valdecasas - 2022 - Adaptive Behavior 30 (2):103-117.
    In recent decades, several theories have claimed to explain the teleological causality of organisms as a function of self-organising and self-producing processes. The most widely cited theories of this sort are variations of autopoiesis, originally introduced by Maturana and Varela. More recent modifications of autopoietic theory have focused on system organisation, closure of constraints and autonomy to account for organism teleology. This article argues that the treatment of teleology in autopoiesis and other organisation theories is inconclusive for three reasons: First, (...)
    4 citations
  10. Spirit. Eric Steinhart - 2017 - Sophia 56 (4):557-571.
    Many religions and religious philosophies say that ultimate reality is a kind of primal energy. This energy is often described as a vital power animating living things, as a spiritual force directing the organization of matter, or as a divine creative power which generates all things. By refuting older conceptions of primal energy, modern science opens the door to new and more precise conceptions. Primal energy is referred to here as ‘spirit’. But spirit is a natural power. A naturalistic theory (...)
    1 citation
  11. In search of a reconciliation between semiotics, thermodynamics and metasystem transition theory. Vefa Karatay & Yagmur Denizhan - 2005 - Axiomathes 15 (1):47-61.
    The disciplines of cybernetics, semiotics and thermodynamics investigate evolutionary processes quite independently from each other. The aim of this paper is to draw the parallels and point out the possibility and necessity of a reconciliation between these disciplines. The concept of metasystem transition has been proposed by Turchin as a quantum of evolution from a cybernetic point of view. Semiotic processes are of prime importance for the realisation of metasystem transitions in the course of evolution. From a thermodynamic point of (...)
  12. Maximum Entropy and Probability Kinematics Constrained by Conditionals. Stefan Lukits - 2015 - Entropy 17 (4):1690-1700.
    Two open questions of inductive reasoning are solved: (1) does the principle of maximum entropy (pme) give a solution to the obverse Majerník problem; and (2) is Wagner correct when he claims that Jeffrey’s updating principle (jup) contradicts pme? Majerník shows that pme provides unique and plausible marginal probabilities, given conditional probabilities. The obverse problem posed here is whether pme also provides such conditional probabilities, given certain marginal probabilities. The theorem developed to solve the obverse Majerník problem demonstrates (...)
    1 citation
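    A minimal illustration of the kind of determination at issue (a toy case, not Majerník's or Lukits' examples): if the constraints fix only the marginals of two variables, pme selects the independent joint distribution, so the induced conditionals simply reproduce the marginals,
    \[
    \max_{p}\; -\!\sum_{x,y} p(x,y)\ln p(x,y)
    \quad\text{s.t.}\quad \sum_y p(x,y)=m_1(x),\;\; \sum_x p(x,y)=m_2(y)
    \;\;\Longrightarrow\;\;
    p(x,y)=m_1(x)\,m_2(y),
    \]
    and hence \(p(y\mid x)=m_2(y)\). The obverse Majerník problem asks whether pme also supplies plausible conditional probabilities when marginal probabilities are given; the toy case above is its simplest instance.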
  13. In defense of the maximum entropy inference process. J. Paris & A. Vencovská - 1997 - International Journal of Approximate Reasoning 17 (1):77-103.
    This paper is a sequel to an earlier result of the authors that in making inferences from certain probabilistic knowledge bases the maximum entropy inference process, ME, is the only inference process respecting “common sense.” This result was criticized on the grounds that the probabilistic knowledge bases considered are unnatural and that ignorance of dependence should not be identified with statistical independence. We argue against these criticisms and also against the more general criticism that ME is representation dependent. (...)
    24 citations
  14. An ecological approach to biosystem thermodynamics. Lionel Johnson - 1992 - Biology and Philosophy 7 (1):35-60.
    The general attributes of ecosystems are examined and a naturally occurring reference ecosystem is established, comparable with the isolated system of classical thermodynamics. Such an autonomous system with a stable, periodic input of energy is shown to assume certain structural characteristics that have an identifiable thermodynamic basis. Individual species tend to assume a state of least dissipation; this is most clearly evident in the dominant species (the species with the best integration of energy acquisition and conservation). It is concluded that (...)
    4 citations
  15. Maximum Entropy Inference with Quantified Knowledge. Owen Barnett & Jeff Paris - 2008 - Logic Journal of the IGPL 16 (1):85-98.
    We investigate uncertain reasoning with quantified sentences of the predicate calculus treated as the limiting case of maximum entropy inference applied to finite domains.
    9 citations
  16. Why are Normal Distributions Normal? Aidan Lyon - 2014 - British Journal for the Philosophy of Science 65 (3):621-649.
    It is usually supposed that the central limit theorem explains why various quantities we find in nature are approximately normally distributed—people's heights, examination grades, snowflake sizes, and so on. This sort of explanation is found in many textbooks across the sciences, particularly in biology, economics, and sociology. Contrary to this received wisdom, I argue that in many cases we are not justified in claiming that the central limit theorem explains why a particular quantity is normally distributed, and that in some (...)
    13 citations
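    For orientation, the received explanation Lyon challenges is the central limit theorem; in a standard textbook form: if \(X_1,\dots,X_n\) are independent and identically distributed with mean \(\mu\) and finite variance \(\sigma^2\), then
    \[
    \frac{1}{\sqrt{n}}\sum_{i=1}^{n}\frac{X_i-\mu}{\sigma} \;\xrightarrow{\;d\;}\; \mathcal{N}(0,1)
    \qquad\text{as } n\to\infty .
    \]
    A different route to normality, relevant to this search though not necessarily Lyon's own proposal, is the maximum entropy characterization: the normal distribution maximizes entropy among all distributions with a given mean and variance.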
  17. Connecting De Donder’s equation with the differential changes of thermodynamic potentials: understanding thermodynamic potentials. Mihalj Poša - 2024 - Foundations of Chemistry 26 (2):275-290.
    The new mathematical connection of De Donder’s differential entropy production with the differential changes of thermodynamic potentials (Helmholtz free energy, enthalpy, and Gibbs free energy) was obtained through the linear sequence of equations (direct, straightforward path), in which we use rigorous thermodynamic definitions of the partial molar thermodynamic properties. This new connection uses a global approach to the problem of reversibility and irreversibility, which is vital to global learners’ view and standardizes the linking procedure for thermodynamic potentials (Helmholtz (...)
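    A compressed version of the kind of linkage the abstract describes, using standard relations rather than the paper's own derivation: for a single reaction with affinity \(A\) and extent of reaction \(\xi\), De Donder's entropy production connects to the Gibbs free energy at constant \(T\) and \(p\) via
    \[
    d_iS=\frac{A}{T}\,d\xi \ge 0,
    \qquad
    A=-\left(\frac{\partial G}{\partial \xi}\right)_{T,p},
    \qquad\text{so}\qquad
    (dG)_{T,p}=-A\,d\xi=-T\,d_iS\le 0,
    \]
    with analogous statements for the Helmholtz free energy at constant \(T, V\) and for the enthalpy at constant \(S, p\).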
  18. Maximum entropy inference as a special case of conditionalization. Brian Skyrms - 1985 - Synthese 63 (1):55-74.
  19. Can the maximum entropy principle be explained as a consistency requirement? Jos Uffink - 1995 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 26 (3):223-261.
    The principle of maximum entropy is a general method to assign values to probability distributions on the basis of partial information. This principle, introduced by Jaynes in 1957, forms an extension of the classical principle of insufficient reason. It has been further generalized, both in mathematical formulation and in intended scope, into the principle of maximum relative entropy or of minimum information. It has been claimed that these principles are singled out as unique methods of statistical (...)
    28 citations
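    For concreteness, the method Uffink scrutinizes is usually presented as a constrained optimization; a minimal sketch, agnostic about the foundational questions the paper raises:
    \[
    \max_{p}\; H(p)=-\sum_i p_i\ln p_i
    \quad\text{s.t.}\quad \sum_i p_i=1,\;\; \sum_i p_i f(x_i)=F
    \;\;\Longrightarrow\;\;
    p_i=\frac{e^{-\lambda f(x_i)}}{\sum_j e^{-\lambda f(x_j)}},
    \]
    with the multiplier \(\lambda\) fixed by the constraint. With no constraint beyond normalization the maximum is the uniform distribution, which is the sense in which the method extends the principle of insufficient reason; the generalization to maximum relative entropy (minimum information) replaces \(H(p)\) by \(-\sum_i p_i\ln(p_i/q_i)\) relative to a prior \(q\).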
  20. Probabilistic stability, AGM revision operators and maximum entropy. Krzysztof Mierzewski - 2020 - Review of Symbolic Logic:1-38.
    Several authors have investigated the question of whether canonical logic-based accounts of belief revision, and especially the theory of AGM revision operators, are compatible with the dynamics of Bayesian conditioning. Here we show that Leitgeb's stability rule for acceptance, which has been offered as a possible solution to the Lottery paradox, makes it possible to bridge AGM revision and Bayesian update: using the stability rule, we prove that AGM revision operators emerge from Bayesian conditioning by an application of the principle of maximum entropy. In situations of information loss, or whenever the agent relies on a qualitative description of her information state - such as a plausibility ranking over hypotheses, or a belief set - the dynamics of AGM belief revision are compatible with Bayesian conditioning; indeed, through the maximum entropy principle, conditioning naturally generates AGM revision operators. This mitigates an impossibility theorem of Lin and Kelly for tracking Bayesian conditioning with AGM revision, and suggests an approach to the compatibility problem that highlights the information loss incurred by acceptance rules in passing from probabilistic to qualitative representations of beliefs.
    4 citations
  21. Causal versions of maximum entropy and principle of insufficient reason. Dominik Janzing - 2021 - Journal of Causal Inference 9 (1):285-301.
    The principle of insufficient reason (PIR) assigns equal probabilities to each alternative of a random experiment whenever there is no reason to prefer one over the other. The maximum entropy principle generalizes PIR to the case where statistical information such as expectations is given. It is known that both principles result in paradoxical probability updates for joint distributions of cause and effect. This is because constraints on the conditional P(effect | cause) result in changes of the marginal P(cause) that assign higher probability (...)
  22. The principle of maximum entropy and a problem in probability kinematics. Stefan Lukits - 2014 - Synthese 191 (7):1-23.
    Sometimes we receive evidence in a form that standard conditioning (or Jeffrey conditioning) cannot accommodate. The principle of maximum entropy (MAXENT) provides a unique solution for the posterior probability distribution based on the intuition that the information gain consistent with assumptions and evidence should be minimal. Opponents of objective methods to determine these probabilities prominently cite van Fraassen’s Judy Benjamin case to undermine the generality of maxent. This article shows that an intuitive approach to Judy Benjamin’s case supports (...)
    3 citations
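    The 'minimal information gain' intuition mentioned here is usually formalized as minimization of relative entropy; a sketch of the update rule in its standard form, with the specifics of the Judy Benjamin case omitted:
    \[
    p^{*}=\arg\min_{q\in\mathcal{C}} D(q\,\|\,p)=\arg\min_{q\in\mathcal{C}}\sum_i q_i\ln\frac{q_i}{p_i},
    \]
    where \(p\) is the prior and \(\mathcal{C}\) the set of distributions compatible with the evidence. When \(\mathcal{C}\) fixes the probabilities of the cells of a partition, the solution coincides with Jeffrey conditioning; the contested cases, Judy Benjamin among them, impose constraints on conditional probabilities instead.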
  23. Maximum-entropy spectral analysis of extended energy-loss fine structure and its application to time-resolved measurement. Shunsuke Muto - 2004 - Philosophical Magazine 84 (25-26):2793-2808.
  24. Entropy production during interdiffusion under internal stress. Bartek Wierzba & Marek Danielewski - 2011 - Philosophical Magazine 91 (24):3228-3241.
  25. Objective Bayesianism and the maximum entropy principle. Jürgen Landes & Jon Williamson - 2013 - Entropy 15 (9):3528-3591.
    Objective Bayesian epistemology invokes three norms: the strengths of our beliefs should be probabilities, they should be calibrated to our evidence of physical probabilities, and they should otherwise equivocate sufficiently between the basic propositions that we can express. The three norms are sometimes explicated by appealing to the maximum entropy principle, which says that a belief function should be a probability function, from all those that are calibrated to evidence, that has maximum entropy. However, the three (...)
    19 citations
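    The three norms admit a compact schematic statement (the notation here follows the abstract's description rather than the paper's own formalism): the belief function \(P\) should satisfy
    \[
    P\in\mathbb{P}\ \ (\text{Probability}),
    \qquad
    P\in\mathbb{E}\subseteq\mathbb{P}\ \ (\text{Calibration}),
    \qquad
    P\in\arg\max_{Q\in\mathbb{E}}\Big(-\sum_{\omega}Q(\omega)\ln Q(\omega)\Big)\ \ (\text{Equivocation}),
    \]
    where \(\mathbb{P}\) is the set of probability functions over the basic propositions and \(\mathbb{E}\) the subset calibrated to the evidence.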
  26. Common sense and maximum entropy. Jeff Paris - 1998 - Synthese 117 (1):75-93.
    This paper concerns the question of how to draw inferences common sensically from uncertain knowledge. Since the early work of Shore and Johnson (1980), Paris and Vencovská (1990), and Csiszár (1989), it has been known that the Maximum Entropy Inference Process is the only inference process which obeys certain common sense principles of uncertain reasoning. In this paper we consider the present status of this result and argue that within the rather narrow context in which we work this (...)
    20 citations
  27. Model and Simulation of Maximum Entropy Phrase Reordering of English Text in Language Learning Machine. Weifang Wu - 2020 - Complexity 2020:1-9.
    This paper proposes a feature extraction algorithm based on the maximum entropy phrase reordering model in statistical machine translation in language learning machines. The algorithm can extract more accurate phrase reordering information, especially the feature information of reversed phrases, which solves the problem of imbalance of feature data during maximum entropy training in the original algorithm and improves the accuracy of phrase reordering in translation. In the experiments, the extracted features were combined with linguistic features such as parts (...)
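    As background for readers unfamiliar with the model class: a maximum entropy classifier over reordering decisions is, mathematically, a multinomial logistic regression trained on indicator features of the phrase pair. The sketch below is a generic, self-contained illustration of that model under toy assumptions (hypothetical feature names and data; it is not the paper's algorithm or feature set):

        import numpy as np

        # Toy data for a binary reordering decision: label 0 = monotone, 1 = swapped.
        # Columns are hypothetical indicator features of a phrase pair
        # (e.g. "first phrase ends in adjective", "second phrase is short", ...).
        X = np.array([[1, 0, 1], [1, 1, 0], [0, 1, 1],
                      [0, 0, 1], [1, 1, 1], [0, 1, 0]], dtype=float)
        y = np.array([0, 1, 1, 0, 1, 0])

        def softmax(z):
            z = z - z.max(axis=1, keepdims=True)   # numerical stability
            e = np.exp(z)
            return e / e.sum(axis=1, keepdims=True)

        def train_maxent(X, y, n_classes=2, lr=0.5, epochs=500, l2=1e-3):
            """Fit a maximum entropy (multinomial logistic) model by gradient
            ascent on the L2-regularized conditional log-likelihood."""
            n, d = X.shape
            W = np.zeros((d, n_classes))
            Y = np.eye(n_classes)[y]                # one-hot labels
            for _ in range(epochs):
                P = softmax(X @ W)                  # model p(class | features)
                grad = X.T @ (Y - P) / n - l2 * W   # observed minus expected counts
                W += lr * grad
            return W

        W = train_maxent(X, y)
        print("p(swap | features):", np.round(softmax(X @ W)[:, 1], 2))

    In the paper's setting the features would instead be extracted from aligned bilingual phrases, but the training objective is the same maximum entropy criterion.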
  28. Analysis of the maximum entropy principle “debate”. John F. Cyranski - 1978 - Foundations of Physics 8 (5-6):493-506.
    Jaynes's maximum entropy principle (MEP) is analyzed by considering in detail a recent controversy. Emphasis is placed on the inductive logical interpretation of “probability” and the concept of “total knowledge.” The relation of the MEP to relative frequencies is discussed, and a possible realm of its fruitful application is noted.
    4 citations
  29. On Entropy Production in the Madelung Fluid and the Role of Bohm’s Potential in Classical Diffusion. Eyal Heifetz, Roumen Tsekov, Eliahu Cohen & Zohar Nussinov - 2016 - Foundations of Physics 46 (7):815-824.
    The Madelung equations map the non-relativistic time-dependent Schrödinger equation into hydrodynamic equations of a virtual fluid. While the von Neumann entropy remains constant, we demonstrate that an increase of the Shannon entropy, associated with this Madelung fluid, is proportional to the expectation value of its velocity divergence. Hence, the Shannon entropy may grow due to an expansion of the Madelung fluid. These effects result from the interference between solutions of the Schrödinger equation. Growth of the Shannon entropy due to expansion is common in diffusive processes. However, in the latter the process is irreversible while the processes in the Madelung fluid are always reversible. The relations between interference, compressibility and variation of the Shannon entropy are then examined in several simple examples. Furthermore, we demonstrate that for classical diffusive processes, the “force” accelerating diffusion has the form of the positive gradient of the quantum Bohm potential. Expressing the diffusion coefficient in terms of the Planck constant then reveals the lower bound given by the Heisenberg uncertainty principle in terms of the product between the gas mean free path and the Brownian momentum.
    2 citations
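    The stated proportionality between Shannon entropy growth and velocity divergence follows from the continuity equation alone; a short derivation consistent with the abstract (boundary terms assumed to vanish at infinity):
    \[
    S=-\!\int\!\rho\ln\rho\,d^3x,\qquad \partial_t\rho+\nabla\!\cdot(\rho\mathbf{v})=0
    \;\;\Longrightarrow\;\;
    \frac{dS}{dt}=-\!\int\!\partial_t\rho\,(\ln\rho+1)\,d^3x
    =-\!\int\!\mathbf{v}\cdot\nabla\rho\,d^3x
    =\int\!\rho\,\nabla\!\cdot\mathbf{v}\,d^3x=\langle\nabla\!\cdot\mathbf{v}\rangle .
    \]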
  30. Maximum Entropy Applied to Inductive Logic and Reasoning. Jürgen Landes & Jon Williamson (eds.) - 2015 - Ludwig-Maximilians-Universität München.
    This editorial explains the scope of the special issue and provides a thematic introduction to the contributed papers.
    1 citation
  31. Rumor Identification with Maximum Entropy in MicroNet. Suisheng Yu, Mingcai Li & Fengming Liu - 2017 - Complexity:1-8.
  32. The constraint rule of the maximum entropy principle. Jos Uffink - 1996 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 27 (1):47-79.
    The principle of maximum entropy is a method for assigning values to probability distributions on the basis of partial information. In usual formulations of this and related methods of inference one assumes that this partial information takes the form of a constraint on allowed probability distributions. In practical applications, however, the information consists of empirical data. A constraint rule is then employed to construct constraints on probability distributions out of these data. Usually one adopts the rule that equates (...)
    22 citations
  33. Bertrand's Paradox and the Maximum Entropy Principle. Nicholas Shackel & Darrell P. Rowbottom - 2019 - Philosophy and Phenomenological Research 101 (3):505-523.
    An important suggestion of objective Bayesians is that the maximum entropy principle can replace a principle which is known to get into paradoxical difficulties: the principle of indifference. No one has previously determined whether the maximum entropy principle is better able to solve Bertrand’s chord paradox than the principle of indifference. In this paper I show that it is not. Additionally, the course of the analysis brings to light a new paradox, a revenge paradox of the (...)
    2 citations
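    For context, the paradox at issue asks for the probability that a random chord of a circle is longer than the side of the inscribed equilateral triangle. Three natural parameterizations, each apparently licensed by the principle of indifference, give three answers (the standard presentation, not the paper's own analysis):
    \[
    P=\tfrac{1}{3}\ (\text{random endpoints on the circle}),\qquad
    P=\tfrac{1}{2}\ (\text{random distance of the chord from the centre}),\qquad
    P=\tfrac{1}{4}\ (\text{random midpoint in the disc}),
    \]
    and the question addressed is whether replacing indifference with the maximum entropy principle removes this underdetermination.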
  34. Application of the maximum entropy principle to nonlinear systems far from equilibrium. H. Haken - 1993 - In E. T. Jaynes, Walter T. Grandy & Peter W. Milonni (eds.), Physics and probability: essays in honor of Edwin T. Jaynes. New York: Cambridge University Press. pp. 239.
  35. The W systems: between maximum entropy and minimal ranking…. Michael Freund - 1994 - Journal of Applied Non-Classical Logics 4 (1):79-90.
  36. A Novel Chinese Entity Relationship Extraction Method Based on the Bidirectional Maximum Entropy Markov Model. Chengyao Lv, Deng Pan, Yaxiong Li, Jianxin Li & Zong Wang - 2021 - Complexity 2021:1-8.
    To identify relationships among entities in natural language texts, extraction of entity relationships technically provides a fundamental support for knowledge graph, intelligent information retrieval, and semantic analysis, promotes the construction of knowledge bases, and improves efficiency of searching and semantic analysis. Traditional methods of relationship extraction, either those proposed at the earlier times or those based on traditional machine learning and deep learning, have focused on keeping relationships and entities in their own silos: extracting relationships and entities are conducted in (...)
    1 citation
  37. Which forces reduce entropy production? Alfred Hubler - 2014 - Complexity 19 (5):6-7.
  38. Entropy production and lost work for some irreversible processes. F. Di Liberto - 2007 - Philosophical Magazine 87 (3-5):569-579.
  39. Dissipated energy and entropy production for an unconventional heat engine: the stepwise ‘circular cycle’. Francesco di Liberto, Raffaele Pastore & Fulvio Peruggi - 2011 - Philosophical Magazine 91 (13-15):1864-1876.
  40. Explaining default intuitions using maximum entropy. Rachel A. Bourne - 2003 - Journal of Applied Logic 1 (3-4):255-271.
  41. Vehicle Text Data Compression and Transmission Method Based on Maximum Entropy Neural Network and Optimized Huffman Encoding Algorithms. Jingfeng Yang, Zhenkun Zhang, Nanfeng Zhang, Ming Li, Yanwei Zheng, Li Wang, Yong Li, Ji Yang, Yifei Xiang & Yu Zhang - 2019 - Complexity 2019:1-9.
    1 citation
  42. Enriching the knowledge sources used in a maximum entropy part-of-speech tagger. Christopher Manning - manuscript
    Kristina Toutanova & Christopher D. Manning, Department of Computer Science and Department of Linguistics, Stanford University, Stanford, CA 94305-9040, USA.
    4 citations
  43. How to exploit parametric uniformity for maximum entropy reasoning in a relational probabilistic logic. Marc Finthammer & Christoph Beierle - 2012 - In Luis Farinas del Cerro, Andreas Herzig & Jerome Mengin (eds.), Logics in Artificial Intelligence. Springer. pp. 189-201.
  44. Microscopic Legendre Transform, Canonical Ensemble and Jaynes’ Maximum Entropy Principle. Ramandeep S. Johal - 2025 - Foundations of Physics 55 (1):1-13.
    Legendre transform between thermodynamic quantities such as the Helmholtz free energy and entropy plays a key role in the formulation of the canonical ensemble. In the standard treatment, the transform exchanges the independent variable from the system’s internal energy to its conjugate variable—the inverse temperature of the heat reservoir. In this article, we formulate a microscopic version of the transform between the free energy and Shannon entropy of the system, where the conjugate variables are the microstate probabilities and (...)
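    For orientation, the macroscopic transform the paper starts from, and the canonical-ensemble quantities it refers to, are the textbook relations (the paper's contribution is a microscopic version of the transform, not reproduced here):
    \[
    F=U-TS,\qquad
    p_i=\frac{e^{-\beta E_i}}{Z},\quad Z=\sum_i e^{-\beta E_i},\qquad
    F=-\frac{1}{\beta}\ln Z,\qquad
    S=-k_B\sum_i p_i\ln p_i,
    \]
    with \(\beta=1/k_BT\) the inverse temperature of the reservoir; Jaynes' maximum entropy principle recovers the canonical \(p_i\) by maximizing \(S\) subject to normalization and a fixed average energy \(\sum_i p_i E_i = U\).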
  45. The status of the principle of maximum entropy. Abner Shimony - 1985 - Synthese 63 (1):35-53.
  46. Combining probabilistic logic programming with the power of maximum entropy. Gabriele Kern-Isberner & Thomas Lukasiewicz - 2004 - Artificial Intelligence 157 (1-2):139-202.
  47. A fuzzy neuron based upon maximum entropy ordered weighted averaging. Michael O'Hagan - 1991 - In Bernadette Bouchon-Meunier, Ronald R. Yager & Lotfi A. Zadeh (eds.), Uncertainty in Knowledge Bases: 3rd International Conference on Information Processing and Management of Uncertainty in Knowledge-Based Systems, IPMU'90, Paris, France, July 2-6, 1990. Proceedings. Springer. pp. 598-609.
  48. A look back: Early applications of maximum entropy estimation to quantum statistical mechanics. D. J. Scalapino - 1993 - In E. T. Jaynes, Walter T. Grandy & Peter W. Milonni (eds.), Physics and probability: essays in honor of Edwin T. Jaynes. New York: Cambridge University Press. pp. 9.
  49. (1 other version) Life Defined in Terms of Entropy Production: 20th Century Physics Meets 21st Century Biology. Leonid M. Martyushev - 2020 - Bioessays 42 (9):2000101.
  50. Lost work, extra work and entropy production for a system with complexity: The stepwise ideal-gas Carnot cycle. F. di Liberto - 2008 - Philosophical Magazine 88 (33-35):4177-4187.
Showing 1–50 of 974.