Entropy and life
Research concerning the relationship between the thermodynamic quantity entropy and the evolution of life began around the turn of the 20th century. In 1910, the American historian Henry Adams printed and distributed to university libraries and history professors the small volume A Letter to American Teachers of History, proposing a theory of history based on the second law of thermodynamics and on the principle of entropy. The 1944 book What is Life? by Nobel laureate physicist Erwin Schrödinger stimulated further research in the field. In his book, Schrödinger originally stated that life feeds on negative entropy, or negentropy as it is sometimes called, but in a later edition corrected himself in response to complaints and stated that the true source is free energy. More recent work has restricted the discussion to Gibbs free energy, because biological processes on Earth normally occur at constant temperature and pressure, such as in the atmosphere or at the bottom of the ocean, but not across both over short periods of time for individual organisms.
In 1863, Rudolf Clausius published his noted memoir "On the Concentration of Rays of Heat and Light, and on the Limits of Its Action" wherein he outlined a preliminary relationship, as based on his own work and that of William Thomson (Lord Kelvin), between his newly developed concept of entropy and life. Building on this, one of the first to speculate on a possible thermodynamic perspective of evolution was the Austrian physicist Ludwig Boltzmann. In 1875, building on the works of Clausius and Kelvin, Boltzmann reasoned:
The general struggle for existence of animate beings is not a struggle for raw materials – these, for organisms, are air, water and soil, all abundantly available – nor for energy which exists in plenty in any body in the form of heat, but a struggle for [negative] entropy, which becomes available through the transition of energy from the hot sun to the cold earth.
In 1876, American civil engineer Richard Sears McCulloh, in his Treatise on the Mechanical Theory of Heat and its Application to the Steam-Engine, an early thermodynamics textbook, states, after discussing the laws of the physical world, that "there are none that are established on a firmer basis than the two general propositions of Joule and Carnot; which constitute the fundamental laws of our subject." McCulloh then goes on to show that these two laws may be combined in a single expression.
McCulloh then declares that the applications of these two laws, now known as the first law of thermodynamics and the second law of thermodynamics, are innumerable. He then states:
When we reflect how generally physical phenomena are connected with thermal changes and relations, it at once becomes obvious that there are few, if any, branches of natural science which are not more or less dependent upon the great truths under consideration. Nor should it, therefore, be a matter of surprise that already, in the short space of time, not yet one generation, elapsed since the mechanical theory of heat has been freely adopted, whole branches of physical science have been revolutionized by it. (p. 267)
McCulloh then gives a few of what he calls the "more interesting examples" of the application of these laws, in terms of their extent and utility. The first example is physiology, wherein he states that "the body of an animal, not less than a steamer, or a locomotive, is truly a heat engine, and the consumption of food in the one is precisely analogous to the burning of fuel in the other; in both, the chemical process is the same: that called combustion." He then incorporates a discussion of Lavoisier's theory of respiration with cycles of digestion, excretion, and perspiration, but contradicts Lavoisier with recent findings, such as internal heat generated by friction, according to the new theory of heat, which holds that the "heat of the body generally and uniformly is diffused instead of being concentrated in the chest". McCulloh then gives an example of the second law, stating that friction, especially in the smaller blood vessels, must develop heat, and that animal heat is without doubt in part produced in this way. He then asks: "but whence the expenditure of energy causing that friction, and which must be itself accounted for?"
To answer this question he turns to the mechanical theory of heat and loosely outlines how the heart is what he calls a "force-pump", which receives blood and sends it to every part of the body, as discovered by William Harvey, and which "acts like the piston of an engine and is dependent upon and consequently due to the cycle of nutrition and excretion which sustains physical or organic life." It is likely that McCulloh was modeling parts of this argument on the famous Carnot cycle. In conclusion, he summarizes his first and second law argument as follows:
Everything physical being subject to the law of conservation of energy, it follows that no physiological action can take place except with expenditure of energy derived from food; also, that an animal performing mechanical work must from the same quantity of food generate less heat than one abstaining from exertion, the difference being precisely the heat equivalent of that of work. (p. 270)
In his 1944 book What is Life?, Nobel laureate physicist Erwin Schrödinger theorizes that life, contrary to the general tendency dictated by the second law of thermodynamics, decreases or maintains its entropy by feeding on negative entropy. In his note to Chapter 6 of What is Life?, however, Schrödinger remarks on his usage of the term negative entropy:
Let me say first, that if I had been catering for them [physicists] alone I should have let the discussion turn on free energy instead. It is the more familiar notion in this context. But this highly technical term seemed linguistically too near to energy for making the average reader alive to the contrast between the two things.
This is what is argued to differentiate life from other forms of matter organization. Although life's dynamics may be argued to go against the tendency of the second law, which states that the entropy of an isolated system tends to increase, life does not in any way conflict with or invalidate this law: the principle that entropy can only increase or remain constant applies only to an isolated system, one that exchanges neither heat nor matter with its environment. Whenever a system can exchange either heat or matter with its environment, an entropy decrease of that system is entirely compatible with the second law. The problem of organization in living systems increasing despite the second law is known as the Schrödinger paradox.
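This bookkeeping can be made concrete with a toy calculation. All numbers below are hypothetical illustrations, not measured values: an organism's internal entropy may fall so long as the entropy it exports to its surroundings as heat at least compensates, keeping the total change non-negative.

```python
# Illustrative entropy bookkeeping for an open system (all numbers hypothetical).
# An organism can lower its own entropy provided it exports at least as much
# entropy to its surroundings, so that the total entropy change is non-negative.

def total_entropy_change(dS_system: float, heat_released: float, T_surroundings: float) -> float:
    """Total entropy change (J/K): system change plus heat dumped into surroundings / T."""
    dS_surroundings = heat_released / T_surroundings  # J/K, for heat released at constant T
    return dS_system + dS_surroundings

# Suppose a cell lowers its internal entropy by 0.5 J/K while releasing 300 J
# of heat into surroundings at 310 K (roughly body temperature).
dS_total = total_entropy_change(dS_system=-0.5, heat_released=300.0, T_surroundings=310.0)
print(f"Total entropy change: {dS_total:.3f} J/K")  # positive, consistent with the second law
```

The system's entropy decrease is more than offset by the entropy carried into the surroundings by the exported heat, so the second law is satisfied overall.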
In 1964, James Lovelock was among a group of scientists who were requested by NASA to make a theoretical life detection system to look for life on Mars during the upcoming space mission. When thinking about this problem, Lovelock wondered “how can we be sure that Martian life, if any, will reveal itself to tests based on Earth’s lifestyle?” To Lovelock, the basic question was “What is life, and how should it be recognized?” When speaking about this issue with some of his colleagues at the Jet Propulsion Laboratory, he was asked what he would do to look for life on Mars. To this, Lovelock replied "I’d look for an entropy reduction, since this must be a general characteristic of life."
Gibbs free energy and biological evolution
In recent years, the thermodynamic interpretation of evolution in relation to entropy has begun to utilize the concept of the Gibbs free energy, rather than entropy. This is because biological processes on Earth take place at roughly constant temperature and pressure, a situation in which the Gibbs free energy is an especially useful way to express the second law of thermodynamics. The change in Gibbs free energy is given by ΔG = ΔH − TΔS, where ΔH is the change in enthalpy, T the absolute temperature, and ΔS the change in entropy.
The minimization of the Gibbs free energy is a form of the principle of minimum energy, which follows from the entropy maximization principle for closed systems. Moreover, the Gibbs free energy equation, in modified form, can be utilized for open systems when chemical potential terms are included in the energy balance equation. In a popular 1982 textbook, Principles of Biochemistry, noted American biochemist Albert Lehninger argues that the order produced within cells as they grow and divide is more than compensated for by the disorder they create in their surroundings in the course of growth and division. In short, according to Lehninger, "living organisms preserve their internal order by taking from their surroundings free energy, in the form of nutrients or sunlight, and returning to their surroundings an equal amount of energy as heat and entropy."
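The constant-temperature, constant-pressure bookkeeping can be illustrated numerically. The figures below are approximate textbook values for the oxidation of glucose and are meant only as an illustration of the relation ΔG = ΔH − TΔS, not as precise thermodynamic data.

```python
# Sketch of the Gibbs free energy relation ΔG = ΔH − TΔS at constant T and P.
# Values are approximate textbook figures for glucose oxidation
# (C6H12O6 + 6 O2 → 6 CO2 + 6 H2O); treat them as illustrative only.

def gibbs_free_energy(dH: float, T: float, dS: float) -> float:
    """ΔG = ΔH − TΔS, with ΔH and the result in kJ/mol, T in K, ΔS in kJ/(mol·K)."""
    return dH - T * dS

dG = gibbs_free_energy(dH=-2808.0, T=298.0, dS=0.208)
print(f"ΔG ≈ {dG:.0f} kJ/mol")  # strongly negative: the reaction can proceed spontaneously
```

A strongly negative ΔG means free energy is available to drive the ordering work of the cell, with the entropy term TΔS accounting for the disorder exported to the surroundings.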
Similarly, in his 2003 book Information Theory and Evolution, the chemist John Avery presents the phenomenon of life, including its origin and evolution as well as human cultural evolution, against the background of thermodynamics, statistical mechanics, and information theory. The (apparent) paradox between the second law of thermodynamics and the high degree of order and complexity produced by living systems, according to Avery, has its resolution "in the information content of the Gibbs free energy that enters the biosphere from outside sources." The process of natural selection responsible for such local increase in order may be mathematically derived directly from the expression of the second law equation for connected non-equilibrium open systems.
Entropy and the origin of life
The second law of thermodynamics applied to the origin of life is a far more complicated issue than the further development of life, since there is no "standard model" of how the first biological lifeforms emerged, only a number of competing hypotheses. The problem is discussed within the area of abiogenesis, implying gradual pre-Darwinian chemical evolution. In 1924, Alexander Oparin suggested that sufficient energy was provided in a primordial soup. The Belgian scientist Ilya Prigogine was awarded the Nobel Prize in Chemistry in 1977 for an analysis in this area. A related topic is the probability that life would emerge, which has been discussed in several studies, for example by Russell Doolittle.
Entropy and the search for life elsewhere in the Universe
In 2013, Azua-Bustos and Vega argued that, regardless of the type of lifeform that could be envisioned on Earth or elsewhere in the Universe, all should share the attribute of being entities that decrease their internal entropy at the expense of free energy obtained from their surroundings. Because entropy allows the quantification of the degree of disorder in a system, any envisioned lifeform must have a higher degree of order than its supporting environment. These authors showed that, using fractal analysis alone, they could readily quantify the difference in structural complexity (and thus in entropy) between living processes and their similar abiotic surroundings. This approach may allow the future detection of unknown forms of life both in the Solar System and on recently discovered exoplanets, based on nothing more than entropy differentials of complementary datasets (morphology, coloration, temperature, pH, isotopic composition, etc.).
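The general idea of distinguishing structures by fractal complexity can be sketched with a standard box-counting estimate of fractal dimension. This is a minimal illustration in the spirit of that approach, not a reproduction of Azua-Bustos and Vega's published method; the test patterns are hypothetical.

```python
import numpy as np

# Minimal box-counting estimate of fractal dimension for a binary 2-D image.
# A structure whose estimated dimension (complexity) differs markedly from its
# background could, on this view, flag a candidate biological signal.

def box_counting_dimension(image: np.ndarray, box_sizes=(1, 2, 4, 8, 16)) -> float:
    """Estimate fractal dimension via the slope of log(count) vs log(1/box size)."""
    counts = []
    for s in box_sizes:
        h, w = image.shape
        # Partition the image into s x s boxes and count boxes containing any structure.
        boxes = image[:h - h % s, :w - w % s].reshape(h // s, s, w // s, s)
        counts.append(int(boxes.any(axis=(1, 3)).sum()))
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return float(slope)

# A filled square (dimension ~2) versus a sparse diagonal line (dimension ~1).
square = np.ones((64, 64), dtype=bool)
line = np.eye(64, dtype=bool)
print(round(box_counting_dimension(square), 2))  # close to 2
print(round(box_counting_dimension(line), 2))    # close to 1
```

The two patterns are cleanly separated by their estimated dimensions, which is the kind of quantitative complexity contrast the authors propose exploiting between living structures and their abiotic surroundings.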
Entropy in psychology
The notion of entropy as disorder has been transferred from thermodynamics to psychology by the Polish psychiatrist Antoni Kępiński, who admitted being inspired by Erwin Schrödinger. In his theoretical framework devised to explain mental disorders (the information metabolism theory), the difference between living organisms and other systems is explained as the ability to maintain order. Contrary to inanimate matter, organisms maintain the particular order of their bodily structures and inner worlds, which they impose onto their surroundings and pass on to new generations. The life of an organism, or of a species, ceases as soon as it loses that ability. Maintaining that order requires continual exchange of information between the organism and its surroundings. In higher organisms, information is acquired mainly through receptors and metabolised in the nervous system. The result is action: some form of motion, for example locomotion, speech, internal motion of organs, or secretion of hormones. The reaction of one organism becomes an informational signal to other organisms. Information metabolism, which makes it possible to maintain this order, is possible only if a hierarchy of values exists, since the signals coming to the organism must be structured. In humans that hierarchy has three levels: biological, emotional, and sociocultural. Kępiński explained how various mental disorders are caused by distortions of that hierarchy, and how the return to mental health is possible through its restoration.
The idea was continued by Struzik, who proposed that Kępiński's information metabolism theory may be seen as an extension of Brillouin's negentropy principle of information. In 2011, the notion of "psychological entropy" was reintroduced to psychologists by Hirsh et al. Like Kępiński, these authors noted that uncertainty management is a critical ability for any organism. Uncertainty, arising from conflict between competing perceptual and behavioral affordances, is experienced subjectively as anxiety. Hirsh and his collaborators proposed that both the perceptual and behavioral domains may be conceptualized as probability distributions, and that the amount of uncertainty associated with a given perceptual or behavioral experience can be quantified in terms of Claude Shannon's entropy formula.
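The Shannon entropy formula invoked here, H = −Σ p·log₂(p), can be sketched directly. The example distributions below are hypothetical; they are meant only to show how a sharply peaked set of options yields low uncertainty while equally plausible options yield the maximum.

```python
import math

# Sketch of the Shannon entropy formula H = -sum(p * log2(p)), the measure
# Hirsh et al. use to quantify uncertainty over competing perceptual or
# behavioral options. The distributions below are hypothetical examples.

def shannon_entropy(probabilities) -> float:
    """Entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# One clearly dominant option: low uncertainty (low "psychological entropy").
print(round(shannon_entropy([0.97, 0.01, 0.01, 0.01]), 3))
# Four equally plausible options: maximum uncertainty for four outcomes.
print(round(shannon_entropy([0.25, 0.25, 0.25, 0.25]), 3))  # 2.0 bits
```

On this reading, reducing psychological entropy corresponds to collapsing a flat distribution over affordances into a peaked one, i.e. resolving which action to take.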
For nearly a century and a half, beginning with Clausius' 1863 memoir "On the Concentration of Rays of Heat and Light, and on the Limits of its Action", much writing and research has been devoted to the relationship between thermodynamic entropy and the evolution of life. The argument that life feeds on negative entropy, or negentropy, was asserted by physicist Erwin Schrödinger in his 1944 book What is Life?. He posed the question, "How does the living organism avoid decay?" and answered, "By eating, drinking, breathing and (in the case of plants) assimilating." Recent writings have used the concept of Gibbs free energy to elaborate on this issue. While energy from nutrients is necessary to sustain an organism's order, Schrödinger was also prescient: "An organism's astonishing gift of concentrating a stream of order on itself and thus escaping the decay into atomic chaos – of drinking orderliness from a suitable environment – seems to be connected with the presence of the aperiodic solids..." We now know that the 'aperiodic' crystal is DNA, and that its irregular arrangement is a form of information: the DNA in the cell nucleus contains the master copy of the software, in duplicate, and this software seems to exercise its control by "specifying an algorithm, or set of instructions, for creating and maintaining the entire organism containing the cell." DNA and other macromolecules determine an organism's life cycle: birth, growth, maturity, decline, and death. Nutrition is necessary but not sufficient to account for growth in size, as genetics is the governing factor. At some point, organisms normally decline and die even while remaining in environments that contain sufficient nutrients to sustain life. The controlling factor must be internal, not nutrients or sunlight acting as causal exogenous variables. Organisms inherit the ability to create unique and complex biological structures; it is unlikely for those capabilities to be reinvented or taught each generation.
Therefore, DNA must be operative as the prime cause of this characteristic as well. Applying Boltzmann's perspective on the second law, the change of state from a more probable, less ordered, high-entropy arrangement to one of lower probability, more order, and lower entropy, as seen in biological ordering, calls for a function like that of DNA. DNA's apparent information-processing function provides a resolution of the paradox posed by life and the entropy requirement of the second law.
In 1982, American biochemist Albert Lehninger argued that the "order" produced within cells as they grow and divide is more than compensated for by the "disorder" they create in their surroundings in the course of growth and division. "Living organisms preserve their internal order by taking from their surroundings free energy, in the form of nutrients or sunlight, and returning to their surroundings an equal amount of energy as heat and entropy."
- Negentropy – a shorthand colloquial phrase for negative entropy.
- Ectropy – a measure of the tendency of a dynamical system to do useful work and grow more organized.
- Extropy – a metaphorical term defining the extent of a living or organizational system's intelligence, functional order, vitality, energy, life, experience, and capacity and drive for improvement and growth.
- Ecological entropy – a measure of biodiversity in the study of biological ecology.
In a study titled "Natural selection for least action" published in the Proceedings of the Royal Society A., Ville Kaila and Arto Annila of the University of Helsinki describe how the second law of thermodynamics can be written as an equation of motion to describe evolution, showing how natural selection and the principle of least action can be connected by expressing natural selection in terms of chemical thermodynamics. In this view, evolution explores possible paths to level differences in energy densities and so increase entropy most rapidly. Thus, an organism serves as an energy transfer mechanism, and beneficial mutations allow successive organisms to transfer more energy within their environment.
Because entropy is well defined only for equilibrium systems, objections have been raised to the extension of the second law and of the concept of entropy to biological systems, especially as regards its use to support or discredit the theory of evolution. Living systems, and indeed many other systems and processes in the universe, operate far from equilibrium, whereas the second law states only that isolated systems evolve toward thermodynamic equilibrium, the state of maximum entropy.
However, entropy can be defined much more broadly, based on the probabilities of a system's states, whether or not the system is a dynamic one (for which equilibrium could be relevant). Even in those physical systems where equilibrium could be relevant, (1) living systems cannot persist in isolation, and (2) the second law of thermodynamics does not require that free energy be transformed into entropy along the shortest path: living organisms absorb energy from sunlight or from energy-rich chemical compounds and finally return part of such energy to the environment as entropy (heat and low free-energy compounds such as water and CO2).
- Adams, Henry. (1986). History of the United States of America During the Administration of Thomas Jefferson (pg. 1299). Library of America.
- Adams, Henry (1910). A Letter to American Teachers of History. Washington.
- Boltzmann, Ludwig (1974). The second law of thermodynamics (Theoretical physics and philosophical problems). Springer-Verlag New York, LLC. ISBN 978-90-277-0250-0.
- McCulloh, Richard Sears (1876). Treatise on the Mechanical Theory of Heat and its Applications to the Steam-Engine, etc. New York: D. Van Nostrand.
- Schrödinger, Erwin (1944). What is Life – the Physical Aspect of the Living Cell. Cambridge University Press. ISBN 978-0-521-42708-1.
- The common justification for this argument, for example, according to renowned chemical engineer Kenneth Denbigh, from his 1955 book The Principles of Chemical Equilibrium, is that "living organisms are open to their environment and can build up at the expense of foodstuffs which they take in and degrade."
- Schneider, Eric D.; Sagan, Dorion (2005). Into the Cool: Energy Flow Thermodynamics and Life. Chicago, United States: The University of Chicago Press. p. 15.
- Lovelock, James (1979). GAIA – A New Look at Life on Earth. Oxford University Press. ISBN 978-0-19-286218-1.
- Moroz, Adam (2012). The Common Extremalities in Biology and Physics. Elsevier. ISBN 978-0-12-385187-1.
- Lehninger, Albert (1993). Principles of Biochemistry, 2nd Ed. Worth Publishers. ISBN 978-0-87901-711-8.
- Avery, John (2003). Information Theory and Evolution. World Scientific. ISBN 978-981-238-399-0.
- Kaila, V. R.; Annila, A. (8 November 2008). "Natural selection for least action". Proceedings of the Royal Society A. 464 (2099): 3055–3070. Bibcode:2008RSPSA.464.3055K. doi:10.1098/rspa.2008.0178.
- Russell Doolittle, "The Probability and Origin of Life" in Scientists Confront Creationism (1984) Ed. Laurie R. Godfrey, p. 85
- Vega-Martínez, Cristian; Azua-Bustos, Armando (2013). "The potential for detecting 'life as we don't know it' by fractal complexity analysis". International Journal of Astrobiology. 12 (4): 314–320. doi:10.1017/S1473550413000177. ISSN 1475-3006.
- Kępiński, Antoni (1972). Rhythm of life (in Polish). Kraków: Wydawnictwo Literackie.
- Pietrak, Karol (2018). "The foundations of socionics - a review". Cognitive Systems Research. 47: 1–11. doi:10.1016/J.COGSYS.2017.07.001.
- Schochow, Maximilian; Steger, Florian (2016). "Antoni Kepiński (1918–1972), pioneer of post-traumatic stress disorder". The British Journal of Psychiatry. 208 (6): 590. doi:10.1192/bjp.bp.115.168237. PMID 27251694.
- Bulaczek, Aleksandra (2013). "Relations patient – doctor in axiological psychiatry of Antoni Kępiński (in Polish)" (PDF). Studia Ecologiae et Bioethicae UKSW. 11 (2): 9–28.
- Struzik, Tadeusz (1987). "Kepiński's Information Metabolism, Carnot's Principle and Information Theory". International Journal of Neuroscience. 36 (1–2): 105–111. doi:10.3109/00207458709002144.
- Hirsh, Jacob B.; Mar, Raymond A.; Peterson, Jordan B. (2012). "Psychological Entropy: A Framework for Understanding Uncertainty-Related Anxiety". Psychological Review. 119 (Advance online publication): 304–320. doi:10.1037/a0026767.
- Higgs, P. G., & Pudritz, R. E. (2009). "A thermodynamic basis for prebiotic amino acid synthesis and the nature of the first genetic code" Accepted for publication in Astrobiology
- Nelson, P. (2004). Biological Physics, Energy, Information, Life. W.H. Freeman and Company. ISBN 0-7167-4372-8
- Peterson, Jacob, Understanding the Thermodynamics of Biological Order, The American Biology Teacher, 74, Number 1, January 2012, pp. 22–24
- Haddad, Wassim M.; Chellaboina, VijaySekhar; Nersesov, Sergey G. (2005). Thermodynamics – A Dynamical Systems Approach. Princeton University Press. ISBN 978-0-691-12327-1.
- Lisa Zyga (11 August 2008). "Evolution as Described by the Second Law of Thermodynamics". Physorg.com. Retrieved 14 August 2008.
- Callen, Herbert B (1985). Thermodynamics and an Introduction to Statistical Thermodynamics. John Wiley and Sons.
- Ben-Naim, Arieh (2012). Entropy and the Second Law. World Scientific Publishing.
- Kapusta, A. (2007). Life circle, time and the self in Antoni Kępiński’s conception of information metabolism. FILOSOFIJA. SOCIOLOGIJA 18(1): 46–51.
- La Cerra, P. (2003). The First Law of Psychology is the Second Law of Thermodynamics: The Energetic Evolutionary Model of the Mind and the Generation of Human Psychological Phenomena, Human Nature Review 3: 440–447.
- Thermodynamic Evolution of the Universe pi.physik.uni-bonn.de/~cristinz