WHAT'S NEW ABOUT THE SECOND LAW OF THERMODYNAMICS

by Brig Klyce from http://www.panspermia.org/seconlaw.htm

The use of thermodynamics in biology has a long history rich in confusion. -- Harold J. Morowitz (1)

Sometimes people say that life violates the second law of thermodynamics. This is not the case; we know of nothing in the universe that violates that law. So why do people say that life violates the second law of thermodynamics? What is the second law of thermodynamics?

The second law is a straightforward law of physics with the consequence that, in a closed system, you can't finish any real physical process with as much useful energy as you had to start with; some is always wasted. This means that a perpetual motion machine is impossible. The second law was formulated after nineteenth-century engineers noticed that heat cannot pass from a colder body to a warmer body by itself.

According to philosopher of science Thomas Kuhn, the second law was first put into words by two scientists, Rudolf Clausius and William Thomson (Lord Kelvin), using different examples, in 1850-51 (2). American quantum physicist Richard P. Feynman, however, says the French physicist Sadi Carnot discovered the second law 25 years earlier. That would have been before the first law, conservation of energy, was discovered (3)! In any case, modern scientists completely agree about the above principles.

Thermodynamic Entropy

The first opportunity for confusion arises when we introduce the term entropy into the mix. Clausius invented the term in 1865. He had noticed that a certain ratio was constant in reversible, or ideal, heat cycles. The ratio was heat exchanged to absolute temperature. Clausius decided that the conserved ratio must correspond to a real, physical quantity, and he named it "entropy".
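
In modern textbook notation (a restatement for reference, not Clausius's original symbols), the conserved ratio defines entropy through

    dS = dQ_rev / T

where dQ_rev is the heat exchanged in a reversible process and T is the absolute temperature, so entropy is measured in units of energy per degree, such as joules per kelvin.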

Surely not every conserved ratio corresponds to a real, physical quantity. Historical accident has introduced this term to science. On another planet there could be physics without the concept of entropy. It completely lacks intuitive clarity. Even the great physicist James Clerk Maxwell had it backward for a while (4). Nevertheless, the term has stuck.

The American Heritage Dictionary gives as the first definition of entropy, "For a closed system, the quantitative measure of the amount of thermal energy not available to do work." So it's a negative kind of quantity, the opposite of available energy.

Today, it is customary to use the term entropy to state the second law: Entropy in a closed system can never decrease. As long as entropy is defined as unavailable energy, the paraphrasing just given of the second law is equivalent to the earlier ones above. In a closed system, available energy can never increase, so its opposite, entropy, can never decrease.

A familiar demonstration of the second law is the flow of heat from hot things to cold, and never vice-versa. When a hot stone is dropped into a bucket of cool water, the stone cools and the water warms until each is the same temperature as the other. During this process, the entropy of the system increases. If you know the initial temperatures of the stone and the water, and the final temperature of the water, you can quantify the entropy increase in calories or joules per degree.
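
To make the bookkeeping concrete, here is a minimal sketch in Python of that calculation, using assumed masses, heat capacities and temperatures rather than any figures from the text; for a body with constant heat capacity, summing dQ/T gives m*c*ln(T_final/T_initial).

    import math

    # Stone dropped into a bucket of water: entropy change of each body.
    m_stone, c_stone = 1.0, 800.0      # kg, J/(kg*K)  -- assumed values
    m_water, c_water = 5.0, 4186.0     # kg, J/(kg*K)  -- assumed values
    T_stone, T_water = 350.0, 290.0    # initial temperatures, kelvin

    # Final temperature from an energy balance (no heat lost to the surroundings)
    T_final = (m_stone*c_stone*T_stone + m_water*c_water*T_water) / \
              (m_stone*c_stone + m_water*c_water)

    dS_stone = m_stone*c_stone*math.log(T_final/T_stone)   # negative: stone cools
    dS_water = m_water*c_water*math.log(T_final/T_water)   # positive: water warms

    print(f"T_final = {T_final:.2f} K")
    print(f"dS_stone = {dS_stone:+.2f} J/K, dS_water = {dS_water:+.2f} J/K")
    print(f"total entropy change = {dS_stone + dS_water:+.2f} J/K  (always > 0)")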

You may have noticed the words "closed system" a couple of times above. Consider simply a black bucket of water initially at the same temperature as the air around it. If the bucket is placed in bright sunlight, it will absorb heat from the sun, as black things do. Now the water becomes warmer than the air around it, and the available energy has increased. Has entropy decreased? Has energy that was previously unavailable become available, in a closed system? No, this example is only an apparent violation of the second law. Because sunlight was admitted, the local system was not closed; the energy of sunlight was supplied from outside the local system. If we consider the larger system, including the sun, available energy has decreased and entropy has increased as required.

Let's call this kind of entropy thermodynamic entropy. The qualifier "thermodynamic" is necessary because the word entropy is also used in another, nonthermodynamic sense.

Logical Entropy

Entropy is also used to mean disorganization or disorder. J. Willard Gibbs, the nineteenth century American theoretical physicist, called it "mixedupness." The American Heritage Dictionary gives as the second definition of entropy, "a measure of disorder or randomness in a closed system." Again, it's a negative concept, this time the opposite of organization or order. The term came to have this second meaning thanks to the great Austrian physicist Ludwig Boltzmann.

In Boltzmann's day, one complaint about the second law of thermodynamics was that it seemed to impose upon nature a preferred direction in time. Under the second law, things can only go one way. This apparently conflicts with the laws of physics at the molecular level, where there is no preferred direction in time: an elastic collision between molecules would look the same going forward or backward. In the 1880s and 1890s, Boltzmann used molecules of gas as a model, along with the laws of probability, to show that there was no real conflict. The model showed that heat, no matter how it was introduced, would soon become evenly diffused throughout the gas, as the second law required.

The model could also be used to show that two different kinds of gases would become thoroughly mixed. The reasoning he used for mixing is very similar to that for the diffusion of heat, but there is an important difference. In the diffusion of heat, the entropy increase can be measured with the ratio of physical units, joules per degree. In the mixing of two kinds of gases already at the same temperature, if no heat is exchanged, the ratio of joules per degree (thermodynamic entropy) is irrelevant. The mixing process is related to the diffusion of heat only by analogy (5). Nevertheless, Boltzmann used a factor, now called Boltzmann's constant, to attach physical units to the latter situation. Now the word entropy has come to be applied to the mechanical mixing process, too. (Of course, Boltzmann's constant has a legitimate use: it relates the average kinetic energy of a molecule to its temperature.)
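
As an illustration (a sketch with assumed numbers, not drawn from Boltzmann or from this article), the following Python fragment shows what that multiplication does: the counting argument by itself yields a pure number, and only multiplying by Boltzmann's constant dresses it in joules per degree.

    import math

    # Entropy of mixing two ideal gases at the same temperature, in the
    # Boltzmann picture. The count of arrangements gives a dimensionless
    # number; multiplying by Boltzmann's constant attaches physical units.
    k_B = 1.380649e-23        # Boltzmann's constant, J/K
    N = 6.022e23              # number of molecules (one mole, for illustration)
    x = 0.5                   # fraction of gas A; the rest is gas B

    # Dimensionless mixing entropy per molecule: the log of the number of ways,
    # per molecule, that the two kinds can be interleaved
    s_logical = -(x * math.log(x) + (1 - x) * math.log(1 - x))

    S_mixing = k_B * N * s_logical    # now expressed in J/K

    print(f"dimensionless mixing entropy per molecule: {s_logical:.4f}")
    print(f"after multiplying by Boltzmann's constant: {S_mixing:.2f} J/K")

    # The constant's uncontroversial role, mentioned above: average kinetic
    # energy of a molecule in a monatomic ideal gas is (3/2) k_B T.
    print(f"average kinetic energy at 300 K: {1.5 * k_B * 300.0:.3e} J")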

Entropy in this sense came to be used in the growing fields of information science, computer science, communications theory, etc. The story is often told that in the late 1940s, John von Neumann, a pioneer of the computer age, advised communication-theorist Claude E. Shannon to start using the term entropy when discussing information because "no one knows what entropy really is, so in a debate you will always have the advantage" (6).

Richard Feynman knew there is a difference between the two meanings of entropy. He discussed thermodynamic entropy in the section called "Entropy" of his Lectures on Physics published in 1963 (7), using physical units, joules per degree, and over a dozen equations (vol I section 44-6). He discussed the second meaning of entropy in a different section titled "Order and entropy" (vol I section 46-5) as follows:


"So we now have to talk about what we mean by disorder and what we mean by order. ... Suppose we divide the space into little volume elements. If we have black and white molecules, how many ways could we distribute them among the volume elements so that white is on one side and black is on the other? On the other hand, how many ways could we distribute them with no restriction on which goes where? Clearly, there are many more ways to arrange them in the latter case. We measure "disorder" by the number of ways that the insides can be arranged, so that from the outside it looks the same. The logarithm of that number of ways is the entropy. The number of ways in the separated case is less, so the entropy is less, or the "disorder" is less."

This is Boltzmann's model again. Notice that Feynman does not use Boltzmann's constant. He assigns no physical units to this kind of entropy, just a number. (A logarithm is a number, without physical units.) And he uses not a single equation in this section of his Lectures.
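
A toy version of that counting, with numbers of my own choosing, can be written in a few lines of Python. Note that the result is a bare logarithm: no joules, no degrees, no Boltzmann's constant.

    import math

    # Feynman-style counting: each molecule may sit in any of V volume elements.
    # In the "separated" case the white molecules are restricted to the left
    # half and the black ones to the right half; in the "mixed" case any
    # molecule may occupy any element. The entropy, in this sense, is just the
    # logarithm of the number of ways.
    V = 1000                 # number of little volume elements
    n_white = n_black = 20   # molecules of each color

    # log(number of ways), computed as logs to avoid huge integers
    logW_separated = n_white * math.log(V / 2) + n_black * math.log(V / 2)
    logW_mixed     = (n_white + n_black) * math.log(V)

    print(f"log W (separated) = {logW_separated:.1f}")
    print(f"log W (mixed)     = {logW_mixed:.1f}")
    print(f"difference        = {logW_mixed - logW_separated:.1f}"
          "  (= N ln 2, the 'disorder' gained by mixing)")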

Notice another thing. The "number of ways" can only be established by first artificially dividing up the space into little volume elements. This is not a small point. In every real physical situation, counting the number of possible arrangements requires an arbitrary parceling. As Peter Coveney and Roger Highfield say (7.5):

"There is, however, nothing to tell us how fine the [parceling] should be. Entropies calculated in this way depend on the size-scale decided upon, in direct contradiction with thermodynamics in which entropy changes are fully objective."

Claude Shannon himself seems to be aware of these differences in his famous 1948 paper, "A Mathematical Theory of Communication." With respect to the parceling he writes, "In the continuous case the measurement is relative to the coordinate system. If we change coordinates the entropy will in general change" (Shannon's italics) (8). In the same paper he attaches no physical units to his entropy and never mentions Boltzmann's constant, k. At one point he briefly introduces K, saying tersely, "The constant K merely amounts to a choice of a unit of measure." Shannon never specifies the unit of measure, and except in an appendix, K does not appear again in the 55-page paper.
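
For the discrete case, Shannon's quantity is easy to exhibit. The sketch below (with made-up probabilities) follows his formula H = -K * sum(p_i * log p_i), where K and the base of the logarithm only fix the unit of measure, bits here, and nothing thermodynamic appears.

    import math

    # Shannon entropy of a discrete source: a pure number in the chosen unit.
    def shannon_entropy(probabilities, K=1.0, base=2):
        return -K * sum(p * math.log(p, base) for p in probabilities if p > 0)

    fair_coin   = [0.5, 0.5]
    loaded_coin = [0.9, 0.1]
    print(f"fair coin:   H = {shannon_entropy(fair_coin):.3f} bits")
    print(f"loaded coin: H = {shannon_entropy(loaded_coin):.3f} bits")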

This sort of entropy is clearly different. Physical units do not pertain to it, and (except in the case of digital information) an arbitrary convention must be imposed before it can be quantified. To distinguish this kind of entropy from thermodynamic entropy, let's call it logical entropy.

In spite of the important distinction between the two meanings of entropy, the rule as stated above for thermodynamic entropy seems to apply nonetheless to the logical kind: entropy in a closed system can never decrease. And really, there would be nothing mysterious about this law either. It's similar to saying things never organize themselves. (The original meaning of organize is "to furnish with organs.") Only this rule has little to do with thermodynamics.

It is true that crystals and other regular configurations can be formed by unguided processes. And we are accustomed to saying that these configurations are "organized." But of course, crystals have not been spontaneously "furnished with organs." The correct term for such regular configurations is "ordered." The recipe for a crystal is already present in the solution it grows from: the crystal lattice is prescribed by the structure of the molecules that compose it. The formation of crystals is the straightforward result of chemical and physical laws that do not evolve and that are, compared to genetic programs, very simple.

The rule that things never organize themselves is also upheld in our everyday experience. Without someone to fix it, a broken glass never mends. Without maintenance, a house deteriorates. Without management, a business fails. Without new software, a computer never acquires new capabilities. Never.

Charles Darwin understood this universal principle. It's common sense. That's why he once made a note to himself pertaining to evolution, "Never use the words higher or lower" (9). (However, the word "higher" in this forbidden sense appears half a dozen times in the first edition of Darwin's Origin of Species (10).)

Even today, if you assert that a human is more highly evolved than a flatworm or an amoeba, there are darwinists who'll want to fight about it. They take the position, apparently, that evolution has not necessarily shown a trend toward more highly organized forms of life, just different forms.

  • All extant species are equally evolved. -- Lynn Margulis and Dorion Sagan, 1995 (11)
  • There is no progress in evolution. -- Stephen Jay Gould, 1995 (12)
  • We all agree that there's no progress. -- Richard Dawkins, 1995 (13)
  • The fallacy of progress -- John Maynard Smith and Eörs Szathmáry, 1995 (14)
But this ignores the plain facts about life and evolution.

Life is Organization

Seen in retrospect, evolution as a whole doubtless had a general direction, from simple to complex, from dependence on to relative independence of the environment, to greater and greater autonomy of individuals, greater and greater development of sense organs and nervous systems conveying and processing information about the state of the organism's surroundings, and finally greater and greater consciousness. You can call this direction progress or by some other name. -- Theodosius Dobzhansky (15)

Progress, then, is a property of the evolution of life as a whole by almost any conceivable intuitive standard.... Let us not pretend to deny in our philosophy what we know in our hearts to be true. -- Edward O. Wilson (16)

Life is organization. From prokaryotic cells, eukaryotic cells, tissues, and organs, to plants and animals, families, communities, ecosystems, and living planets, life is organization, at every scale. The evolution of life is the increase of biological organization, if it is anything. Clearly, if life originates and makes evolutionary progress without organizing input from outside, then something has organized itself. Logical entropy in a closed system has decreased. This is the violation that people are getting at, when they mistakenly say that life violates the second law of thermodynamics. This violation, the decrease of logical entropy in a closed system, must happen continually in the darwinian account of evolutionary progress.

Most darwinists just ignore this staggering problem. When confronted with it, they seek refuge in the confusion between the two kinds of entropy. Entropy [logical] has not decreased, they say, because the system is not closed. Energy such as sunlight is constantly supplied to the system. If you consider the larger system that includes the sun, entropy [thermodynamic] has increased, as required.

Recent Writing About Entropy and Biology

An excellent example of this confusion is given in a popular 1982 treatise against creationism, Abusing Science, by Philip Kitcher. He is aware that entropy has different meanings, but he treats them as not different: "There are various ways to understand entropy.... I shall follow the approach of classical thermodynamics, in which entropy is seen as a function of unusable energy. But the points I make will not be affected by this choice" (17).

Another typical example of confusion between the two kinds of entropy comes from a similar book by Tim M. Berra, Evolution and the Myth of Creationism. The following paragraph from that book would seem to indicate that any large animal can assemble a bicycle (18).

"For example, an unassembled bicycle that arrives at your house in a shipping carton is in a state of disorder. You supply the energy of your muscles (which you get from food that came ultimately from sunlight) to assemble the bike. You have got order from disorder by supplying energy. The Sun is the source of energy input to the earth's living systems and allows them to evolve."

A rare example of the use of mathematics to combine the two kinds of entropy is given in The Mystery of Life's Origin, published in 1984. Its authors acknowledge two kinds of entropy, which they call "thermal" and "configurational." To count the "number of ways" for the latter kind of entropy they use restrictions which they later admit to be unrealistic. They count only the number of ways a string of amino acids of fixed length can be sequenced. They admit in the end, however, that the string might never form. To impose the units joules per degree onto "configurational" entropy, they simply multiply by Boltzmann's constant (19). Nevertheless, they ultimately reach the following conclusion (p 157-158):

"In summary, undirected thermal energy is only able to do the chemical and thermal entropy work in polypetide synthesis, but not the coding (or sequencing) portion of the configurational entropy work.... It is difficult to imagine how one could ever couple random thermal energy flow through the system to do the required configurational entropy work of selecting and sequencing."

In Evolution, Thermodynamics and Information, Jeffrey S. Wicken also adopts the terms "thermal" and "configurational." But here they both pertain only to the non-energetic "information content" of a thermodynamic state, and "energetic" information is also necessary for the complete description of a system. Shannon entropy is different from all of these, and not a useful concept to Wicken. Nevertheless, he says that evolution and the origin of life are not separate problems and, "The most parsimonious explanation is to assume that life always existed" (19.5)!

Roger Penrose's treatment of entropy is worth mentioning. In The Emperor's New Mind, he nimbly dodges the problem of assigning physical units to logical entropy (20):

"In order to give the actual entropy values for these compartments we should have to worry a little about the question of the units that are chosen (metres, Joules, kilograms, degrees Kelvin, etc.). That would be out of place here, and in fact, for the utterly stupendous entropy values that I shall be giving shortly, it makes essentially no difference at all what units are in fact chosen. However, for definiteness (for the experts), let me say that I shall be taking natural units, as are provided by the rules of quantum mechanics, and for which Boltzmann's constant turns out to be unity: k = 1."

Someday, in the distant future, an extension of quantum theory might provide a natural way to parcel any real physical situation. If that happens, one of the problems with quantifying logical entropy in a real physical situation will be removed. But nobody, not even Penrose, is suggesting that this is the case today. And even if that day comes, we will still have no reason to attach thermodynamic units to logical entropy. (Although the word "stupendous" appears again, no "actual entropy values" follow the quoted passage.)

Martin Goldstein and Inge F. Goldstein in The Refrigerator and the Universe (21) wonder if there is "an irreconcilable difference" between the two kinds of entropy. They begin their consideration of logical entropy by discussing the possible arrangements of playing cards, where the parceling is not arbitrary: the number of possibilities can be counted. When they move to the world of physics, they are not concerned over the fact that parceling must now be done arbitrarily. They are concerned, initially, about attaching physical units to logical entropy. "...Entropy is measured in units of energy divided by temperature.... W [counting microstates] is a pure number" (p 173). But ultimately they apply Boltzmann's constant. No calculations using logical entropy with physical units ensue. The next time they mention logical entropy is in the section "Information and Entropy," where they divide the previous product by Boltzmann's constant to remove the physical units (p 218)!

An ambitious treatment of entropy as it pertains to biology is the book Evolution as Entropy, by Daniel R. Brooks and E. O. Wiley. The authors acknowledge that the distinction between the different kinds of entropy is important (22):

"It is important to realize that the phase space, microstates, and macrostates described in our theory are not classical thermodynamic constructs.... The entropies are array entropies, more like the entropies of sorting encountered in considering an ideal gas than like the thermal entropies associated with steam engines...."

In fact the authors acknowledge many kinds of entropy; they describe physical entropy, Shannon-Weaver entropy, cohesion entropy, and statistical entropy, for example. They rarely use or mention Boltzmann's constant. One of their main arguments is that although the progress of evolution seems to represent a reduction in entropy, this reduction is only apparent. In reality, evolution increases entropy, as the law requires. But evolution does not increase entropy as fast as the maximum possible rate. So, by comparison to the maximum possible rate, entropy appears to be decreasing. Our eyes have deceived us!

In another book entitled Life Itself, mathematical biologist Robert Rosen of Columbia University seems to have grasped the problem when he writes, "The Second Law thus asserts that... a system autonomously tending to an organized state cannot be closed" (23). But immediately he veers away, complaining that the term "organization" is vague. Intent on introducing terms he prefers, like "entailment," he does not consider the possibility that, in an open system, life's organization could be imported into one region from another.

Hans Christian von Baeyer's 1998 book, Maxwell's Demon, is engaging and informative about the scientists who pioneered the second law. The story concludes with an interview of Wojciech Zurek of the Theoretical Division of the Los Alamos National Laboratory. Zurek introduces yet another kind of entropy, because "Like all scientific ideas, the concept of entropy, useful as it is, needs to be refurbished and updated and adjusted to new insights. Someday... the two types of entropy will begin to approach each other in value, and the new theory will become amenable to experimental verification" (23.5).

One of the most profound and original treatments of entropy is that by the Nobel prize-winning chemist Ilya Prigogine. He begins by noticing that some strictly chemical processes create surprising patterns such as snowflakes, or exhibit surprising behavior such as oscillation between different states. In From Being To Becoming he says, in effect, that things sometimes do, under certain circumstances, organize themselves. He reasons that these processes may have produced life (24):

"It seems that most biological mechanisms of action show that life involves far-from-equilibrium conditions beyond the stability of the threshold of the thermodynamic branch. It is therefore very tempting to suggest that the origin of life may be related to successive instabilities somewhat analogous to the successive bifurcations that have lead to a state of matter of increasing coherence."

Some find such passages obscure and tentative. One critic complains that work along the lines advocated by Prigogine fifteen years ago has borne little fruit subsequently. " 'I don't know of a single phenomenon he has explained,' says Pierre C. Hohenberg of Yale University" (25).

Dr. Hubert P. Yockey gives the subject of entropy and biology a probing and insightful treatment in his monograph, Information theory and molecular biology (26). He emphatically agrees that there are different kinds of entropy that do not correlate. "...The Shannon entropy and the Maxwell-Boltzmann-Gibbs entropy... have nothing to do with each other" (p 313). But Shannon entropy (which pertains to information theory) makes no distinction between meaningful DNA sequences that encode life and random DNA sequences of equal length. Therefore, Yockey is able to conclude that evolution does not create any paradox for Shannon entropy. Nevertheless, Yockey proves with impressive command of biology and statistics that it would be impossible to find the new genes necessary for evolutionary progress by the random search method currently in favor. He is deeply sceptical of the prevailing theories of evolution and the origin of life on Earth.

In 1998, computer scientist Christoph Adami agrees that trouble dogs the marriage of biology and logical entropy. In Introduction to Artificial Life (27), he comments on "the decades of confusion that have reigned over the treatment of living systems from the point of view of thermodynamics and information theory..." (p 59). He says, "information is always shared between two ensembles" (p 70), a restriction that sounds promising. Yet in his section entitled "Second Law of Thermodynamics," he says that as a thermodynamic system is put into contact with another one at a lower temperature, and thermal equilibrium is reached, the total entropy of the combined ensemble "stays constant" (p 99). This flatly contradicts the second law. Later, applying the second law to information, he explains that only the "conditional entropy" increases in such examples. "The unconditional (or marginal) entropy -- given by conditional entropy plus mutual entropy... stays constant" (p 118, Adami's italics). More new kinds of entropy.
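
The identity Adami is invoking, that marginal entropy equals conditional entropy plus mutual entropy, H(X) = H(X|Y) + I(X;Y), can be checked with a toy joint distribution; the numbers below are mine, not Adami's.

    import math

    # Check H(X) = H(X|Y) + I(X;Y) for a small two-variable example.
    p = {                              # joint probabilities p(x, y)
        ('0', '0'): 0.4, ('0', '1'): 0.1,
        ('1', '0'): 0.1, ('1', '1'): 0.4,
    }

    def H(dist):
        # Shannon entropy, in bits, of a dictionary of probabilities
        return -sum(q * math.log2(q) for q in dist.values() if q > 0)

    px, py = {}, {}
    for (x, y), q in p.items():
        px[x] = px.get(x, 0.0) + q     # marginal distribution of X
        py[y] = py.get(y, 0.0) + q     # marginal distribution of Y

    H_X, H_Y, H_XY = H(px), H(py), H(p)
    H_X_given_Y = H_XY - H_Y           # conditional entropy H(X|Y)
    I_XY = H_X + H_Y - H_XY            # mutual information ("mutual entropy")

    print(f"H(X)            = {H_X:.3f} bits")
    print(f"H(X|Y) + I(X;Y) = {H_X_given_Y + I_XY:.3f} bits  (the same value)")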

In 1999's The Fifth Miracle (28), theoretical physicist and science writer Paul Davies devotes a chapter, "Against the Tide," to the relationship between entropy and biology. In an endnote to that chapter he writes, " 'higher' organisms have higher (not lower) algorithmic entropy..." (p 277, Davies' italics) -- another reversal of the usual understanding. He concludes, "The source of biological information, then, is the organism's environment" (p 57). Later, "Gravitationally induced instability is a source of information" (p 63). But this "still leaves us with the problem.... How has meaningful information emerged in the universe?" (p 65). He has no answer for this question.

The Touchstone of Life (1999) follows Prigogine's course, relying on Boltzmann's constant to link thermodynamic and logical entropy (29). Author Werner Loewenstein often strikes the chords that accompany deep understanding. "As for the origin of information, the fountainhead, this must lie somewhere in the territory close to the big bang" (p 25). "Evidently a little bubbling, whirling and seething goes a long way in organizing matter.... That understanding has led to the birth of a new alchemy..." (p 48-49). Exactly.

Conclusion

In my opinion, the audacious attempt to reveal the formal equivalence of the ideas of biological organization and thermodynamic order ...must be judged to have failed. -- Peter Medawar (30)

Computer scientist Rolf Landauer wrote an article published in June, 1996, which contains insight that should discourage attempts to physically link the two kinds of entropy. He demonstrates that "there is no unavoidable minimal energy requirement per transmitted bit" (31). Using Boltzmann's constant to tie together thermodynamic entropy and logical entropy is thus shown to be without basis. One may rightly object that the minimal energy requirement per bit of information is unrelated to logical entropy. But this supposed minimum energy requirement was the keystone of arguments connecting the two concepts.
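
For reference, the figure such arguments lean on is k*T*ln(2) joules per bit, the number obtained when Boltzmann's constant is used to convert one bit of logical entropy into thermodynamic entropy at temperature T; the room-temperature value below is an illustration, and Landauer's point, as quoted, is that no such minimum governs transmission.

    import math

    # The conventional "energy cost of a bit": one bit of logical entropy
    # (ln 2 in natural units) multiplied by Boltzmann's constant and the
    # temperature. Landauer's 1996 argument, cited above, denies that this
    # is an unavoidable minimum for transmitting a bit.
    k_B = 1.380649e-23     # Boltzmann's constant, J/K
    T = 300.0              # kelvin, roughly room temperature (assumed)

    energy_per_bit = k_B * T * math.log(2)
    print(f"k*T*ln(2) at {T:.0f} K = {energy_per_bit:.3e} joules per bit")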

It is surprising that mixing entropy and biology still fosters confusion. The relevant concepts from physics pertaining to the second law of thermodynamics are at least 100 years old. The confusion can be eradicated if we distinguish thermodynamic from logical entropy, and admit that Earth's biological system is open to organizing input from outside.

What'sNEW

Todd L. Duncan and Jack S. Semura, "The Deep Physics Behind the Second Law: Information and Energy As Independent Forms of Bookkeeping" [abstract], p 21-29 v 6, Entropy, Mar 2004.
22 Mar 2004: Wolfram quote
Dr. Attila Grandpierre, Konkoly Observatory, Hungary replies, 22 Jan 2004
Andreas Greven et al., eds., Entropy, ISBN: 0-691-11338-6 [promo] [Chapter 1.pdf], Princeton University Press, 2003. "We hope that these seemingly mysterious relations become clearer by reading through this book."
The Adjacent Possible -- Stuart Kauffman talks about "the need for a theory of organization," n 127, Edge, 3 Nov 2003
Is Intelligence a Biological Imperative?: Part IV, of a forum entitled, "The Drake Equation Revisited," held in Palo Alto, CA, 26 August 2003. In the discussion between Peter Ward and David Grinspoon, the latter invokes non-equilibrium thermodynamics to enable life to decrease its logical entropy.
Henry Gee, "Progressive evolution: Aspirational thinking," p 611 v 420, Nature, 12 Dec 2002. "Progressive evolution... stems from a profoundly idealistic, pre-evolutionary view of life." Gee agrees with Ruse.
"Claude Shannon, Mathematician, Dies at 84," The New York Times, 27 February 2001.
2000, November 23: Monad to Man, by Michael Ruse, about evolutionary progress.
2000, June 12: Ernst Mayr does not doubt evolutionary progress.

References

1. Harold J. Morowitz, Beginnings of Cellular Life: Metabolism Recapitulates Biogenesis, Yale University Press, 1992. p 69.
2. Thomas Kuhn, Black-Body Theory and the Quantum Discontinuity, 1894-1912, The University of Chicago Press, 1978. p 13.
3. Richard P. Feynman, Robert B. Leighton and Matthew Sands, The Feynman Lectures on Physics, v I; Reading, Massachusetts: Addison-Wesley Publishing Company, 1963. section 44-3.
4. Harvey S. Leff and Andrew F. Rex, Maxwell's Demon: Entropy, Information, Computing, Princeton University Press, 1990. p 6.
5. For technical discussions of this difference see The Maximum Entropy Formalism, Raphael D. Levine and Myron Tribus, eds., The MIT Press, 1979.
6. Myron Tribus and Edward C. McIrvine. "Energy and Information," p 179-188 v 225, Scientific American, September, 1971.
7. Richard P. Feynman, Robert B. Leighton and Matthew Sands, The Feynman Lectures on Physics, v I; Reading, Massachusetts: Addison-Wesley Publishing Company, 1963.
7.5. Peter Coveney and Roger Highfield, The Arrow of Time, Ballantine Books, 1990. p 176-177.
8. C. E. Shannon, "A Mathematical Theory of Communication" p 379-423 and 623-656, v 27, The Bell System Technical Journal, July, October, 1948. PostScript and pdf reprints are available.
9. Ernst Mayr, Toward a New Philosophy of Biology, Harvard University Press, 1988. p 251.
10. Charles Darwin, On the Origin of Species by Means of Natural Selection, or the Preservation of Favoured Races in the Struggle for Life. London: John Murray, Albemarle Street, 1859.
11. Lynn Margulis and Dorion Sagan, What Is Life? Simon and Schuster, 1995. p 44.
12. Stephen Jay Gould, [interviewed in] The Third Culture, by John Brockman, Simon and Schuster, 1995. p 52.
13. Richard Dawkins, [interviewed in] The Third Culture by John Brockman, Simon and Schuster, 1995. p 84.
14. John Maynard Smith and Eörs Szathmáry, The Major Transitions in Evolution, W.H. Freeman and Company Limited, 1995. p 4 (title of chapter 1.2).
15. Theodosius Dobzhansky, Studies in the Philosophy of Biology: Reduction and Related Problems, Francisco J. Ayala and Theodosius Dobzhansky, eds. University of California Press, 1974. p 311.
16. Edward O. Wilson, The Diversity of Life, Harvard University Press, 1992. p 187.
17. Philip Kitcher, Abusing Science, The MIT Press, 1982. p 90.
18. Tim M. Berra, Evolution and the Myth of Creationism: A Basic Guide to the Facts in the Evolution Debate, Stanford University Press, 1990. p 126.
19. Charles B. Thaxton, Walter L. Bradley and Roger L. Olsen, The Mystery of Life's Origin: Reassessing Current Theories, New York: Philosophical Library, 1984. p 136-142. The website has three online chapters.
19.5. Jeffrey S. Wicken, Evolution, Thermodynamics and Information: Extending the Darwinian Program, Oxford University Press, 1987. p 59.
20. Roger Penrose, The Emperor's New Mind, Oxford University Press, 1989. p 314.
21. Martin Goldstein and Inge F. Goldstein, The Refrigerator and the Universe: Understanding the Laws of Energy, Harvard University Press, 1993.
22. Daniel R. Brooks and E. O. Wiley, Evolution as Entropy, second edition; The University of Chicago Press, 1988. p 37-38.
23. Robert Rosen, Life Itself: A Comprehensive Inquiry Into the Nature, Origin and Fabrication of Life, Columbia University Press, 1991. p 114.
23.5. Hans Christian von Baeyer, Maxwell's Demon: Why Warmth Disperses and Time Passes [review by Rolf Landauer], Random House, 1998. p 165.
24. Ilya Prigogine, From Being To Becoming, New York: W. H. Freeman and Company, 1980. p 123.
25. John Horgan, "From Complexity to Perplexity," p 104-109, Scientific American, June 1995.
26. Hubert P. Yockey, Information theory and molecular biology, Cambridge University Press, 1992.
27. Christoph Adami, Introduction to Artificial Life, Telos (Springer-Verlag), 1998.
28. Paul Davies, The Fifth Miracle, Simon and Schuster, 1999.
29. Werner R. Loewenstein, The Touchstone of Life: Molecular Information, Cell Communication, and the Foundations of Life, Oxford University Press, 1999.
30. Peter Medawar, Pluto's Republic, Oxford University Press, 1984. p 226.
31. Rolf Landauer, "Minimal Energy Requirements in Communication," p 1914-1918 v 272, Science, 28 June 1996.