Why is naturalism considered scientific and creationism not?

Don't tell the evolutionists, but this is not evolution or getting more complex; this is simply a programmed cycle.

Since the evolutionists didn't bring up the 2nd law, your comment is rightly ridiculed. :lol:

Clearly you have given no example of my view being wrong concerning the 2nd law.
Yes he has, every time you have mentioned it. But as always, you're not man enough to admit you're wrong.

Here are more links proving you wrong:
A superior list of links to scientific views of evolution as well as to the whole spectrum of creationist sites is in http://www.talkorigins.org/origins/other-links.html

A scientific analysis of the problems creationists face in asserting that the second law is somehow an obstacle to evolution is http://www.talkorigins.org/faqs/thermo/probability.html

A brief, but very substantive, response to creationists’ attributing false implications to thermodynamics is http://www.talkorigins.org/faqs/thermo/creationism.html

A well-reasoned summary, "The Second Law of Thermodynamics in the Context of the Christian Faith," by a chemical physicist who is also a devout Christian, is on his Web page http://steamdoc.s5.com

For info from NASA on substances in space: http://www.astrochem.org
 
Your side has claimed evolution can't happen, because the 2nd Law says things tend toward less energy, less order. Clearly, as even photosynthesis shows, with added energy, more ordered, more complex, higher energy molecules can form.

Wasn't that your side's view of the 2nd Law?
If not, please explain what your view of the 2nd Law is.
 
Thank me for providing you ammunition and making my job harder. :lol:

I believe I stated it poorly in past posts. The creationist view is not that the 2nd Law prevents an increase in order, but that things actually have a normal tendency to go towards disorder.

The earth is a closed system, yes, but the sun produces forces on the earth that produce entropy. The energy from the sun needs to be organized to decrease entropy, or it will be random and increase entropy.

Now do you understand how random mutations would greatly affect evolution, or the origin of life for that matter?
 
The creationist view is not that the 2nd Law prevents an increase in order, but that things actually have a normal tendency to go towards disorder.

Without adding energy, they do.
So what? Why does that make evolution impossible?
Because energy is added to Earth, after all.
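Here's a rough back-of-the-envelope entropy budget (ballpark numbers, Python just for the arithmetic) showing how energy flowing from the sun through Earth lets local order increase while total entropy still goes up:

```python
# Rough, illustrative entropy budget for solar energy flowing through Earth.
# All numbers are ballpark assumptions, not measurements.
Q = 1000.0       # joules of sunlight absorbed and later re-radiated by Earth
T_sun = 5800.0   # effective temperature of sunlight, in kelvin (assumed)
T_earth = 288.0  # mean temperature at which Earth re-radiates, in kelvin (assumed)

dS_sun = -Q / T_sun      # entropy given up by the sun (small, because T is high)
dS_earth = Q / T_earth   # entropy dumped when Earth re-radiates (large, because T is low)
dS_total = dS_sun + dS_earth

print(f"total entropy change = {dS_total:.2f} J/K")  # about +3.3 J/K: positive
# Local processes such as photosynthesis can lower entropy by less than this
# margin without violating the 2nd Law for the universe as a whole.
```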

The earth is a closed system

No, it really isn't. And you should stop confusing closed and isolated.
It makes your "point" muddled.

but the sun produces forces on the earth that produce entropy.

Please explain further, in your own words.

The energy from the sun needs to be organized to decrease entropy or it will be random and increase entropy.

Your continued use of the word "organized" has nothing to do with your attempt to use physics in your argument. You should stop using it.

Now do you understand how random mutations would greatly affect evolution

I should think it was obvious that mutations are part of evolution.
Do you suppose anyone on "my side" of this issue has ever claimed otherwise?
If so, where?
If not, do you feel you're scoring some point here?
 
Todd

Just adding energy does not produce order. Evolution is solely a random process, and what happens with random processes in a system? They increase entropy. Yes, energy is added to earth, but unusable energy can and does produce harm.

Sorry, once again I called earth a closed system when it is actually an open system.

UV rays damage skin and our eyes. They can cause cancer and harmful mutations. The sun's energy also causes oxidation on paint, etc. Not all energy from the sun is organized, and these are some examples of damage caused by non-organized energy.

Wrong, because unusable energy, which is unorganized, can't do work, and that unusable energy increases entropy.

Random processes, such as random mutations, will increase entropy and disorder, not order and complexity. The point is random mutations do not do as your side claims.
 
Just adding energy does not produce order.

In the case of photosynthesis, it certainly does.

Evolution is solely a random process, and what happens with random processes in a system?

It apparently turned bacteria into you. The effort may have been wasted.

Not all energy from the sun is organized

None of the energy from the sun is organized.

Random processes, such as random mutations, will increase entropy and disorder, not order and complexity.


That's an interesting claim.

The point is random mutations do not do as your side claims.

That's another one.
 
Except that evolution is not a random process, because its primary forcing, natural selection, is not random.
 
Copying errors and breaks in DNA strands are not random?

Natural selection can produce either complexity or less-adapted organisms, and that is not random?
 
That would seem to be evidence for purposeful design. Where would we be without plants and the sun?

Bacteria into me, you can prove this how?

You're now contradicting yourself. You just pointed to photosynthesis as evidence. If that energy was not ordered so it could do work, it would also harm the plants.

Would you like to compare mutations by the numbers? All the beneficial mutations you can confirm vs. the harmful mutations that can be confirmed?
 
The argument for design is very similar to the argument for a finely tuned universe. The problem with both is that, for instance, there is no evidence that the universe is finely tuned for life and plenty for just the opposite, that life is finely tuned for the universe, having evolved in order to adapt to it. Not only is there no evidence for a finely tuned universe, if it has a divine designer, he should give up his day job, because, damn.

I mean, what engineer in his right mind would create an entertainment center right next to a sewer system?
 
That would seem to be evidence for purposeful design.

You skipped a few steps there. LOL!

Where would we be without plants and the sun?

Hungry and in the dark?

You're now contradicting yourself.

No, just contradicting you.

You just pointed to photosynthesis as evidence. If that energy was not ordered so it could do work

ESL? Photosynthesis shows you don't understand the 2nd Law.

Would you like to compare mutations by the numbers?

As soon as you admit your 2nd Law error, we can try to move the discussion forward.
 
I understand it; you, however, don't understand photosynthesis.

The Mystery of Life's Origin:
Reassessing Current Theories


CHAPTER 7
Thermodynamics of Living Systems
It is widely held that in the physical sciences the laws of thermodynamics have had a unifying effect similar to that of the theory of evolution in the biological sciences. What is intriguing is that the predictions of one seem to contradict the predictions of the other. The second law of thermodynamics suggests a progression from order to disorder, from complexity to simplicity, in the physical universe. Yet biological evolution involves a hierarchical progression to increasingly complex forms of living systems, seemingly in contradiction to the second law of thermodynamics. Whether this discrepancy between the two theories is only apparent or real is the question to be considered in the next three chapters. The controversy which is evident in an article published in the American Scientist 1 along with the replies it provoked demonstrates the question is still a timely one.
The First Law of Thermodynamics
Thermodynamics is an exact science which deals with energy. Our world seethes with transformations of matter and energy. Be these mechanical or chemical, the first law of thermodynamics---the principle of the Conservation of Energy---tells us that the total energy of the universe or any isolated part of it will be the same after any such transformation as it was before. A major part of the science of thermodynamics is accounting---giving an account of the energy of a system that has undergone some sort of transformation. Thus, we derive from the first law of thermodynamics that the change in the energy of a system (ΔE) is equal to the work done on (or by) the system (W) and the heat flow into (or out of) the system (Q). Mechanical work and energy are interchangeable, i.e., energy may be converted into mechanical work as in a steam engine, or mechanical work can be converted into energy as in the heating of a cannon which occurs as its barrel is bored. In mathematical terms (where the terms are as previously defined):

ΔE = Q + W (7-1)
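As a minimal illustration of the bookkeeping in equation 7-1 (the numbers below are arbitrary and only for the example):

```python
# Energy bookkeeping per eq. 7-1, dE = Q + W, taking W as work done ON the system.
Q = 500.0   # heat flowing into the system, joules (arbitrary)
W = 200.0   # work done on the system, joules (arbitrary)

dE = Q + W
print(f"dE = {dE} J")  # 700.0 J: the system's energy rises by exactly what it received
```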
The Second Law of Thermodynamics
The second law of thermodynamics describes the flow of energy in nature in processes which are irreversible. The physical significance of the second law of thermodynamics is that the energy flow in such processes is always toward a more uniform distribution of the energy of the universe. Anyone who has had to pay utility bills for long has become aware that too much of the warm air in his or her home during winter escapes to the outside. This flow of energy from the house to the cold outside in winter, or the flow of energy from the hot outdoors into the air-conditioned home in the summer, is a process described by the second law of thermodynamics. The burning of gasoline, converting energy "rich" compounds (hydrocarbons) into energy "lean" compounds, carbon dioxide (CO2) and water (H20), is a second illustration of this principle.

The concept of entropy (S) gives us a more quantitative way to describe the tendency for energy to flow in a particular direction. The entropy change for a system is defined mathematically as the flow of energy divided by the temperature, or,

ΔS ≥ Q / T (7-2)

where ΔS is the change in entropy, Q is the heat flow into or out of a system, and T is the absolute temperature in degrees Kelvin (K).

[Note: For a reversible flow of energy such as occurs under equilibrium conditions, the equality sign applies. For irreversible energy flow, the inequality applies.]

A Driving Force

If we consider heat flow from a warm house to the outdoors on a cold winter night, we may apply equation 7-2 as follows:

ΔST = ΔShouse + ΔSoutdoors = -Q / T1 + Q / T2 (7-3)

where ΔST is the total entropy change associated with this irreversible heat flow, T1 is the temperature inside the house, and T2 is the temperature outdoors. The negative sign of the first term notes loss of heat from the house, while the positive sign on the second term recognizes heat gained by the outdoors. Since it is warmer in the house than outdoors (T1 > T2), the total entropy will increase (ΔST > 0) as a result of this heat flow. If we turn off the heater in the house, it will gradually cool until the temperature approaches that of the outdoors, i.e., T1 = T2. When this occurs, the entropy change (ΔS) associated with heat flow (Q) goes to zero. Since there is no further driving force for heat flow to the outdoors, it ceases; equilibrium conditions have been established.

As this simple example shows, energy flow occurs in a direction that causes the total energy to be more uniformly distributed. If we think about it, we can also see that the entropy increase associated with such energy flow is proportional to the driving force for such energy flow to occur. The second law of thermodynamics says that the entropy of the universe (or any isolated system therein) is increasing; i.e., the energy of the universe is becoming more uniformly distributed.
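Putting illustrative numbers into equation 7-3 (indoor and outdoor temperatures assumed for the example) shows the sign of the total entropy change directly:

```python
# Entropy bookkeeping for heat leaking from a warm house to the cold outdoors (eq. 7-3).
Q = 10000.0   # joules of heat lost by the house (assumed)
T1 = 293.0    # indoor temperature, kelvin (about 20 C, assumed)
T2 = 263.0    # outdoor temperature, kelvin (about -10 C, assumed)

dS_total = -Q / T1 + Q / T2
print(f"dS_total = {dS_total:.2f} J/K")  # positive, because T1 > T2
# As T1 approaches T2 the result goes to zero: no driving force, equilibrium.
```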

It is often noted that the second law indicates that nature tends to go from order to disorder, from complexity to simplicity. If the most random arrangement of energy is a uniform distribution, then the present arrangement of the energy in the universe is nonrandom, since some matter is very rich in chemical energy, some in thermal energy, etc., and other matter is very poor in these kinds of energy. In a similar way, the arrangements of mass in the universe tend to go from order to disorder due to the random motion on an atomic scale produced by thermal energy. The diffusional processes in the solid, liquid, or gaseous states are examples of increasing entropy due to random atomic movements. Thus, increasing entropy in a system corresponds to increasingly random arrangements of mass and/or energy.

Entropy and Probability

There is another way to view entropy. The entropy of a system is a measure of the probability of a given arrangement of mass and energy within it. A statistical thermodynamic approach can be used to further quantify the system entropy. High entropy corresponds to high probability. As a random arrangement is highly probable, it would also be characterized by a large entropy. On the other hand, a highly ordered arrangement, being less probable, would represent a lower entropy configuration. The second law would tell us then that events which increase the entropy of the system require a change from more order to less order, or from less-random states to more-random states. We will find this concept helpful in Chapter 9 when we analyze condensation reactions for DNA and protein.

Clausius2, who formulated the second law of thermodynamics, summarizes the laws of thermodynamics in his famous concise statement: "The energy of the universe is constant; the entropy of the universe tends toward a maximum." The universe moves from its less probable current arrangement (low entropy) toward its most probable arrangement in which the energy of the universe will be more uniformly distributed.
Life and the Second Law of Thermodynamics
How does all of this relate to chemical evolution? Since the important macromolecules of living systems (DNA, protein, etc.) are more energy rich than their precursors (amino acids, heterocyclic bases, phosphates, and sugars), classical thermodynamics would predict that such macromolecules will not spontaneously form.

Roger Caillois has recently drawn this conclusion in saying, "Clausius and Darwin cannot both be right."3 This prediction of classical thermodynamics has, however, merely set the stage for refined efforts to understand life's origin. Harold Morowitz4 and others have suggested that the earth is not an isolated system, since it is open to energy flow from the sun. Nevertheless, one cannot simply dismiss the problem of the origin of organization and complexity in biological systems by a vague appeal to open-system non-equilibrium thermodynamics. The mechanisms responsible for the emergence and maintenance of coherent (organized) states must be defined. To clarify the role of mass and energy flow through a system as a possible solution to this problem, we will look in turn at the thermodynamics of (1) an isolated system, (2) a closed system, and (3) an open system. We will then discuss the application of open-system thermodynamics to living systems. In Chapter 8 we will apply the thermodynamic concepts presented in this chapter to the prebiotic synthesis of DNA and protein. In Chapter 9 this theoretical analysis will be used to interpret the various prebiotic synthesis experiments for DNA and protein, suggesting a physical basis for the uniform lack of success in synthesizing these crucial components for living cells.

Isolated Systems

An isolated system is one in which neither mass nor energy flows in or out. To illustrate such a system, think of a perfectly insulated thermos bottle (no heat loss) filled initially with hot tea and ice cubes. The total energy in this isolated system remains constant but the distribution of the energy changes with time. The ice melts and the energy becomes more uniformly distributed in the system. The initial distribution of energy into hot regions (the tea) and cold regions (the ice) is an ordered, nonrandom arrangement of energy, one not likely to be maintained for very long. By our previous definition then, we may say that the entropy of the system is initially low but gradually increases with time. Furthermore, the second law of thermodynamics says the entropy of the system will continue to increase until it attains some maximum value, which corresponds to the most probable state for the system, usually called equilibrium.

In summary, isolated systems always maintain constant total energy while tending toward maximum entropy, or disorder. In mathematical terms,

ΔE / Δt = 0 (isolated system)

ΔS / Δt ≥ 0 (7-4)

where ΔE and ΔS are the changes in the system energy and system entropy, respectively, for a time interval Δt. Clearly the emergence of order of any kind in an isolated system is not possible. The second law of thermodynamics says that an isolated system always moves in the direction of maximum entropy and, therefore, disorder.

It should be noted that the process just described is irreversible in the sense that once the ice is melted, it will not reform in the thermos. As a matter of fact, natural decay and the general tendency toward greater disorder are so universal that the second law of thermodynamics has been appropriately dubbed "time's arrow."5

Closed Systems near Equilibrium

A closed system is one in which the exchange of energy with the outside world is permitted but the exchange of mass is not. Along the boundary between the closed system and the surroundings, the temperature may be different from the system temperature, allowing energy flow into or out of the system as it moves toward equilibrium. If the temperature along the boundary is variable (in position but not time), then energy will flow through the system, maintaining it some distance from equilibrium. We will discuss closed systems near equilibrium first, followed by a discussion of closed systems removed from equilibrium next.

If we combine the first and second laws as expressed in equations 7-1 and 7-2 and replace the mechanical work term W by PΔV, where P is pressure and ΔV is volume change, we obtain,

[NOTE: Volume expansion (ΔV > 0) corresponds to the system doing work, and therefore losing energy. Volume contraction (ΔV < 0) corresponds to work being done on the system].

ΔS ≥ [ΔE + PΔV] / T (7-5)

Algebraic manipulation gives

ΔE + PΔV - TΔS ≤ 0 or ΔG ≤ 0 (7-6)

where

ΔG = ΔE + PΔV - TΔS

The term on the left side of the inequality in equation 7-6 is called the change in the Gibbs free energy (ΔG). It may be thought of as a thermodynamic potential which describes the tendency of a system to change---e.g., the tendency for phase changes, heat conduction, etc. to occur. If a reaction occurs spontaneously, it is because it brings a decrease in the Gibbs free energy (ΔG ≤ 0). This requirement is equivalent to the requirement that the entropy of the universe increase. Thus, like an increase in entropy, a decrease in Gibbs free energy simply means that a system and its surroundings are changing in such a way that the energy of the universe is becoming more uniformly distributed.

We may summarize then by noting that the second law of thermodynamics requires,

ΔG / Δt ≤ 0 (closed system) (7-7)

where Δt indicates the time period during which the Gibbs free energy changed.

The approach to equilibrium is characterized by,

ΔG / Δt → 0 (closed system) (7-8)

The physical significance of equation 7-7 can be understood by rewriting equations 7-6 and 7-7 in the following form:

[ΔS / Δt] - [1 / T (ΔE / Δt + P ΔV / Δt)] ≥ 0 (7-9)

or

(ΔS / Δt) - (1 / T)(ΔH / Δt) ≥ 0

and noting that the first term represents the entropy change due to processes going on within the system and the second term represents the entropy change due to exchange of mechanical and/or thermal energy with the surroundings. This simply guarantees that the sum of the entropy change in the system and the entropy change in the surroundings will be greater than zero; i.e., the entropy of the universe must increase. For the isolated system, ΔE + PΔV = 0 and equation 7-9 reduces to equation 7-4.

A simple illustration of this principle is seen in phase changes such as water transforming into ice. As ice forms, energy (80 calories/gm) is liberated to the surroundings. The change in the entropy of the system as the amorphous water becomes crystalline ice is -0.293 entropy units (eu)/degree Kelvin (K). The entropy change is negative because the thermal and configurational entropy (or disorder) of water is greater than that of ice, which is a highly ordered crystal.

[NOTE: Configurational entropy measures randomness in the distribution of matter in much the same way that thermal entropy measures randomness in the distribution of energy].

Thus, the thermodynamic conditions under which water will transform to ice are seen from equation 7-9 to be:

-0.293 - (-80 / T) > 0 (7-10a)

or

T < 273°K (7-10b)

For conditions of T < 273°K, energy is removed from water to produce ice, and the aggregate disordering of the surroundings is greater than the ordering of the water into ice crystals. This gives a net increase in the entropy of the universe, as predicted by the second law of thermodynamics.
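The water-to-ice numbers can be checked directly against equation 7-9, using the values quoted in the text (entropy change -0.293 and heat of fusion 80 cal/gm):

```python
# Check of eq. 7-10: for which temperatures does freezing increase total entropy?
dS_system = -0.293   # entropy change of the water as it freezes (value from the text)
dH = -80.0           # heat released to the surroundings on freezing, cal/gm (from the text)

def total_entropy_change(T):
    # eq. 7-9 form: dS_system - dH / T must be > 0 for the change to proceed
    return dS_system - dH / T

for T in (263.0, 273.0, 283.0):
    print(T, round(total_entropy_change(T), 4))
# Positive below about 273 K and negative above it: freezing is spontaneous
# only below 0 C, exactly the condition in eq. 7-10b.
```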

It has often been argued by analogy to water crystallizing to ice that simple monomers may polymerize into complex molecules such as protein and DNA. The analogy is clearly inappropriate, however. The ΔE + PΔV term (equation 7-9) in the polymerization of important organic molecules is generally positive (5 to 8 kcal/mole), indicating the reaction can never spontaneously occur at or near equilibrium.

[NOTE: If ΔE + PΔV is positive, the second term in eq. 7-9 must be negative due to the negative sign which precedes it. The inequality can only be satisfied by ΔS being sufficiently positive, which implies disordering].

By contrast, the ΔE + PΔV term in water changing to ice is negative, -1.44 kcal/mole, indicating the phase change is spontaneous as long as T < 273°K, as previously noted. The atomic bonding forces draw water molecules into an orderly crystalline array when the thermal agitation (or entropy driving force, TΔS) is made sufficiently small by lowering the temperature. Organic monomers such as amino acids resist combining at all at any temperature, however, much less in some orderly arrangement.

Morowitz6 has estimated the increase in the chemical bonding energy as one forms the bacterium Escherichia coli from simple precursors to be 0.0095 erg, or an average of 0.27 eV/atom for the 2 x 10^10 atoms in a single bacterial cell. This would be thermodynamically equivalent to having water in your bathtub spontaneously heat up to 360°C, happily a most unlikely event. He goes on to estimate the probability of the spontaneous formation of one such bacterium in the entire universe in five billion years under equilibrium conditions to be 10^(-10^11). Morowitz summarizes the significance of this result by saying that "if equilibrium processes alone were at work, the largest possible fluctuation in the history of the universe is likely to have been no longer than a small peptide."7 Nobel Laureate I. Prigogine et al., have noted with reference to the same problem that:

The probability that at ordinary temperatures a macroscopic number of molecules is assembled to give rise to the highly ordered structures and to the coordinated functions characterizing living organisms is vanishingly small. The idea of spontaneous genesis of life in its present form is therefore highly improbable, even on the scale of billions of years during which prebiotic evolution occurred.8

It seems safe to conclude that systems near equilibrium (whether isolated or closed) can never produce the degree of complexity intrinsic in living systems. Instead, they will move spontaneously toward maximizing entropy, or randomness. Even the postulate of long time periods does not solve the problem, as "time's arrow" (the second law of thermodynamics) points in the wrong direction; i.e., toward equilibrium. In this regard, H.F. Blum has observed:

The second law of thermodynamics would have been a dominant directing factor in this case [of chemical evolution]; the reactions involved tending always toward equilibrium, that is, toward less free energy, and, in an inclusive sense, greater entropy. From this point of view the lavish amount of time available should only have provided opportunity for movement in the direction of equilibrium.9 (Emphasis added.)

Thus, reversing "time's arrow" is what chemical evolution is all about, and this will not occur in isolated or closed systems near equilibrium.

The possibilities are potentially more promising, however, if one considers a system subjected to energy flow which may maintain it far from equilibrium, and its associated disorder. Such a system is said to be a constrained system, in contrast to a system at or near equilibrium which is unconstrained. The possibilities for ordering in such a system will be considered next.

Closed Systems Far from Equilibrium

Energy flow through a system is equivalent to doing work continuously on the system to maintain it some distance from equilibrium. Nicolis and Prigogine10 have suggested that the entropy change (ΔS) in a system for a time interval (Δt) may be divided into two components.

ΔS = ΔSe + ΔSi (7-11)

where ΔSe is the entropy flux due to energy flow through the system, and ΔSi is the entropy production inside the system due to irreversible processes such as diffusion, heat conduction, heat production, and chemical reactions. We will note when we discuss open systems in the next section that ΔSe includes the entropy flux due to mass flow through the system as well. The second law of thermodynamics requires,

ΔSi ≥ 0 (7-12)

In an isolated system, ΔSe = 0 and equations 7-11 and 7-12 give,

ΔS = ΔSi ≥ 0 (7-13)

Unlike ΔSi, ΔSe in a closed system does not have a definite sign, but depends entirely on the boundary constraints imposed on the system. The total entropy change in the system can be negative (i.e., ordering within system) when,

ΔSe < 0 and | ΔSe | > ΔSi (7-14)

Under such conditions a state that would normally be highly improbable under equilibrium conditions can be maintained indefinitely. It would be highly unlikely (i.e., statistically just short of impossible) for a disconnected water heater to produce hot water. Yet when the gas is connected and the burner lit, the system is constrained by energy flow and hot water is produced and maintained indefinitely as long as energy flows through the system.
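A small numerical sketch of equations 7-11 and 7-14 (the two entropy values are simply assumed) shows how a system's own entropy can fall without violating the second law:

```python
# Eq. 7-11: dS = dSe + dSi. The 2nd law only constrains dSi to be >= 0.
dSi = 2.0    # entropy produced inside the system (assumed; always >= 0)
dSe = -5.0   # entropy flux due to energy flow through the system (assumed negative)

dS = dSe + dSi
print(f"dS = {dS}")  # -3.0: the system becomes more ordered
# This is permitted because |dSe| > dSi (eq. 7-14); the surroundings gain
# at least as much entropy as the system loses.
```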

An open system offers an additional possibility for ordering---that of maintaining a system far from equilibrium via mass flow through the system, as will be discussed in the next section.

An open system is one which exchanges both energy and mass with the surroundings. It is well illustrated by the familiar internal combustion engine. Gasoline and oxygen are passed through the system, combusted, and then released as carbon dioxide and water. The energy released by this mass flow through the system is converted into useful work; namely, torque supplied to the wheels of the automobile. A coupling mechanism is necessary, however, to allow the released energy to be converted into a particular kind of work. In an analogous way the dissipative (or disordering) processes within an open system can be offset by a steady supply of energy to provide for (ΔSe) type work. Equation 7-11, applied earlier to closed systems far from equilibrium, may also be applied to open systems. In this case, the ΔSe term represents the negative entropy, or organizing work done on the system as a result of both energy and mass flow through the system. This work done to the system can move it far from equilibrium, maintaining it there as long as the mass and/or energy flow are not interrupted. This is an essential characteristic of living systems as will be seen in what follows.
Thermodynamics of Living Systems
Living systems are composed of complex molecular configurations whose total bonding energy is less negative than that of their chemical precursors (e.g., Morowitz's estimate of ΔE = 0.27 eV/atom) and whose thermal and configurational entropies are also less than that of their chemical precursors. Thus, the Gibbs free energy of living systems (see equation 7-6) is quite high relative to the simple compounds from which they are formed. The formation and maintenance of living systems at energy levels well removed from equilibrium requires continuous work to be done on the system, even as maintenance of hot water in a water heater requires that continuous work be done on the system. Securing this continuous work requires energy and/or mass flow through the system, apart from which the system will return to an equilibrium condition (lowest Gibbs free energy, see equations 7-7 and 7-8) with the decomposition of complex molecules into simple ones, just as the hot water in our water heater returns to room temperature once the gas is shut off.

In living plants, the energy flow through the system is supplied principally by solar radiation. In fact, leaves provide relatively large surface areas per unit volume for most plants, allowing them to "capture" the necessary solar energy to maintain themselves far from equilibrium. This solar energy is converted into the necessary useful work (negative ΔSe in equation 7-11) to maintain the plant in its complex, high-energy configuration by a complicated process called photosynthesis. Mass, such as water and carbon dioxide, also flows through plants, providing necessary raw materials, but not energy. In collecting and storing useful energy, plants serve the entire biological world.

For animals, energy flow through the system is provided by eating high energy biomass, either plant or animal. The breaking down of this energy-rich biomass, and the subsequent oxidation of part of it (e.g., carbohydrates), provides a continuous source of energy as well as raw materials. If plants are deprived of sunlight or animals of food, dissipation within the system will surely bring death. Maintenance of the complex, high-energy condition associated with life is not possible apart from a continuous source of energy. A source of energy alone is not sufficient, however, to explain the origin or maintenance of living systems. The additional crucial factor is a means of converting this energy into the necessary useful work to build and maintain complex living systems from the simple biomonomers that constitute their molecular building blocks.

An automobile with an internal combustion engine, transmission, and drive chain provides the necessary mechanism for converting the energy in gasoline into comfortable transportation. Without such an "energy converter," however, obtaining transportation from gasoline would be impossible. In a similar way, food would do little for a man whose stomach, intestines, liver, or pancreas were removed. Without these, he would surely die even though he continued to eat. Apart from a mechanism to couple the available energy to the necessary work, high-energy biomass is insufficient to sustain a living system far from equilibrium. In the case of living systems such a coupling mechanism channels the energy along specific chemical pathways to accomplish a very specific type of work. We therefore conclude that, given the availability of energy and an appropriate coupling mechanism, the maintenance of a living system far from equilibrium presents no thermodynamic problems.

In mathematical formalism, these concepts may be summarized as follows:

(1) The second law of thermodynamics requires only that the entropy production due to irreversible processes within the system be greater than zero; i.e.,

ΔSi > 0 (7-15)

(2) The maintenance of living systems requires that the energy flow through the system be of sufficient magnitude that the negative entropy production rate (i.e., useful work rate) that results be greater than the rate of dissipation that results from irreversible processes going on within the systems; i.e.,

| ΔSe | > ΔSi (7-16)

(3) The negative entropy generation must be coupled into the system in such a way that the resultant work done is directed toward restoration of the system from the disintegration that occurs naturally and is described by the second law of thermodynamics; i.e.,

-ΔSe = ΔSi (7-17)

where ΔSe and ΔSi refer not only to the magnitude of entropy change but also to the specific changes that occur in the system associated with this change in entropy. The coupling must produce not just any kind of ordering but the specific kind required by the system.

While the maintenance of living systems is easily rationalized in terms of thermodynamics, the origin of such living systems is quite another matter. Though the earth is open to energy flow from the sun, the means of converting this energy into the necessary work to build up living systems from simple precursors remains at present unspecified (see equation 7-17). The "evolution" from biomonomers to fully functioning cells is the issue. Can one make the incredible jump in energy and organization from raw material and raw energy, apart from some means of directing the energy flow through the system? In Chapters 8 and 9 we will consider this question, limiting our discussion to two small but crucial steps in the proposed evolutionary scheme, namely, the formation of protein and DNA from their precursors.

It is widely agreed that both protein and DNA are essential for living systems and indispensable components of every living cell today.11 Yet they are only produced by living cells. Both types of molecules are much more energy and information rich than the biomonomers from which they form. Can one reasonably predict their occurrence given the necessary biomonomers and an energy source? Has this been verified experimentally? These questions will be considered in Chapters 8 and 9.

 
Todd.

The Mystery of Life's Origin:
Reassessing Current Theories


CHAPTER 8
Thermodynamics and the Origin of Life
Peter Molton has defined life as "regions of order which use energy to maintain their organization against the disruptive force of entropy."1 In Chapter 7 it has been shown that energy and/or mass flow through a system can constrain it far from equilibrium, resulting in an increase in order. Thus, it is thermodynamically possible to develop complex living forms, assuming the energy flow through the system can somehow be effective in organizing the simple chemicals into the complex arrangements associated with life.

In existing living systems, the coupling of the energy flow to the organizing "work" occurs through the metabolic motor of DNA, enzymes, etc. This is analogous to an automobile converting the chemical energy in gasoline into mechanical torque on the wheels. We can give a thermodynamic account of how life's metabolic motor works. The origin of the metabolic motor (DNA, enzymes, etc.) itself, however, is more difficult to explain thermodynamically, since a mechanism of coupling the energy flow to the organizing work is unknown for prebiological systems. Nicolis and Prigogine summarize the problem in this way:

Needless to say, these simple remarks cannot suffice to solve the problem of biological order. One would like not only to establish that the second law (dSi ≥ 0) is compatible with a decrease in overall entropy (dS < 0), but also to indicate the mechanisms responsible for the emergence and maintenance of coherent states.2

Without a doubt, the atoms and molecules which comprise living cells individually obey the laws of chemistry and physics, including the laws of thermodynamics. The enigma is the origin of so unlikely an organization of these atoms and molecules. The electronic computer provides a striking analogy to the living cell. Each component in a computer obeys the laws of electronics and mechanics. The key to the computer's marvel lies, however, in the highly unlikely organization of the parts which harness the laws of electronics and mechanics. In the computer, this organization was specially arranged by the designers and builders and continues to operate (with occasional frustrating lapses) through the periodic maintenance of service engineers.

Living systems have even greater organization. The problem then, that molecular biologists and theoretical physicists are addressing, is how the organization of living systems could have arisen spontaneously. Prigogine et al., have noted:

All these features bring the scientist a wealth of new problems. In the first place, one has systems that have evolved spontaneously to extremely organized and complex forms. Coherent behavior is really the characteristic feature of biological systems.3

In this chapter we will consider only the problem of the origin of living systems. Specifically, we will discuss the arduous task of using simple biomonomers to construct complex polymers such as DNA and protein by means of thermal, electrical, chemical, or solar energy. We will first specify the nature and magnitude of the "work" to be done in building DNA and enzymes.

[NOTE: Work in physics normally refers to force times displacement. In this chapter it refers in a more general way to the change in Gibbs free energy of the system that accompanies the polymerization of monomers into polymers].

In Chapter 9 we will describe the various theoretical models which attempt to explain how the undirected flow of energy through simple chemicals can accomplish the work necessary to produce complex polymers. Then we will review the experimental studies that have been conducted to test these models. Finally we will summarize the current understanding of this subject.

How can we specify in a more precise way the work to be done by energy flow through the system to synthesize DNA and protein from simple biomonomers? While the origin of living systems involves more than the genesis of enzymes and DNA, these components are essential to any system if replication is to occur. It is generally agreed that natural selection can act only on systems capable of replication. This being the case, the formation of a DNA/enzyme system by processes other than natural selection is a necessary (though not sufficient) part of a naturalistic explanation for the origin of life.

[NOTE: A sufficient explanation for the origin of life would also require a model for the formation of other critical cellular components, including membranes, and their assembly].

Order vs. Complexity in the Question of Information
Only recently has it been appreciated that the distinguishing feature of living systems is complexity rather than order.4 This distinction has come from the observation that the essential ingredients for a replicating system---enzymes and nucleic acids---are all information-bearing molecules. In contrast, consider crystals. They are very orderly, spatially periodic arrangements of atoms (or molecules) but they carry very little information. Nylon is another example of an orderly, periodic polymer (a polyamide) which carries little information. Nucleic acids and protein are aperiodic polymers, and this aperiodicity is what makes them able to carry much more information. By definition then, a periodic structure has order. An aperiodic structure has complexity. In terms of information, periodic polymers (like nylon) and crystals are analogous to a book in which the same sentence is repeated throughout. The arrangement of "letters" in the book is highly ordered, but the book contains little information since the information presented---the single word or sentence---is highly redundant.

It should be noted that aperiodic polypeptides or polynucleotides do not necessarily represent meaningful information or biologically useful functions. A random arrangement of letters in a book is aperiodic but contains little if any useful information since it is devoid of meaning.

[NOTE: H.P. Yockey, personal communication, 9/29/82. Meaning is extraneous to the sequence, arbitrary, and depends on some symbol convention. For example, the word "gift" means a present in English, means poison in German, and is meaningless in French].

Only certain sequences of letters correspond to sentences, and only certain sequences of sentences correspond to paragraphs, etc. In the same way only certain sequences of amino acids in polypeptides and bases along polynucleotide chains correspond to useful biological functions. Thus, informational macromolecules may be described as being aperiodic and in a specified sequence.5 Orgel notes:

Living organisms are distinguished by their specified complexity. Crystals fail to qualify as living because they lack complexity; mixtures of random polymers fail to qualify because they lack specificity.6

Three sets of letter arrangements show nicely the difference between order and complexity in relation to information:

1. An ordered (periodic) and therefore specified arrangement:

THE END THE END THE END THE END

Example: Nylon, or a crystal.

[NOTE: Here we use "THE END" even though there is no reason to suspect that nylon or a crystal would carry even this much information. Our point, of course, is that even if they did, the bit of information would be drowned in a sea of redundancy].


2. A complex (aperiodic) unspecified arrangement:

AGDCBFE GBCAFED ACEDFBG

Example: Random polymers (polypeptides).

3. A complex (aperiodic) specified arrangement:

THIS SEQUENCE OF LETTERS CONTAINS A MESSAGE!

Example: DNA, protein.

Yockey7 and Wickens5 develop the same distinction, that "order" is a statistical concept referring to regularity such as might characterize a series of digits in a number, or the ions of an inorganic crystal. On the other hand, "organization" refers to physical systems and the specific set of spatio-temporal and functional relationships among their parts. Yockey and Wickens note that informational macromolecules have a low degree of order but a high degree of specified complexity. In short, the redundant order of crystals cannot give rise to specified complexity of the kind or magnitude found in biological organization; attempts to relate the two have little future.
Information and Entropy
There is a general relationship between information and entropy. This is fortunate because it allows an analysis to be developed in the formalism of classical thermodynamics, giving us a powerful tool for calculating the work to be done by energy flow through the system to synthesize protein and DNA (if indeed energy flow is capable of producing information). The information content in a given sequence of units, be they digits in a number, letters in a sentence, or amino acids in a polypeptide or protein, depends on the minimum number of instructions needed to specify or describe the structure. Many instructions are needed to specify a complex, information-bearing structure such as DNA. Only a few instructions are needed to specify an ordered structure such as a crystal. In this case we have a description of the initial sequence or unit arrangement which is then repeated ad infinitum according to the packing instructions.

Orgel9 illustrates the concept in the following way. To describe a crystal, one would need only to specify the substance to be used and the way in which the molecules were to be packed together. A couple of sentences would suffice, followed by the instructions "and keep on doing the same," since the packing sequence in a crystal is regular. The description would be about as brief as specifying a DNA-like polynucleotide with a random sequence. Here one would need only to specify the proportions of the four nucleotides in the final product, along with instructions to assemble them randomly. The chemist could then make the polymer with the proper composition but with a random sequence.

It would be quite impossible to produce a correspondingly simple set of instructions that would enable a chemist to synthesize the DNA of an E. coli bacterium. In this case the sequence matters. Only by specifying the sequence letter-by-letter (about 4,000,000 instructions) could we tell a chemist what to make. Our instructions would occupy not a few short sentences, but a large book instead!
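Orgel's comparison can be caricatured with a toy description-length example (the strings and the informal "recipe" format below are made up purely for illustration):

```python
# Toy illustration: a periodic string has a short recipe, while a specified
# aperiodic string must be spelled out symbol by symbol.
periodic = "THE END " * 1000                                  # like a crystal or nylon
specified = "THIS SEQUENCE OF LETTERS CONTAINS A MESSAGE!"    # like DNA or protein

recipe_periodic = 'repeat "THE END " 1000 times'   # a few words describe the whole string
recipe_specified = specified                       # no recipe shorter than the sequence itself

print(len(periodic), len(recipe_periodic))     # 8000 characters vs. a ~30-character recipe
print(len(specified), len(recipe_specified))   # the recipe is as long as the message
```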

Brillouin,10 Schrodinger,11 and others12 have developed both qualitative and quantitative relationships between information and entropy. Brillouin13 states that the entropy of a system is given by

S = k ln Ω (8-1)

where S is the entropy of the system, k is Boltzmann's constant, and Ω corresponds to the number of ways the energy and mass in a system may be arranged.

We will use Sth and Sc to refer to the thermal and configurational entropies, respectively. Thermal entropy, Sth, is associated with the distribution of energy in the system. Configurational entropy Sc is concerned only with the arrangement of mass in the system, and, for our purposes, we shall be especially interested in the sequencing of amino acids in polypeptides (or proteins) or of nucleotides in polynucleotides (e.g., DNA). The symbols Ωth and Ωc refer to the number of ways energy and mass, respectively, may be arranged in a system.

Thus we may be more precise by writing

S = k ln (Ωth Ωc) = k ln Ωth + k ln Ωc = Sth + Sc (8-2a)

where

Sth = k ln Ωth (8-2b)

and

Sc = k ln Ωc (8-2c)

Determining Information: From a Random Polymer to an Informed Polymer

If we want to convert a random polymer into an informational molecule, we can determine the increase in information (as defined by Brillouin) by finding the difference between the negatives of the entropy states for the initial random polymer and the informational molecule:

I = -(Scm - Scr) (8-3a),

I = Scr - Scm (8-3b),

= k ln Ωcr - k ln Ωcm (8-3c)

In this equation, I is a measure of the information content of an aperiodic (complex) polymer with a specified sequence, Scm represents the configurational "coding" entropy of this polymer informed with a given message, and Scr represents the configurational entropy of the same polymer for an unspecified or random sequence.

[NOTE: Yockey and Wickens define information slightly differently than Brillouin, whose definition we use in our analysis. The difference is unimportant insofar as our analysis here is concerned].

Note that the information in a sequence-specified polymer is maximized when the mass in the molecule could be arranged in many different ways, only one of which communicates the intended message. (There is a large Scr from eq. 8-2c since Ωcr is large, yet Scm = 0 from eq. 8-2c since Ωcm = 1.) The information carried in a crystal is small because Sc is small (eq. 8-2c) for a crystal. There simply is very little potential for information in a crystal because its matter can be distributed in so few ways. The random polymer provides an even starker contrast. It bears no information because Scr, although large, is equal to Scm (see eq. 8-3b).

In summary, equations 8-2c and 8-3c quantify the notion that only specified, aperiodic macromolecules are capable of carrying the large amounts of information characteristic of living systems. Later we will calculate Ωc for both random and specified polymers so that the configurational entropy change required to go from a random to a specified polymer can be determined. In the next section we will consider the various components of the total work required in the formation of macromolecules such as DNA and protein.
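To make equations 8-2c and 8-3c concrete, here is a small numerical sketch for a hypothetical short polypeptide (the 10-residue length and 20-amino-acid alphabet are assumptions of the example, not values from the text):

```python
import math

k = 1.380649e-23   # Boltzmann's constant, J/K

N = 10             # residues in the hypothetical polypeptide
alphabet = 20      # amino acid choices per position

omega_cr = alphabet ** N   # number of possible random sequences
omega_cm = 1               # only one sequence carries the intended "message"

S_cr = k * math.log(omega_cr)   # eq. 8-2c for the random polymer
S_cm = k * math.log(omega_cm)   # zero: the specified sequence allows one arrangement

I = S_cr - S_cm                 # eq. 8-3b/c: information gained by specifying the sequence
print(f"I = {I:.3e} J/K")       # roughly 4.1e-22 J/K for this toy case
```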
DNA and Protein Formation:
Defining the Work

There are three distinct components of work to be done in assembling simple biomonomers into a complex (or aperiodic) linear polymer with a specified sequence as we find in DNA or protein. The change in the Gibbs free energy, ΔG, of the system during polymerization defines the total work that must be accomplished by energy flow through the system. The change in Gibbs free energy has previously been shown to be

ΔG = ΔE + PΔV - TΔS (8-4a)

or

ΔG = ΔH - TΔS (8-4b)

where a decrease in Gibbs free energy for a given chemical reaction near equilibrium guarantees an increase in the entropy of the universe as demanded by the second law of thermodynamics.

Now consider the components of the Gibbs free energy (eq. 8-4b) where the change in enthalpy (ΔH) is principally the result of changes in the total bonding energy (ΔE), with the (PΔV) term assumed to be negligible. We will refer to this enthalpy component (ΔH) as the chemical work. A further distinction will be helpful. The change in the entropy (ΔS) that accompanies the polymerization reaction may be divided into two distinct components which correspond to the changes in the thermal energy distribution (ΔSth) and the mass distribution (ΔSc), eq. 8-2. So we can rewrite eq. 8-4b as

ΔG = ΔH - TΔSth - TΔSc (8-5)

that is,

(Gibbs free energy) = (Chemical work) - (Thermal entropy work) - (Configurational entropy work)

It will be shown that polymerization of macromolecules results in a decrease in the thermal and configurational entropies (ΔSth < 0, ΔSc < 0). These terms effectively increase ΔG, and thus represent additional components of work to be done beyond the chemical work.

Consider the case of the formation of protein or DNA from biomonomers in a chemical soup. For computational purposes it may be thought of as requiring two steps: (1) polymerization to form a chain molecule with an aperiodic but near-random sequence, and (2) rearrangement to an aperiodic, specified information-bearing sequence.

[NOTE: Some intersymbol influence arising from differential atomic bonding properties makes the distribution of matter not quite random. (H.P. Yockey, 1981. J. Theoret. Biol. 91,13)].

The entropy change (ΔS) associated with the first step is essentially all thermal entropy change (ΔSth), as discussed above. The entropy change of the second step is essentially all configurational entropy change (ΔSc), and it reduces the entropy. In fact, as previously noted, the change in configurational entropy, ΔSc = ΔSc "coding", as one goes from a random arrangement (Scr) to a specified sequence (Scm) in a macromolecule is numerically equal to the negative of the information content of the molecule as defined by Brillouin (see eq. 8-3a).

In summary, the formation of complex biological polymers such as DNA and protein involves changes in the chemical energy, ΔH, the thermal entropy, ΔSth, and the configurational entropy, ΔSc, of the system. Determining the magnitudes of these individual changes using experimental data and a few calculations will allow us to quantify the magnitude of the required work potentially to be done by energy flow through the system in synthesizing macromolecules such as DNA and protein.

Quantifying the Various Components of Work

1. Chemical Work

The polymerization of amino acids to polypeptides (protein) or of nucleotides to polynucleotides (DNA) occurs through condensation reactions. One may calculate the enthalpy change in the formation of a dipeptide from amino acids to be 5-8 kcal/mole for a variety of amino acids, using data compiled by Hutchens.14 Thus, chemical work must be done on the system to get polymerization to occur. Morowitz15 has estimated more generally that the chemical work, or average increase in enthalpy, for macromolecule formation in living systems is 16.4 cal/gm. Elsewhere in the same book he says that the average increase in bonding energy in going from simple compounds to an E. coli bacterium is 0.27 eV/atom. One can easily see that chemical work must be done on the biomonomers to bring about the formation of macromolecules like those that are essential to living systems. By contrast, amino acid formation from simple reducing atmosphere gases (methane, ammonia, water) has an associated enthalpy change (ΔH) of -50 kcal/mole to -250 kcal/mole,16 which means energy is released rather than consumed. This explains why amino acids form with relative ease in prebiotic simulation experiments. On the other hand, forming amino acids from less-reducing conditions (i.e., carbon dioxide, nitrogen, and water) is known to be far more difficult experimentally. This is because the enthalpy change (ΔH) is positive, meaning energy is required to drive the energetically unfavorable chemical reaction forward.
2. Thermal Entropy Work
Wickens17 has noted that polymerization reactions will reduce the number of ways the translational energy may be distributed, while generally increasing the possibilities for vibrational and rotational energy. A net decrease results in the number of ways the thermal energy may be distributed, giving a decrease in the thermal entropy according to eq. 8-2b (i.e., ΔSth < 0). Quantifying the magnitude of this decrease in thermal entropy (ΔSth) associated with the formation of a polypeptide or a polynucleotide is best accomplished using experimental results.

Morowitz18 has estimated that the average decrease in thermal entropy that occurs during the formation of macromolecules of living systems is 0.218 cal/deg-gm, or 65 cal/gm at 298 K. Recent work by Armstrong et al.,19 for nucleotide oligomerization of up to a pentamer indicates ΔH and -TΔSth values of 11.8 kcal/mole and 15.6 kcal/mole respectively, at 294 K. Thus the decrease in thermal entropy during the polymerization of the macromolecules of life increases the Gibbs free energy and the work required to make these molecules, i.e., -TΔSth > 0.
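For anyone who wants to check that arithmetic, here is a minimal Python sketch. Only the 0.218 cal/deg-gm and 298 K figures come from the text; the script and its variable names are just an illustration.

```python
# Cross-check of the thermal entropy work figure quoted above
delta_S_th = -0.218      # cal/(deg-gm): estimated average decrease in thermal entropy (from the text)
T = 298.0                # K
print(-T * delta_S_th)   # ~65 cal/gm of thermal entropy work, matching the text
```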
3. Configurational Entropy Work
Finally, we need to quantify the configurational entropy change (ΔSc) that accompanies the formation of DNA and protein. Here we will not get much help from standard experiments in which the equilibrium constants are determined for a polymerization reaction at various temperatures. Such experiments do not consider whether a specific sequence is achieved in the resultant polymers, but only the concentrations of randomly sequenced polymers (i.e., polypeptides) formed. Consequently, they do not measure the configurational entropy (ΔSc) contribution to the total entropy change (ΔS). However, the magnitude of the configurational entropy change associated with sequencing the polymers can be calculated.

Using the definition for configurational "coding" entropy given in eq. 8-2c, it is quite straightforward to calculate the configurational entropy change for a given polymer. The number of ways the mass of the linear system may be arranged (Ωc) can be calculated using statistics. Brillouin20 has shown that the number of distinct sequences one can make using N different symbols and Fermi-Dirac statistics is given by

Ω = N! (8-6)

If some of these symbols are redundant (or identical), then the number of unique or distinguishable sequences that can be made is reduced to

Ωc = N! / (n1! n2! ... ni!) (8-7)

where n1 + n2 + ... + ni = N and i defines the number of distinct symbols. For a protein, it is i = 20, since a subset of twenty distinctive types of amino acids is found in living things, while in DNA it is i = 4 for the subset of four distinctive nucleotides. A typical protein would have 100 to 300 amino acids in a specific sequence, or N = 100 to 300. For DNA of the bacterium E. coli, N = 4,000,000. In Appendix 1, alternative approaches to calculating Ωc are considered and eq. 8-7 is shown to be a lower bound to the actual value.
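Equation 8-7 is easy to evaluate directly. For the E. coli DNA case the factorials are astronomically large, so working with log-factorials (math.lgamma) is the practical route. A minimal Python sketch, assuming equal numbers of each monomer type as the text does; the function name and structure are mine, only N and the counts come from the text:

```python
import math

def log_omega_c(N, counts):
    """Natural log of eq. 8-7: ln[N! / (n1! n2! ... ni!)], via log-gamma to avoid overflow."""
    assert sum(counts) == N
    return math.lgamma(N + 1) - sum(math.lgamma(n + 1) for n in counts)

# 100-residue polypeptide, 5 of each of the 20 amino acid types
print(log_omega_c(100, [5] * 20))        # ~268; eq. 8-8 below quotes ln(omega_cr) ~ 265

# E. coli DNA: 4,000,000 nucleotides, 1,000,000 of each of the 4 types
print(log_omega_c(4_000_000, [1_000_000] * 4))   # ~5.5e6, essentially 4e6 * ln 4
```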

For a random polypeptide of 100 amino acids, the configurational entropy, Scr, may be calculated using eq. 8-2c and eq. 8-7 as follows:

Scr = k ln Ωcr

since Ωcr = N! / (n1! n2! ... n20!) = 100! / (5! 5! ... 5!) = 100! / (5!)^20

= 1.28 x 10^115 (8-8)

The calculation of equation 8-8 assumes that an equal number of each type of amino acid, namely 5, are contained in the polypeptide. Since k, or Boltzmann's constant, equals 1.38 x 10^-16 erg/deg, and ln [1.28 x 10^115] = 265,

Scr = 1.38 x 10^-16 x 265 = 3.66 x 10^-14 erg/deg-polypeptide

If only one specific sequence of amino acids could give the proper function, then the configurational entropy for the protein or specified, aperiodic polypeptide would be given by

Scm = k ln Ωcm
= k ln 1
= 0 (8-9)

Determining ΔSc in Going from a Random Polymer to an Informed Polymer

The change in configurational entropy, ΔSc, as one goes from a random polypeptide of 100 amino acids with an equal number of each amino acid type to a polypeptide with a specific message or sequence is:

ΔSc = Scm - Scr

= 0 - 3.66 x 10^-14 erg/deg-polypeptide
= -3.66 x 10^-14 erg/deg-polypeptide (8-10)

The configurational entropy work (-TΔSc) at ambient temperatures is given by

-TΔSc = -(298 K) x (-3.66 x 10^-14 erg/deg-polypeptide)
= 1.1 x 10^-11 erg/polypeptide
= 1.1 x 10^-11 erg/polypeptide x [6.023 x 10^23 molecules/mole] / [10,000 gms/mole] x [1 cal / 4.184 x 10^7 ergs]

= 15.8 cal/gm (8-11)

where the protein mass of 10,000 amu was estimated by assuming an average amino acid weight of 100 amu after the removal of the water molecule. Determination of the configurational entropy work for a protein containing 300 amino acids equally divided among the twenty types gives a similar result of 16.8 cal/gm.
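The chain of numbers in eqs. 8-8 through 8-11 can be reproduced in a few lines. A minimal sketch: the constants, the ln(Ωcr) = 265 value, and the 10,000 gm/mole protein mass are the text's figures; everything else is just bookkeeping.

```python
k   = 1.38e-16     # Boltzmann's constant, erg/deg (from the text)
T   = 298.0        # K
N_A = 6.023e23     # molecules/mole, value used in the text
CAL = 4.184e7      # ergs per calorie

ln_omega_cr = 265.0                 # ln(omega_cr) for the 100-residue polypeptide (eq. 8-8)
S_cr = k * ln_omega_cr              # eq. 8-8: ~3.66e-14 erg/deg per polypeptide
S_cm = 0.0                          # eq. 8-9: one permissible sequence, so k ln 1 = 0

work = -T * (S_cm - S_cr)                    # -T*dS_c, erg per polypeptide (~1.1e-11)
work_per_gram = work * N_A / 10_000 / CAL    # protein mass taken as 10,000 gm/mole
print(f"{work:.2e} erg/polypeptide, {work_per_gram:.1f} cal/gm")
# ~1.1e-11 erg and ~15.7 cal/gm; eq. 8-11 quotes 15.8 after rounding the intermediate value
```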

In like manner the configurational entropy work for a DNA molecule such as for E. coli bacterium may be calculated assuming 4 x 10^6 nucleotides in the chain with 1 x 10^6 each of the four distinctive nucleotides, each distinguished by the type of base attached, and each nucleotide assumed to have an average mass of 339 amu. At 298 K:

-TΔSc = -T (Scm - Scr)

= T (Scr - Scm)

= kT (ln Ωcr - ln Ωcm)

= kT ln [(4 x 10^6)! / (10^6)! (10^6)! (10^6)! (10^6)!] - kT ln 1

= 2.26 x 10^-7 erg/polynucleotide

= 2.39 cal/gm (8-12)

It is interesting to note that, while the work to code the DNA molecule with 4 million nucleotides is much greater than the work required to code a protein of 100 amino acids (2.26 x 10^-7 erg/DNA vs. 1.10 x 10^-11 erg/protein), the work per gram to code such molecules is actually less in DNA. There are two reasons for this perhaps unexpected result: first, the nucleotide is more massive than the amino acid (339 amu vs. 100 amu); and second, the alphabet is more limited, with only four useful nucleotide "letters" as compared to twenty useful amino acid letters. Nevertheless, it is the total work that is important, which means that synthesizing DNA is much more difficult than synthesizing protein.
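The same kind of sketch reproduces the DNA case in eq. 8-12. Here log-factorials are essential because (4 x 10^6)! cannot be evaluated directly; the constants and the 339 amu average nucleotide mass are the text's figures.

```python
import math

k, T = 1.38e-16, 298.0                  # erg/deg, K (from the text)
N_A, CAL = 6.023e23, 4.184e7            # molecules/mole, ergs per calorie

# ln[(4e6)! / ((1e6)!)^4] via log-gamma; dominated by the 4e6 * ln 4 term
ln_omega_cr = math.lgamma(4_000_000 + 1) - 4 * math.lgamma(1_000_000 + 1)

work = k * T * ln_omega_cr              # -T*dS_c = kT(ln omega_cr - ln 1), erg per polynucleotide
mass = 4_000_000 * 339                  # gm/mole for the whole DNA molecule
work_per_gram = work * N_A / mass / CAL
print(f"{work:.2e} erg/polynucleotide, {work_per_gram:.2f} cal/gm")
# ~2.3e-7 erg and ~2.4 cal/gm; eq. 8-12 quotes 2.26e-7 erg and 2.39 cal/gm
```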

It should be emphasized that these estimates of the magnitude of the configurational entropy work required are conservatively small. As a practical matter, our calculations have ignored the configurational entropy work involved in the selection of monomers. Thus, we have assumed that only the proper subset of 20 biologically significant amino acids was available in a prebiotic oceanic soup to form a biofunctional protein. The same is true of DNA. We have assumed that in the soup only the proper subset of 4 nucleotides was present and that these nucleotides do not interact with amino acids or other soup ingredients. As we discussed in Chapter 4, many varieties of amino acids and nucleotides would have been present in a real ocean---varieties which have been ignored in our calculations of configurational entropy work. In addition, the soup would have contained many other kinds of molecules which could have reacted with amino acids and nucleotides. The problem of using only the appropriate optical isomer has also been ignored. A random chemical soup would have contained a 50-50 mixture of D- and L-amino acids, from which a true protein could incorporate only the L-enantiomer. Similarly, DNA uses exclusively the optically active sugar D-deoxyribose. Finally, we have ignored the problem of forming unnatural links, assuming for the calculations that only α-links occurred between amino acids in making polypeptides, and that only correct linking at the 3', 5'-position of sugar occurred in forming polynucleotides. A quantification of these problems of specificity has recently been made by Yockey.21

The dual problem of selecting the proper composition of matter and then coding or rearranging it into the proper sequence is analogous to writing a story using letters drawn from a pot containing many duplicates of each of the 22 Hebrew consonants and 24 Greek and 26 English letters all mixed together. To write in English the message,

HOW DID I GET HERE?

we must first draw from the pot 2 Hs, 2 Is, 3 Es, 2 Ds, and one each of the letters W, O, G, T, and R. Drawing or selecting this specific set of letters would be a most unlikely event itself. The work of selecting just these 14 letters would certainly be far greater than arranging them in the correct sequence. Our calculations only considered the easier step of coding while ignoring the greater problem of selecting the correct set of letters to be coded. We thereby greatly underestimate the actual configurational entropy work to be done.
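To get a rough feel for how improbable the "selecting" step alone is, here is an entirely hypothetical back-of-the-envelope calculation (my own illustration, not the book's): assume the pot holds the 72 letter types (22 Hebrew + 24 Greek + 26 English) in equal abundance, and that 14 letters are drawn at random with replacement. The chance of drawing a multiset that spells the message in some order is then a single multinomial term.

```python
from math import factorial

# Hypothetical illustration only: 72 equally abundant letter types, 14 random draws
letters = "HOWDIDIGETHERE"                              # the 14 letters of the message
counts = {c: letters.count(c) for c in set(letters)}    # H:2, O:1, W:1, D:2, I:2, G:1, E:3, T:1, R:1

n_types = 22 + 24 + 26              # Hebrew + Greek + English letter types in the pot
multiset_ways = factorial(14)
for n in counts.values():
    multiset_ways //= factorial(n)  # number of orders in which this multiset can be drawn

p = multiset_ways / n_types ** 14   # probability of drawing exactly this set of letters
print(f"{p:.1e}")                   # ~1.8e-17: selection alone is already wildly improbable
```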

In Chapter 6 we developed a scale showing degrees of investigator interference in prebiotic simulation experiments. In discussing this scale it was noted that very often in reported experiments the experimenter has actually played a crucial but illegitimate role in the success of the experiment. It becomes clear at this point that one illegitimate role of the investigator is that of providing a portion of the configurational entropy work, i.e., the "selecting" work portion of the total -TΔSc work.

It is sometimes argued that the type of amino acid that is present in a protein is critical only at certain positions---active sites---along the chain, but not at every position. If this is so, it means the same message (i.e., function) can be produced with more than one sequence of amino acids.

This would reduce the coding work by making the number of permissible arrangements Ωcm in eqs. 8-9 and 8-10 for Scm greater than 1. The effect of overlooking this in our calculations, however, would be negligible compared to the effect of overlooking the "selecting" work and only considering the "coding" work, as previously discussed. So we are led to the conclusion that our estimate for ΔSc is very conservatively low.

Calculating the Total Work: Polymerization of Biomacromolecules

It is now possible to estimate the total work required to combine biomonomers into the appropriate polymers essential to living systems. This calculation using eq. 8-5 might be thought of as occurring in two steps. First, amino acids polymerize into a polypeptide, with the chemical and thermal entropy work being accomplished (ΔH - TΔSth). Next, the random polymer is rearranged into a specific sequence, which constitutes doing configurational entropy work (-TΔSc). For example, the total work as expressed by the change in Gibbs free energy to make a specified sequence is

ΔG = ΔH - TΔSth - TΔSc (8-13)

where ΔH - TΔSth may be assumed to be 300 kcal/mole to form a random polypeptide of 101 amino acids (100 links). The work to code this random polypeptide into a useful sequence so that it may function as a protein involves the additional component of -TΔSc "coding" work, which has been estimated previously to be 15.9 cal/gm, or approximately 159 kcal/mole for our protein of 100 links with an estimated mass of 10,000 amu per mole. Thus, the total work (neglecting the "sorting and selecting" work) is approximately

ΔG = (300 + 159) kcal/mole = 459 kcal/mole (8-14)

with the coding work representing 159/459 or 35% of the total work.
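The arithmetic behind eq. 8-14 and the 35% figure is simple enough to verify directly; a sketch using only the two values quoted above.

```python
# Protein case, values from eq. 8-14 (kcal/mole)
random_polymerization = 300    # dH - T*dS_th for the random 100-link polypeptide
coding = 159                   # -T*dS_c "coding" work
total = random_polymerization + coding
print(total, coding / total)   # 459 kcal/mole and ~0.35, i.e. ~35% of the total
```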

In a similar way, the polymerization of 4 x 10^6 nucleotides into a random polynucleotide would require approximately 27 x 10^6 kcal/mole. The coding of this random polynucleotide into the specified, aperiodic sequence of a DNA molecule would require an additional 3.2 x 10^6 kcal/mole of work. Thus, the fraction of the total work that is required to code the polymerized DNA is seen to be 8.5%, again neglecting the "sorting and selecting" work.

The Impossibility of Protein Formation under Equilibrium Conditions

It was noted in Chapter 7 that because macromolecule formation (such as amino acids polymerizing to form protein) goes uphill energetically, work must be done on the system via energy flow through the system. We can readily see the difficulty in getting polymerization reactions to occur under equilibrium conditions, i.e., in the absence of such an energy flow.

Under equilibrium conditions the concentration of protein one would obtain from a solution of 1 M concentration in each amino acid is given by:

K = [protein] x [H2O] / ([glycine] [alanine] ...) (8-15)

where K is the equilibrium constant and is calculated by

K = exp [-ΔG / RT] (8-16)

An equivalent form is

ΔG = -RT ln K (8-17)

We noted earlier that ΔG = 459 kcal/mole for our protein of 101 amino acids. The gas constant R = 1.9872 cal/deg-mole and T is assumed to be 298 K. Substituting these values into eqs. 8-15 and 8-16 gives

protein concentration = 10^-338 M (8-18)

This trivial yield emphasizes the futility of protein formation under equilibrium conditions. In the next chapter we will consider various theoretical models attempting to show how energy flow through the system can be useful in doing the work quantified in this chapter for the polymerization of DNA and protein. Finally, we will examine experimental efforts to accomplish biomacromolecule synthesis.
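A minimal sketch of eqs. 8-15 through 8-18: the ΔG, R, and T values are from the text; taking 1 M amino acids as stated and a water concentration of about 55.5 M (pure water) is my assumption about how the water term was handled.

```python
import math

R, T = 1.9872, 298.0            # cal/(deg-mole), K (from the text)
dG = 459_000                    # cal/mole, total work from eq. 8-14

ln_K = -dG / (R * T)            # eq. 8-16: K = exp(-dG/RT)
log10_K = ln_K / math.log(10)   # ~ -337

# eq. 8-15 rearranged with 1 M amino acids; assuming [H2O] ~ 55.5 M
log10_protein = log10_K - math.log10(55.5)
print(f"[protein] ~ 10^{log10_protein:.0f} M")   # ~10^-338 M, matching eq. 8-18
```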
 
Summary of Thermodynamics Discussion
Throughout Chapters 7-9 we have analyzed the problems of complexity and the origin of life from a thermodynamic point of view. Our reason for doing this is the common notion in the scientific literature today on the origin of life that an open system with energy and mass flow is a priori a sufficient explanation for the complexity of life. We have examined the validity of such an open and constrained system. We found it to be a reasonable explanation for doing the chemical and thermal entropy work, but clearly inadequate to account for the configurational entropy work of coding (not to mention the sorting and selecting work). We have noted the need for some sort of coupling mechanism. Without it, there is no way to convert the negative entropy associated with energy flow into negative entropy associated with configurational entropy and the corresponding information. Is it reasonable to believe such a "hidden" coupling mechanism will be found in the future that can play this crucial role of a template, metabolic motor, etc., directing the flow of energy in such a way as to create new information?

Specifying How Work Is To Be Done

Snipped Copy and Paste. -Intense.
 
I understand it; you, however, don't understand photosynthesis.

OMFG! You owe me a new keyboard.
What do you feel it is that I don't understand about it?
In your own words, I don't feel like sifting thru the wall of text that you no doubt don't understand.
 
Todd, why do you have a problem acknowledging the 2nd law all around us?

Second Law of Thermodynamics - Does this basic law of nature prevent Evolution?




[Image: Evolutionary tree - scene from the ORIGINS motion picture series. Photo copyrighted, Films for Christ.]
Evolution versus a basic law of nature

Scores of distinguished scientists have carefully examined the most basic laws of nature to see if Evolution is physically possible - given enough time and opportunity. The conclusion of many is that Evolution is simply not feasible. One major problem is the 2nd Law of Thermodynamics.

law of science: basic, unchanging principle of nature; a scientifically observed phenomenon which has been subjected to very extensive measurements and experimentation and has repeatedly proved to be invariable throughout the known universe (e.g., the law of gravity, the laws of motion).

thermodynamics: the study of heat power; a branch of physics which studies the efficiency of energy transfer and exchange.1

[Image: Decaying buildings - massive structures may appear to be capable of lasting almost forever, but they will not. The need for ongoing repairs stems, in part, from the 2nd Law of Thermodynamics. (Scene from the ORIGINS motion picture series. Photo copyrighted, Films for Christ.)]

The 2nd Law of Thermodynamics describes basic principles familiar in everyday life. It is, in part, a universal law of decay: the ultimate reason why everything falls apart and disintegrates over time. Material things are not eternal. Everything appears to change eventually, and chaos increases. Nothing stays as fresh as the day one buys it; clothing becomes faded, threadbare, and ultimately returns to dust.2 Everything ages and wears out. Even death is a manifestation of this law. The effects of the 2nd Law are all around us, touching everything in the universe.

........

"Living organisms, however, differ from inanimate matter by the degree of complexity of their systems and by the possession of a genetic program&#8230; The genetic instructions packaged in an embryo direct the formation of an adult, whether it be a tree, a fish, or a human. The process is goal-directed, but from the instructions in the genetic program, not from the outside. Nothing like it exists in the inanimate world."

SECOND LAW OF THERMODYNAMICS - Does this basic law of nature prevent Evolution? - ChristianAnswers.Net
 
You wanted the argument from the creationist point of view concerning the 2nd law and how it affects evolution; you have it. The points you take exception to we can discuss. I am not gonna continue exchanging jabs with you; let's talk about the issues.
 
