American Journal of Physics
Moderator(s): Randall D. Isaac

3/27/2012 at 12:19:21 PM GMT
Posts: 24

@Dave Roemer - The Bible happens to be the only scripture that I know of that asserts that "God is Love."  Not that God is loving, but that God actually is, in His essence, love.  Based upon my layperson's knowledge, Christianity is unique among all religions in this assertion.  Ernest Valea explains it well on his comparative religions website:

"Likewise, when the Apostle John proclaims that "God is love" (1 John 4,8) this should not be interpreted as an expression of the impersonal primordial energy, but as form of expressing the supreme unity of the tri-personal communion. It doesn't just mean that God has love, as a quality, but that he is love, which is the way of being in the Trinity, each person existing not for himself, but for the others, in a perfect communion of love."

My conception of the Trinity begins with the notion that "God is Love."  God made man in His own image.  It makes sense to me that He would want man to experience that love.  Love desires the beloved to love in return.  I lean toward C. S. Lewis's conception in this regard.  I think God gave man real choice, the kind that can change its mind.  Without choice it wouldn't be possible to love God back.  So God put the tree of the knowledge of good and evil in the garden so humans could have a choice.  God knows all the possible outcomes, but He leaves the choice up to us.  Quite a bit like indeterminacy in physics.

3/27/2012 at 1:55:03 PM GMT
Posts: 60

A biological system is not a system of non-interacting particles. I don’t understand what a biological system has to do with the second law of thermodynamics except in the way biologists claim it is connected to the 2nd law.

According to the 2nd law, a gas will fill up the entire container uniformly. The reason is that this is the most probable configuration of molecules. To calculate the probabilities, physicists label each of the identical molecules No. 1, No. 2, No. 3, etc. In other words, a deck of playing cards is a model for a gas because playing cards come automatically labeled and are identical, non-interacting, and isolated from any other system.
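The counting argument in the post can be sketched numerically (this is my own illustration, not from the thread): label N molecules and count the ways of splitting them between the two halves of a container. The even split has overwhelmingly the most microstates, which is why the uniform configuration is the most probable one.

```python
from math import comb

# Count microstates for N labeled molecules split between the two
# halves of a container: C(N, k) ways to put exactly k in the left half.
N = 100
ways = [comb(N, k) for k in range(N + 1)]
most_probable_k = ways.index(max(ways))   # peaks at the even split, k = 50

# Fraction of all 2**N microstates with 45-55 molecules in the left
# half: the overwhelming majority sit near the uniform split.
near_uniform = sum(ways[45:56]) / 2**N
```

For N = 100 the near-uniform fraction is already around 0.7, and it approaches 1 as N grows toward molecular numbers, which is the statistical content of "the gas fills the container uniformly."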

In their effort to understand evolution, biologists use an English sonnet as a model for the primary structure of a protein. Just as every letter in the sonnet has to be in the right place, every one of the 20 amino acids has to be in the right place. Biologists calculate how long it would take a computer to generate a sonnet with the random selection of words and letters and compare this number with 3 billion years.  

These calculations and the fact that the primary structure of a protein doesn’t even begin to describe the complexity of life mean that natural selection acting on innovation does not explain how evolution occurred over a period of only 3 billion years. Innovation includes, I suppose, random mutations, genetic engineering, and facilitated variation. But not enough is known about these sources of innovation to explain evolution. Natural selection only explains adaptation, not common descent. In other words, natural selection explains why giraffes have long necks but does not explain how giraffes evolved from bacteria.

The only theory that explains common descent is intelligent design, but there is no evidence for this theory. This is why it is not in peer-reviewed journals. Biologists don’t care about, or shouldn’t care about, the irrationality of intelligent design. Most advocates of intelligent design are Protestants, not Catholics.

If you have a gas in a closed container and you add heat to the gas, the temperature of the gas will increase. On the microscopic level, the average kinetic energy of the molecules will increase and our knowledge of the energies of the individual molecules will decrease. On the macroscopic level, entropy increases. If you compress a gas by performing work on it and extracting heat, entropy will decrease. The American Journal of Physics articles imply that adding heat to a system can decrease its entropy.
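The two cases described here can be checked with the standard ideal-gas entropy-change formula, dS = n·Cv·ln(T2/T1) + n·R·ln(V2/V1). The numbers below are my own illustrative choices (1 mol of monatomic gas), not figures from the thread:

```python
from math import log

R = 8.314          # gas constant, J/(mol K)
Cv = 1.5 * R       # molar heat capacity of a monatomic ideal gas
n = 1.0            # moles

def delta_S(T1, T2, V1, V2):
    """Ideal-gas entropy change between two equilibrium states."""
    return n * Cv * log(T2 / T1) + n * R * log(V2 / V1)

# Case 1: add heat at constant volume -> temperature and entropy rise.
dS_heating = delta_S(300.0, 400.0, 1.0, 1.0)

# Case 2: compress isothermally while extracting heat -> the entropy
# of the gas falls, just as the post says.
dS_compression = delta_S(300.0, 300.0, 1.0, 0.5)
```

The heating case gives a positive entropy change and the compression case a negative one, matching the macroscopic description in the post.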

I don’t think the American Scientific Affiliation is a Christian group. My guess is that your members believe in God in the way that John Dewey believed in God. One reason I think this is your belief in Darwinism, which I consider pseudo-science. I’m unaware of any "differences in scientific matters." You learn science by reading textbooks and peer-reviewed articles, and you prove you are right about science by quoting peer-reviewed articles. This is what I have done here.

David Roemer

3/27/2012 at 5:29:14 PM GMT
Posts: 24

Hi David,

I've decided to do a little investigating so I have a question.  I know something about atmospheric conditions, currents, weather, etc.  Activity in the atmosphere starts with energy from the sun that enters from outside the influence of the gravitational field of the earth.  Uneven heating of the surface of the earth leads to air masses that differ in temperature, moisture, density and pressure.  Some energy is also lost from the earth's atmosphere via light emissions, and reflected light.  Does uneven heating of the atmosphere lead to increased or decreased entropy?  Does the heat move the atmosphere toward order or away from order?


Last edited Tuesday, March 27, 2012
3/27/2012 at 6:13:30 PM GMT
Posts: 139


  The second law of thermodynamics is not limited to a gas of non-interacting particles. It is a much broader, all-encompassing law that applies to all physical systems. One common statement of it is that, at constant temperature and pressure, the Gibbs free energy is minimized in every event. That applies everywhere. In the general case, adding heat to a system may or may not decrease the entropy. For a non-interacting system of molecules the entropy would increase, but in the general case of more complex systems it may lead to a lower entropy. The second law doesn't preclude that.
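The point that a system's entropy can fall without violating the second law has a standard textbook illustration (my own choice of example, not one cited in the thread): water freezing below 0 °C. The system becomes more ordered, but the heat released raises the entropy of the surroundings by more. The figures below are rough textbook values.

```python
# Water freezing at -10 C: rough textbook numbers, for illustration.
T = 263.15                 # K
dH_fus = -6007.0           # J/mol released when water freezes
dS_sys = -22.0             # J/(mol K): ice is more ordered than water

# The released heat raises the entropy of the surroundings:
dS_surr = -dH_fus / T      # ~ +22.8 J/(mol K)

dS_total = dS_sys + dS_surr    # > 0: the second law is satisfied
dG = dH_fus - T * dS_sys       # < 0: spontaneous at constant T and P
```

The system's entropy drops, yet the total entropy change is positive and the Gibbs free energy change is negative, so the process is spontaneous.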

As for probability, I think it is important to note that biologists do not calculate the probability of a protein in the manner you suggest. Proteins do not assemble that way. As in every complex multi-step chemical reaction, the detailed steps in formation of any molecule must be considered when assessing probabilities. We simply do not know all those steps adequately to be able to calculate any probability at all. We do know, however, that they are not assembled in one step of amino acids condensing into a chain, a la a deck of cards. Hence, the probability you suggest is not relevant.
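The gap between a single-step assembly probability and a multi-step process can be made concrete with a standard toy model, Dawkins-style cumulative selection (my illustration; it is not something the post cites, and it is a caricature of selection, not a model of protein formation). One-shot random typing of a short phrase is expected to take ~10^25 attempts; mutation plus selection finds it in a few hundred generations.

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

TARGET = "TO BE OR NOT TO BE"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

# One-shot random typing: expected attempts to hit the whole phrase at
# once, the kind of calculation quoted from Kirschner and Gerhart.
one_shot_expected = len(ALPHABET) ** len(TARGET)   # ~ 5.8e25

def score(s):
    """Number of positions that already match the target."""
    return sum(a == b for a, b in zip(s, TARGET))

# Cumulative selection: each generation, make 100 mutated copies and
# keep the best (the parent is retained, so the score never drops).
parent = "".join(random.choice(ALPHABET) for _ in TARGET)
generations = 0
while parent != TARGET and generations < 10_000:
    children = [
        "".join(random.choice(ALPHABET) if random.random() < 0.05 else c
                for c in parent)
        for _ in range(100)
    ]
    parent = max(children + [parent], key=score)
    generations += 1
```

The multi-step search typically finishes in well under a thousand generations, which is why the one-shot probability tells us nothing about a stepwise process.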

You are right that in science we can do the study and analysis and usually determine what is the correct answer. Many scientists have considered this issue in detail and it is rather clear that evolution is not at all precluded by the second law of thermodynamics. Again, your extrapolation to the assessment of spiritual status is unwarranted and out of order, as well as inaccurate.


3/28/2012 at 1:19:38 AM GMT
Posts: 60

I said the second law does not apply to biological systems. I didn’t say it only applies to systems of non-interacting particles. This is the equation (see equation 3) in "Entropy and evolution” (AJPIAS_76_11_1031) I consider absurd:

"Entropy of biological system = Boltzmann’s constant × logarithm of the thermodynamic probability"

This is like saying,

"Average kinetic energy of the molecules of a biological system = (3/2) × Boltzmann’s constant × temperature of the biological system"

A biological system doesn’t have a temperature. The only things that have temperatures are gases, liquids, and solids. Temperature is a macroscopic concept that we understand because of our sense of touch and because we can measure it with a thermometer. We can also measure the average kinetic energy of the molecules in a system and find out that the two are related by Boltzmann’s constant.

Let me try to explain it another way. The chance of shuffling a deck of cards and getting them back in the original order is 1/52!. Does this mean the entropy of the deck of cards is Boltzmann’s constant times the logarithm of 52!?
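As a numeric aside (my own, not from the thread): taking S = k log W at face value for a deck, with W = 52!, gives an entropy of about 2 × 10⁻²¹ J/K, vanishingly small next to the ordinary thermal entropy of the plastic the cards are made of:

```python
from math import lgamma

k_B = 1.380649e-23           # Boltzmann's constant, J/K

# ln(52!) computed via the log-gamma function: lgamma(53) = ln(52!)
ln_52_factorial = lgamma(53)           # ~ 156.4
S_cards = k_B * ln_52_factorial        # ~ 2.2e-21 J/K

# Rough thermal entropy scale for ~100 g of plastic, an assumed
# ballpark figure on the order of 100 J/K.
S_thermal = 100.0
ratio = S_cards / S_thermal            # ~ 1e-23: utterly negligible
```

Whatever one makes of applying the formula to card orderings, the resulting number is some twenty-three orders of magnitude below the deck's thermal entropy.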

The other unbelievably wacky thing about the article is the implication that adding heat to a system can decrease its entropy.

Biologists do probability calculations on sonnets to explain why natural selection can’t explain the evolution of the primary structure of a protein:

"By comparison, if we question how long it would take a high-speed computer to write randomly a specific Shakespearean sonnet, we are asking that all the letters of the words of the sonnet will come up simultaneously in the correct order. It is an impossible task, even if all the computers in the world today had been working from the time of the big bang to the present. Even to compose the phrase, 'To be or not to be,' letter by letter, would take a typical computer millions of years." (Marc W. Kirschner and John C. Gerhart, The Plausibility of Life: Resolving Darwin's Dilemma, page 32)

Kirschner and Gerhart reduced the "millions of years" to a "short time" by taking into consideration natural selection and facilitated variation. However, they never told us how long it would take a computer to generate a sonnet. The reason is that nobody cares. Only laymen think that natural selection explains the complexity of life.

I've attached the first article. The second article is just a note about the first.

David Roemer


3/28/2012 at 9:56:07 AM GMT
Posts: 60

I have a more fundamental problem with understanding the second law of thermodynamics.

In the free expansion of a gas, entropy increases. This is consistent with the second law of thermodynamics.

When you compress a gas and extract heat from it, the entropy of the gas decreases. This does not violate the second law because such a gas is not an isolated system.  

However, there seems to be another explanation for why the entropy of a compressed gas decreases. This explanation is that the entropy of whatever caused the compression increased by more than the entropy of the gas decreased. I can’t understand this explanation. It seems to be based on the idea that the entropy of the universe always increases.
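The bookkeeping in question can be sketched for a concrete case (my own illustrative numbers): isothermally compress 1 mol of ideal gas to half its volume against a constant external pressure equal to the final pressure, an irreversible compression. The gas's entropy falls, but the heat dumped into the surroundings raises theirs by more.

```python
from math import log

R, T, n = 8.314, 300.0, 1.0
V1, V2 = 2.0, 1.0                      # volume halved (arbitrary units)

dS_gas = n * R * log(V2 / V1)          # negative: the gas is more ordered

# Work done on the gas at constant external pressure P2 = nRT/V2:
P2 = n * R * T / V2
w_on_gas = P2 * (V1 - V2)              # = nRT here, since V1 = 2*V2
# Isothermal ideal gas: internal energy unchanged, so all of that work
# leaves as heat into the surroundings at temperature T.
dS_surr = w_on_gas / T                 # = nR, positive

dS_universe = dS_gas + dS_surr         # > 0: the second law holds
```

The gas loses nR·ln 2 of entropy while the surroundings gain nR, so the total entropy of gas plus surroundings still increases; "the entropy of the universe always increases" is just this sum taken over everything involved.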

Suppose the universe consists of a large number of molecules attracted to each other by the force of gravity. The molecules will come together and form a star, no? Hence, the entropy of the universe has decreased.

David Roemer

3/29/2012 at 6:59:22 AM GMT
Posts: 24

Hi Dave,

I'm not as good with stars as I am with the atmosphere (mostly the math).  Again I tread into an area where I am distinctly under-qualified, but hopefully someone with greater expertise can correct any misconceptions I have.

I start with the assumption that entropy in the universe always increases (2nd Law).  To wrap my head around how entropy works in the atmosphere, I first think of it as a closed system with one boundary at the earth's surface and the other boundary with space (the place where earth's gravity loses its influence over matter).  With no tampering from outside, the atmosphere would settle into a state in which temperature, density, pressure and moisture were horizontally uniform, varying only with altitude--a state of maximum entropy.  No life could be sustained in such an atmosphere.

But the entropy of the earth's atmosphere is sustained at a level significantly below maximum.  I think about how external energy sources like solar radiation could make this happen.  A majority of the solar radiation reaches the earth's surface, where it is reflected or absorbed.  Meanwhile, energy is being lost at the boundary with space.  Each boundary becomes a transfer zone for energy into and out of the atmospheric system.  Entropy in the universe increases at both the earth's surface and at the boundary with space, but the increase is significantly less at the earth's surface, resulting in a net decrease of entropy inside the atmosphere.  Also, the heating of the earth's surface is uneven--more energy is transferred to heat in some places than others, generally more at the equator than at the poles--reducing the progress of entropy a bit more.  This sets up systematic wind patterns, the water cycle, and a host of other activity, including living organisms.  All of these activities occur in the atmosphere because the system must move toward maximum entropy (2nd Law).  It turns out that the increase in entropy within the system exactly balances the net decrease due to the differential of entropy flux at the boundaries, plus earth's rotation, uneven heating of the surface, etc.  The heat entering the system sustains a level of overall order even though disorder in the universe continues to increase as the sun marches inexorably toward its eventual death.
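The entropy flux across the two boundaries can be put in rough numbers (my own ballpark figures, in the spirit of entropy-budget papers like Peixoto et al. 1991, and ignoring the 4/3 radiation-entropy factor): the earth absorbs solar power carried at a high radiation temperature and re-emits the same power at a much lower one, so it exports far more entropy than it imports.

```python
# Rough steady-state entropy budget of the earth system.
Q = 1.22e17        # W, approximate solar power absorbed by the earth
T_sun = 5778.0     # K, effective temperature of incoming solar radiation
T_earth = 255.0    # K, effective emission temperature of the earth

S_in = Q / T_sun                 # entropy flux carried in by sunlight
S_out = Q / T_earth              # entropy flux radiated to space
net_export = S_out - S_in        # > 0: room for local order to persist
```

The same power in and out, but at very different temperatures, yields an entropy export roughly twenty times the import; that surplus is what pays for the low-entropy structure (winds, the water cycle, life) maintained inside the system.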

The particular rates of entropy flux in the atmosphere are impossible to measure precisely; they can only be estimated.  I can't imagine how hard such calculations must be for living organisms.

Peixoto et al. (1991), "Entropy Budget of the Atmosphere," has been my go-to article on the subject over the years (I might need to update and get a more current one).  I can understand the English and most of the math in this one.  It appears to me that the math in the AJP article may bear a resemblance.  Perhaps someone with more expertise can help me out here.

EDIT: Wanted to add a note about the "quality" of energy entering and leaving the atmosphere.  The transfer of solar energy is much more orderly (has less entropy) than the energy lost into space, and more orderly than the energy contained in the atmosphere.  Perhaps that helps clarify the point about the net loss of entropy within the atmosphere even while entropy increases on the scale of the universe.  The transfer of energy from the sun to the atmosphere via heating of the earth's surface increases entropy, but solar energy starts out quite focused and intense.

Last edited Thursday, March 29, 2012
3/30/2012 at 2:43:39 PM GMT
Posts: 60

Hi Sutherland,

 Does this mean you agree that Eq. 3 in the first AJP article is absurd?

David Roemer

4/8/2012 at 12:34:35 AM GMT
Posts: 24
D. Roemer said:

Hi Sutherland,

 Does this mean you agree that Eq. 3 in the first AJP article is absurd?

Not ready to make that statement. I want to see if the math (including equation 3) and the math I already understand in the meteorological article are related.  They seem similar to me and they both describe energy flux in open systems.  Unfortunately that will take me a little time so I might not be able to work on it immediately.  I would be happy if you or anyone else here were to take a look at it.

4/12/2012 at 5:30:41 AM GMT
Posts: 60
The editor of the AJP and the president-elect of the AAPT suggested that I submit an article to the AJP explaining my approach to the issue. I didn’t feel qualified to write such an article, so I contacted a retired professor at New York University, where I got my Ph.D. in quantum electrodynamics. This is what I said in the email to him (Robert Richardson):

"I am pretty sure that the entropy equations in the articles are nonsense, but I don’t know enough about statistical mechanics to explain why. They use the equation S = k log W, but there is no justification for the use of Boltzmann’s constant for biological systems. Is there?
"If you are interested, I can email you my correspondence with the editor and publisher of the articles, and the pdf files of the two articles.”

Professor Richardson wrote back, and his answer confirmed what I told him about the equations:

"Hi David,
Nice to hear from you. The k in S = k log W is just a question of units and has no physical significance. log W is dimensionless and S has the dimensions of energy divided by temperature. k makes them match. In the "right" set of units, k = 1. Please send me your work as I am always interested."

I sent him my work and the articles. This is the uncharitable and absurd response I got:
"Hi David,
I have spent some time with your work but am not able to make an informed comment. I do sometimes testify as an expert in court. But my rate is $400/hr portal to portal.  I doubt that you can afford a day of my time.”

I see the same kind of evasiveness in you and Randy Isaac. Emory Bunn, the author of the second paper, also said that he didn't have time to discuss the matter. It seems pretty simple to me. It makes as much sense to measure the temperature and entropy of a biological system with Boltzmann's constant as to measure the entropy and temperature of a deck of playing cards. The plastic the cards are made of has a temperature and an entropy. The idea that an unshuffled deck of cards has a smaller entropy than a shuffled deck of cards is nonsense.

David Roemer