Entropy continued
As I explained before, entropy may appear to be a simple concept, but it is easily misunderstood. A good example is the claim by Jerry Bauer that he is using Feynman's equation to calculate the entropy of mutations.
Let's see what Jerry claims and then compare it with what he should have said. I pointed out to Jerry that Feynman offered a hint as to how to calculate entropy correctly:
Feynman wrote:
We measure “disorder” by the number of ways that the insides can be arranged, so that from the outside it looks the same.
Let’s interpret this more carefully:
"From the outside": the macrostate
"Ways that the insides can be arranged": the microstates
The following link makes the point I have been stating, namely that
Each specific sequence of heads or tails is just as likely to occur as any other and what we are really interested in is not the probability of a particular series of events but the probability of finding a sequence that has a particular “macroscopic” property. Macroscopic, in this context, means that we do not care what the order of the heads and tails is, but simply how many of the coins are heads or tails. This sort of question involves multiplicity rather than probability.
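To make that distinction concrete, here is a small sketch of my own (not taken from the linked page) contrasting the probability of one specific sequence of coin tosses with the multiplicity of a macrostate; the choice of four coins is only for illustration:

```python
from math import comb

N = 4  # number of coins tossed

# Probability of any one specific ordered sequence (e.g. HHHH or HTHT)
p_sequence = 1 / 2**N

# Multiplicity and probability of each macrostate "k heads out of N"
for k in range(N + 1):
    multiplicity = comb(N, k)            # number of microstates with k heads
    p_macrostate = multiplicity / 2**N   # probability of observing that macrostate
    print(f"{k} heads: multiplicity = {multiplicity}, P = {p_macrostate:.4f}")

print(f"any single specific sequence: P = {p_sequence:.4f}")
```

Every specific sequence has probability 1/16, but the macrostates differ in multiplicity, and it is the multiplicity that matters for entropy.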
Now let’s see where Jerry goes wrong:
Jerry wrote:
You didn’t go on to show anything regarding the subject. You don’t understand the subject as you have shown three (nah..this will be four) times now. Simple common sense should tell you that Feynman was addressing logical entropy and you are embracing configurational entropy in the style of Adami’s work.
Feynman was addressing all possible microstates and he clearly states this in the link I sent you to:
“So we now have to talk about what we mean by disorder and what we mean by order. … Suppose we divide the space into little volume elements. If we have black and white molecules, how many ways could we distribute them among the volume elements so that white is on one side and black is on the other? On the other hand, how many ways could we distribute them with no restriction on which goes where? Clearly, there are many more ways to arrange them in the latter case. We measure “disorder” by the number of ways that the insides can be arranged, so that from the outside it looks the same. The logarithm of that number of ways is the entropy. The number of ways in the separated case is less, so the entropy is less, or the “disorder” is less.”
http://www.panspermia.org/seconlaw.htm
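Feynman's black-and-white molecule example can be put into numbers with a toy lattice model. The following sketch is my own illustration; the numbers of volume elements and molecules are arbitrary assumptions, not values taken from Feynman:

```python
from math import comb, log

cells_per_side = 10   # volume elements on each side of the box (assumed)
n_black = 5           # black molecules
n_white = 5           # white molecules

# Separated case: black molecules restricted to the left half, white to the right half
w_separated = comb(cells_per_side, n_black) * comb(cells_per_side, n_white)

# Unrestricted case: either colour may occupy any of the 2 * cells_per_side cells
total_cells = 2 * cells_per_side
w_mixed = comb(total_cells, n_black) * comb(total_cells - n_black, n_white)

# Feynman: the entropy is the logarithm of the number of ways
print("S_separated =", log(w_separated))
print("S_mixed     =", log(w_mixed))
# Fewer arrangements look "separated" from the outside, so that entropy is lower.
```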
We then extended this to calculating the odds of particular microstates.
Jerry was correct that Feynman insists on counting all the microstates, but rather than counting microstates, Jerry makes the error of computing the probability of throwing four heads. That probability is the same as the probability of any other specific sequence, such as four tails or one particular ordering of two heads and two tails, and is thus not very interesting. Beyond being uninteresting, it is also an incorrect interpretation of Feynman's entropy. In other words, Jerry did not apply the formula as Feynman describes it. Garbage in, garbage out.
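The difference is easy to see in another small sketch of my own, again for four coins: the probability Jerry computes describes a single microstate, whereas Feynman's entropy is the logarithm of the number of microstates compatible with a macrostate:

```python
from math import comb, log2

N = 4

# What Jerry computed: the probability of one specific outcome, e.g. four heads.
# Every specific sequence has this same probability, so by itself it says
# nothing about order versus disorder.
p_four_heads = 1 / 2**N  # 0.0625

# What Feynman's prescription asks for: count the microstates per macrostate
# and take the logarithm of that count.
for k in range(N + 1):
    w = comb(N, k)   # "ways that the insides can be arranged" for k heads
    s = log2(w)      # entropy of the macrostate, here in bits
    print(f"{k} heads: W = {w}, S = log2(W) = {s:.3f} bits")
```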
Jerry wrote:
But Adami does not calculate all possible number of ways. He is using configurational entropy as is clearly shown when you correctly graph that 4 coins are tossed and they all come up tails the microstate is 1. It IS 1 were I considering configurational but I was not.
Since I was using Feynman's definition, and the table contains both macrostates and microstates, Jerry's objection is irrelevant. I was using Feynman's formula with only one noticeable difference, namely that I used it correctly.
Remember that Jerry was claiming to use Feynman's entropy, and yet now he shifts to:
Jerry wrote:
In logical entropy we calculate the statistics that our possible microstates will emerge.
If Jerry wants to abandon his use of Feynman's formula, that's fine with me. It seems that Jerry may have become confused by Bruce Klyce's treatment of entropy. I hope these corrections will help Jerry understand where he went wrong.
Was it not Jerry who also stated
Chronos wrote:
But it also made the point I was wanting to make: configurational entropy and logical entropy are the same banana.
Logical and configurational entropy are the same banana, Jerry? Could you please make up your mind?
The lecture notes not only show how to calculate Feynman's entropy correctly but also how to relate probability and configuration, which in turn shows where Jerry went wrong in calculating logical entropy. I leave that as an exercise for the interested reader. (Hint: \(p_i\) should be calculated from observations.)
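For the interested reader, here is one possible sketch of that exercise, using an invented sequence of observations purely for illustration: estimate each \(p_i\) from the observed frequencies and then evaluate \(H = -\sum_i p_i \log_2 p_i\).

```python
from collections import Counter
from math import log2

# Hypothetical observed sequence of coin tosses; the point of the hint is that
# p_i must come from observations like these, not from an assumed distribution.
observed = "HTHHTTHTHHHTTHTH"

counts = Counter(observed)
n = len(observed)

# Estimate p_i from the observed relative frequencies
probabilities = {symbol: c / n for symbol, c in counts.items()}

# Logical (Shannon) entropy per toss, in bits
H = -sum(p * log2(p) for p in probabilities.values() if p > 0)
print(f"estimated probabilities: {probabilities}")
print(f"logical entropy H = {H:.3f} bits per toss")
```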
In another thread I am exploring the reluctance of ID proponents to self-correct. In that context I believe the following comment made by Jerry may be helpful:
That first organism was designed. Any other theory is simply religion. Not science.
Of course, the issue is not about design in nature but about the nature of the designer. In other words, if we were to find design in nature, and if such evidence of teleology could be explained by natural causes, then intelligent design would be powerless, since it is based on an eliminative argument. In fact, I argue that intrinsic teleology (function) is an inevitable outcome of evolutionary processes. Ayala gives a very good overview of the issues:
Unbounded design or contingent teleology occurs when the end-state is not specifically predetermined, but rather is the result of selection of one from among several available alternatives. The adaptations of organisms are designed, or teleological, in this indeterminate sense.
The following website provides a useful overview of “teleological notions in biology”.