Evolution of complexity, information and entropy
Since we have seen some poorly argued claims about entropy and its relevance to evolution, I will explore the concept of entropy as it applies to genome evolution, and show that simple processes like variation and selection are sufficient to explain the evolution of complexity, or information/entropy, in the genome.
While various ID authors (here and elsewhere) have argued that such natural processes cannot explain the evolution of information in the genome, the actual evidence contradicts any such suggestion.
In the past I have argued with various people on the topic of entropy. Jerry Don Bauer, a.k.a. Chronos, has exhibited some interesting confusions about the concept of entropy and believes that, using the laws of entropy, he has shown that macro-evolution could not have happened.
First, some background information.
Jerry defines entropy and shows that entropy is always positive (no surprise here, since entropy is the log of a number larger than or equal to 1). Based on the fact that entropy is positive, he concludes that the tendency is toward disorder, and thus that complex macro-evolution has been disproven:
S = log2W, S = log2(100,000,000), S = 26.5754247590989, therefore S is positive showing a tendency of disorder. Complex macroevolution would have violated one of the most basic and well proven laws of science. And since we know that nothing violates a law of science as a tendency, we can most assuredly conclude that complex macroevolution never occurred.
Jerry can be seen backtracking in later responses:
I certainly do not mean to imply that this is my work: "if W, the number of states by some measure, is greater than 1 then S will be positive by your formula. Thus any number of states will be 'showing a tendency of disorder.'" This is not my work and was done much earlier by such greats as Boltzmann and Feynman et al.
Further backing up and further obfuscating:
I did state that if S is positive, entropy is increased. And this is not a tendency in this case. It’s a fact of this specific example. I would ask you to examine your logic. If entropy increases then disorder has occurred. If S is positive then entropy has increased because ‘S’ IS the entropy we are considering. If you are going to continue in this vein of logic, then I will have to ask you to show that the tenets of thermodynamics are just wrong in that everyone has it backward. Rising entropy denotes order and decreasing entropy denotes disorder.
Another whopper:
P1: With every generation in Homo sapiens, entropy increases in the genome.
P2: Complex macroevolution requires that the genome have a lower entropy over time through the generations.
Therefore, complex macroevolution did not occur
Gedanken quickly exposes the fallacies in Chronos’s argument:
By the way, Chronos has not demonstrated either of his premises, P1 or P2.
He has not demonstrated that the entropy must be increasing, simply because his argument confuses the positive value of entropy with a delta or change of entropy in a positive direction. Even if there were an argument that demonstrated this was a positive delta, Chronos has decided not to give such an argument and relies on the value being positive – an irrelevant issue.
Then Chronos has not demonstrated that change over time requires a decrease in entropy. (Or any particular change in entropy – for example, changes occur and they are different, but they have the same number of informational or microstates, and thus S has not changed.)
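To make Gedanken’s point concrete, here is a minimal sketch in Python. The state counts are invented purely for illustration; the point is the difference between a positive entropy value and a positive change in entropy:

```python
from math import log2

# Entropy in Jerry's sense: S = log2(W), with W the number of states.
# The state counts below are hypothetical, chosen only for illustration.
W_before = 100_000_000   # states before selection acts
W_after = 1_000_000      # states after selection prunes variants

S_before = log2(W_before)     # about 26.58: positive, as for any W > 1
S_after = log2(W_after)       # about 19.93: still positive
delta_S = S_after - S_before  # about -6.64: the *change* is negative

print(f"S_before = {S_before:.2f} (positive)")
print(f"S_after  = {S_after:.2f} (positive)")
print(f"delta_S  = {delta_S:.2f} (entropy decreased)")
```

Both values are positive, yet the change is negative; the sign of S by itself says nothing about the direction in which entropy is moving.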
Can anyone decipher this one?
Begging your pardon, but it’s not me saying that when entropy is positive it “tends” toward disorder. When entropy is positive there is no longer a tendency involved. It has already happened. The reaction is over and a past event. Therefore the term tendency no longer applies. And anytime entropy is positive the system has disordered.
Gedanken explains what is wrong with Chronos’s argument
So what is wrong with Jerry’s claims, other than the confusion of tendency and value?
In fact, some excellent papers by Schneider and by Adami et al., discussed below, show how, contrary to Jerry’s claims, entropy in the genome can decrease through the simple processes of variation and selection.
Although Jerry seems to blame Feynman for his errors, it should be clear, or will soon become clear, that the errors are Jerry’s own.
I encourage readers to pursue the thread I pointed out, in which several people make a significant effort to address the confusions exhibited by Jerry. If anything, it shows how widespread the abuse of mathematics has become.
As I have shown in some detail above, a correct application of entropy is not that complicated.
The following is a more in-depth introduction to the exciting findings about entropy and information/complexity.
Schneider provides us with some interesting data:

[Figure: information/entropy increase and decrease over time in Schneider’s ev simulation]
Note how the information increases from zero to about 4 bits
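As a rough sketch of where the “about 4 bits” comes from: in Schneider’s ev model, locating γ binding sites among G potential positions requires Rfrequency = log2(G/γ) bits. The G and γ values below are my reading of the ev paper’s setup, not part of this post:

```python
from math import log2

G = 256     # potential binding positions in the simulated genome (assumed)
gamma = 16  # number of actual binding sites (assumed)

# Bits needed to single out the binding sites among all positions
R_frequency = log2(G / gamma)
print(f"R_frequency = {R_frequency} bits")  # -> 4.0 bits
```

Under selection, the information measured in the evolving binding sites (R_sequence) climbs from zero toward this value.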
From PNAS we find:
Fig. 3. (A) Total entropy per program as a function of evolutionary time. (B) Fitness of the most abundant genotype as a function of time. Evolutionary transitions are identified with short periods in which the entropy drops sharply, and fitness jumps. Vertical dashed lines indicate the moments at which the genomes in Fig. 1 A and B were dominant.
In “Evolution of biological complexity,” Adami et al. show:
To make a case for or against a trend in the evolution of complexity in biological evolution, complexity needs to be both rigorously defined and measurable. A recent information-theoretic (but intuitively evident) definition identifies genomic complexity with the amount of information a sequence stores about its environment. We investigate the evolution of genomic complexity in populations of digital organisms and monitor in detail the evolutionary transitions that increase complexity. We show that, because natural selection forces genomes to behave as a natural “Maxwell Demon,” within a fixed environment, genomic complexity is forced to increase.
The approach is very simple. First, assume a genome with site $i$ which has probabilities $p_A(i)$, $p_C(i)$, $p_G(i)$, and $p_T(i)$ for the four nucleotides involved. One can show that the entropy for this site can be calculated to be

$$H(i) = -\sum_{x \in \{A,C,G,T\}} p_x(i) \log_2 p_x(i),$$

and the entropy tendency, or information, can be defined as

$$I(i) = H_{\max} - H(i) = 2 - H(i),$$

since $H_{\max} = \log_2 4 = 2$ bits for four equiprobable nucleotides. Now sum over all sites $i$ and you find that the complexity, or information, is given by

$$C = \sum_i I(i) = \sum_i \bigl(2 - H(i)\bigr).$$
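As a sanity check on these formulas, here is a minimal sketch in Python. The per-site probabilities are invented for illustration; none of this code comes from Adami’s paper:

```python
from math import log2

# Hypothetical nucleotide probabilities at three sites of a genome
sites = [
    {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25},  # unconstrained site: H = 2 bits
    {"A": 0.70, "C": 0.10, "G": 0.10, "T": 0.10},  # partially conserved site
    {"A": 1.00, "C": 0.00, "G": 0.00, "T": 0.00},  # fully conserved site: H = 0 bits
]

def entropy(probs):
    """Per-site entropy H(i) = -sum_x p_x log2 p_x, in bits."""
    return -sum(p * log2(p) for p in probs.values() if p > 0)

H_MAX = 2.0  # log2(4): maximum entropy for four equiprobable nucleotides

complexity = 0.0
for i, probs in enumerate(sites):
    H = entropy(probs)
    info = H_MAX - H  # information stored at this site
    complexity += info
    print(f"site {i}: H = {H:.3f} bits, I = {info:.3f} bits")

print(f"complexity C = {complexity:.3f} bits")
```

A fully conserved site contributes the full 2 bits to the complexity, an unconstrained site contributes nothing, which is exactly why selection, by conserving functional sites, pushes $C$ upward.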
Figure 3 above shows how entropy, after an initial increase, decreases at the same time as fitness increases. This information increase/entropy decrease is exactly what happens when selection and variation are combined. Figure 3 shows some beautiful examples of evolutionary transitions.
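To see the mechanism in action, here is a toy simulation of my own (not Schneider’s ev program or Adami’s system; the target sequence and all parameters are invented): under mutation and selection toward a hypothetical target, the summed per-site entropy of the population falls while fitness rises.

```python
import random
from math import log2

random.seed(1)
ALPHABET = "ACGT"
TARGET = "ACGTACGT"   # hypothetical optimal sequence, standing in for the environment
POP, GENS, MU = 200, 60, 0.05

def fitness(seq):
    """Number of positions matching the target."""
    return sum(a == b for a, b in zip(seq, TARGET))

def mutate(seq):
    """Point mutations: each position is redrawn with probability MU."""
    return "".join(random.choice(ALPHABET) if random.random() < MU else c for c in seq)

def total_entropy(pop):
    """Sum over sites of H(i) = -sum_x p_x log2 p_x, estimated from the population."""
    H = 0.0
    for i in range(len(TARGET)):
        for x in ALPHABET:
            p = sum(seq[i] == x for seq in pop) / len(pop)
            if p > 0:
                H -= p * log2(p)
    return H

pop = ["".join(random.choice(ALPHABET) for _ in TARGET) for _ in range(POP)]
for gen in range(GENS + 1):
    if gen % 20 == 0:
        best = max(fitness(s) for s in pop)
        print(f"gen {gen:3d}: entropy = {total_entropy(pop):5.2f} bits, best fitness = {best}")
    # Selection: keep the fitter half; variation: refill with mutated copies.
    pop.sort(key=fitness, reverse=True)
    pop = pop[:POP // 2]
    pop += [mutate(random.choice(pop)) for _ in range(POP - len(pop))]
```

The entropy drop mirrors the sharp drops in Figure 3: selection concentrates the population on fit genotypes, and the genome becomes more informative about its “environment” (here, the target).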
I am not the only one who has reached this obvious conclusion.
Andya Primanda addresses the question “Can mutations increase information content?” raised in Chapter 3 of The Evolution Deceit by Harun Yahya.
Some excellent websites that expand on the materials presented here can be found below:
Adami: Evolutionary Biology and Biocomplexity
and
ev: Evolution of Biological Information
A recent paper which identifies some problems with Schneider’s approach can be found here. Despite the problems, the authors recover most of the same conclusions.
Empirically, it has been observed in several cases that the information content of transcription factor binding site sequences (R_sequence) approximately equals the information content of binding site positions (R_frequency). A general framework for formal models of transcription factors and binding sites is developed to address this issue. Measures for information content in transcription factor binding sites are revisited and theoretic analyses are compared on this basis. These analyses do not lead to consistent results. A comparative review reveals that these inconsistent approaches do not include a transcription factor state space. Therefore, a state space for mathematically representing transcription factors with respect to their binding site recognition properties is introduced into the modelling framework. Analysis of the resulting comprehensive model shows that the structure of genome state space indeed favours equality of R_sequence and R_frequency, but the relation between the two information quantities also depends on the structure of the transcription factor state space. This might lead to significant deviations between R_sequence and R_frequency. However, further investigation and biological arguments show that the effects of the structure of the transcription factor state space on the relation of R_sequence and R_frequency are strongly limited for systems which are autonomous in the sense that all DNA binding proteins operating on the genome are encoded in the genome itself. This provides a theoretical explanation for the empirically observed equality.
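For readers who want to see the two quantities side by side, here is a toy comparison (all sequences and numbers invented; this illustrates the definitions, not the paper’s model). R_sequence is computed from the aligned binding-site sequences, R_frequency from how many of the genome’s positions are sites:

```python
from math import log2

# Hypothetical aligned binding sites: gamma = 4 sites, each 4 bases wide.
sites = ["ACGT", "ACTA", "TGGA", "TGTT"]
G = 64  # hypothetical number of potential binding positions in the genome

# R_sequence: information in the site sequences, sum over columns of (2 - H).
H_total = 0.0
for column in zip(*sites):
    for x in "ACGT":
        p = column.count(x) / len(column)
        if p > 0:
            H_total -= p * log2(p)
R_sequence = 2 * len(sites[0]) - H_total

# R_frequency: information needed to locate the sites among all positions.
R_frequency = log2(G / len(sites))

print(f"R_sequence  = {R_sequence:.2f} bits")
print(f"R_frequency = {R_frequency:.2f} bits")
```

With these made-up sites the two come out equal; the paper quoted above gives the theoretical conditions under which real systems approach this equality.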