Response to "Dissecting Dembski's 'Complex Specified Information'"
Thomas Schneider (Molecular Information Theory Group, Laboratory of Experimental and Computational Biology at the National Institutes of Health in Frederick, MD) has written an article entitled "Dissecting Dembski's 'Complex Specified Information'"[1] claiming that specified complexity can be produced through evolutionary mechanisms. The reproducibility-by-chance figure Schneider achieved by computer evolution in this article was so intriguingly small (5x10^-20) that I couldn't believe it was real. I got Dembski's book, No Free Lunch,[2] looked into Schneider's Ev program[3] on the internet, and found interesting discussions in a couple of newsgroups. The following is my impression of how well Dembski's "specified complexity" holds up.
Dembski must have felt Schneider's Ev program was enough of a challenge that he addressed it in his book (pp. 213-218). Schneider's standard program simulates a genome that evolves 16 regulatory binding sites and a single regulatory gene whose protein binds to those sites. After 704 generations, the winner binds to the sites with 64 bits of specificity, with no mistaken binding to the non-regulatory parts of the genome. The population is held constant during the whole run: each generation, the half of the organisms making the most mistakes in binding to the 16 sites is killed and replaced by copies of the survivors. The unique thing about the program is that the genome, including the 16 sites, the regulatory gene, and the binding weight matrix and threshold, is randomly initialized. This allows Schneider to claim that evolution can create "specified complexity" that isn't smuggled in by a designer.
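To make the setup concrete, here is a minimal Python sketch of an Ev-style run. It is not Schneider's actual code: the genome length, site width, weight encoding, and threshold are my own illustrative assumptions. It does preserve the features described above: a randomly initialized genome encoding both the sites and the weight matrix that recognizes them, one mutation per organism per generation, and replacement of the worst-scoring half of the population each generation.

```python
import random

# Toy Ev-style run: a minimal sketch, not Schneider's actual program.
# Genome length, site width, the weight encoding, and the threshold
# are illustrative assumptions.

BASES = "ACGT"
GENOME_LEN = 256
SITE_WIDTH = 6
N_SITES = 16
POP = 64
THRESHOLD = 4
SITE_POSITIONS = {30 + 14 * i for i in range(N_SITES)}  # target sites
IDX = {"A": 0, "C": 1, "G": 2, "T": 3}

def weight_matrix(genome):
    # Decode a SITE_WIDTH x 4 weight matrix from the genome's first
    # bases, so the recognizer evolves along with the sites it finds.
    w = {"A": -2, "C": -1, "G": 1, "T": 2}
    flat = [w[b] for b in genome[:SITE_WIDTH * 4]]
    return [flat[4 * j:4 * j + 4] for j in range(SITE_WIDTH)]

def mistakes(genome):
    # A mistake is binding a non-site position or missing a real site.
    m = weight_matrix(genome)
    errs = 0
    for pos in range(GENOME_LEN - SITE_WIDTH + 1):
        score = sum(m[j][IDX[genome[pos + j]]] for j in range(SITE_WIDTH))
        if (score >= THRESHOLD) != (pos in SITE_POSITIONS):
            errs += 1
    return errs

random.seed(0)
pop = [[random.choice(BASES) for _ in range(GENOME_LEN)]
       for _ in range(POP)]
for gen in range(1, 1001):
    for g in pop:                      # one mutation per organism
        g[random.randrange(GENOME_LEN)] = random.choice(BASES)
    pop.sort(key=mistakes)             # fewest mistakes first
    pop = pop[:POP // 2] + [g[:] for g in pop[:POP // 2]]  # worst half
    best = mistakes(pop[0])                                # replaced
    if gen % 100 == 0 or best == 0:
        print(f"generation {gen}: best makes {best} mistakes")
    if best == 0:
        break
```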
Dembski points out in his book that Schneider smuggles in "specified complexity" just by having a fitness function, but in my opinion that's not a weighty argument, because you get a fitness function essentially for free with any replicating organism competing for scarce resources. However, Dembski does make a substantial point about the smoothness of the fitness function smuggling in "specified complexity." Nature does not produce reproductive fitness functions that, like Schneider's, are perfectly determinative over all individuals in a single generation even in the absence of stress. Even at the base level, an individual binding site's specificity is smooth in Schneider's program instead of being shape dependent, as it is for proteins. In the Ev program, a change to a single base affects only that base's contribution to the binding score, without considering how the binding protein's amino acids match up with nearby nucleic acid bases and the backbone in the regulatory region. A real regulatory protein is tuned to the shape of the binding site, so changes to one base partially affect the binding of neighboring bases; the bases are not individually tunable. Schneider's weight matrix, which he says simulates the binding of the protein's amino acids and the resulting transcription, does not capture this.

Furthermore, the development of complex structure seems to involve cascades of genes synchronized, through signaling interactions, with other cascades in different parts of the body. Optimizing such a system traverses a phase space that is neither smooth nor monotonic. I think it is reasonable to assume that many gene cascades have a negative selection coefficient until they are in their final form: a sort of irreducible complexity. Take, for instance, the manufacture of a tear duct that runs from the corner of your lower eyelid through the bone into the back of the nasal cavity. This is only an analogy, but if we assume one gene codes for the x position and another for the y and z positions of the source of the tube, then neither gene's regulatory region could be optimized apart from the other's. In addition, both have to be optimized in concert with the destination's x, y, and z position, the tube size, the tube material, and so on. Evolution is impotent at explaining this kind of complexity, and even more so molecular machines.
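The difference between the smooth landscape Ev assumes and the rugged one a cascade would face can be illustrated with a toy comparison (my own construction, not from Schneider or Dembski). The additive fitness function below credits every correct bit independently, while the coupled one pays off only when an entire block is right at once; a single-mutation hill climber solves the first and stalls on the second.

```python
import random

random.seed(0)
TARGET = [1] * 24      # an arbitrary 24-bit "functional" target

def additive(g):
    # Smooth, Ev-like landscape: each correct bit earns credit.
    return sum(a == b for a, b in zip(g, TARGET))

def coupled(g, block=8):
    # Rugged landscape: a block scores only when every bit matches,
    # mimicking a cascade with no benefit until its final form.
    return sum(g[i:i + block] == TARGET[i:i + block]
               for i in range(0, len(g), block))

def hill_climb(fitness, steps=5000):
    g = [random.randint(0, 1) for _ in TARGET]
    for _ in range(steps):
        trial = g[:]
        trial[random.randrange(len(g))] ^= 1   # one mutation
        if fitness(trial) > fitness(g):        # keep improvements only
            g = trial
    return g

print("additive:", additive(hill_climb(additive)), "of", len(TARGET))
print("coupled: ", coupled(hill_climb(coupled)), "of", len(TARGET) // 8)
```

On the additive landscape the climber reliably reaches the full target; on the coupled one it almost always finishes with zero completed blocks, because no single mutation ever improves fitness unless a block happens to start one bit from completion.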
I had something of an epiphany when I recognized how Schneider was able to get such a high level of seeming complexity in a computer simulation that, on its surface, didn't seem to smuggle in much "specified complexity." It had to do with setting the mutation rate to only one mutation per organism per generation. With many mutations per generation, it is hard to target selection at just one binding site. By limiting it to one mutation, and providing an overall smooth fitness function for the organism that is additive over the contributions of the individual sites, it's as though each organism has just one binding site that is optimized by selection. This means that if you doubled the number of binding sites to 32 per organism, you would probably just have to double or quadruple the number of generations to get a specificity of better than 1 part in 10^38. Complexity should not be defined as something achievable linearly in time by an unintelligent computer. Schneider makes it appear that the large exponents in the information content of a complex feature are actually achievable in nature by evolution. However, we know that this level of fine tuning is extremely difficult to achieve for single traits produced by cooperating gene cascades, and is only conceivable when considering the totality of a number of minimally complex, independently optimizable traits in an organism. Dembski needs to refine his definition of complexity further to account for this. Perhaps that would involve quantifying not only the information content but also the complexity of the landscape the information resides on. Mutual dependence in function and timing adds to the prevalence of peaks and valleys.
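For what it's worth, the numbers line up: 64 bits over 16 sites is 4 bits per site, and 2^-64 is about 5x10^-20, the figure quoted at the start of this article; 32 sites at the same per-site specificity would give 2^-128, or roughly 3x10^-39, which is where the 1-part-in-10^38 estimate above comes from. A quick check:

```python
# Back-of-envelope check of the probabilities quoted in the text,
# assuming 4 bits of specificity per site (64 bits / 16 sites).
for sites in (16, 32):
    bits = 4 * sites
    print(f"{sites} sites -> {bits} bits -> P = {2.0 ** -bits:.1e}")
# 16 sites -> 64 bits -> P = 5.4e-20
# 32 sites -> 128 bits -> P = 2.9e-39
```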
The other critique from Schneider and his supporters seems to be to discount the specification part of "specified complexity" as meaningless to science. This has some support from Dembski's own remark in section 2.5.1 that "Detachability is always relativized to a subject or subjects possessing certain background knowledge." Just before it, Dembski makes detachability necessary to his formal way of determining whether something has specification. In other words, you can never prove that something has "specified complexity," because you may not have found the individual with the right knowledge, or that knowledge may not yet exist. You can, however, prove that something doesn't have "specified complexity." Evolutionists have just as significant a problem. They should be able to suggest functional genetic intermediates between an irreducibly complex function and another function from which it supposedly diverged. In addition, these intermediates should be located no farther from each other than random chance could reasonably be expected to jump in the time since divergence. Evolution would have had only a limited number of test cases with which to create complexity, and we should be able to get a rough figure for that number; I would guess it is something like 10 thousand trillion (10^16) for land vertebrates. Duplicated pseudogenes or non-coding regions of the size of a typical gene, with no "stop" codon, remain within a genome only for a limited time before they become too short to be useful for anything. It seems to me that we might research what kind of complexity a genetic algorithm can produce with this limited number of test cases. I bet it's not much.
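As a rough sanity check (my arithmetic, using the guess above): a blind search with about 10^16 trials can specify at most log2(10^16), or about 53 bits, which is already short of the 64-bit feature in Schneider's run.

```python
import math

# Upper bound on the specified information a blind search of N
# trials can plausibly hit: log2(N). The 10^16 figure is this
# article's rough guess for land vertebrates.
trials = 1e16
print(f"at most {math.log2(trials):.0f} bits")   # about 53 bits
```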
Molecular Evolution by Wen-Hsiung Li[4] says, "There is now ample evidence that gene duplication is the most important mechanism for generating new genes and new biochemical processes that have facilitated the evolution of complex organisms from primitive ones." Assuming this is true, and that even Darwinists admit there must be at least something like 512 steps to create a new function, we would expect to find many gene duplicates that are genetic intermediates: close to the size of an average gene (>300 codons), without a "stop" codon in the coding region, and with a properly located and at least minimally effective promoter and TFIIB binding regions. The intermediate likely also needs a sequence-specific region nearby to trigger DNA rearrangement and expose the promoter region for transcription. A secondary mechanism for the source of genetic raw material for the evolution of complexity, namely co-opting one of the two alleles of a polymorphic gene and slowly modifying it, carries a similar expectation: we should find many genes in our genome whose two alleles have different shapes and functions, with both having a positive selection coefficient. These predictions should now be verifiable, since the physical mapping of the human genome is essentially complete. So far, no evolutionist has provided genetic evidence of the plethora of intermediates that would be necessary to affirm their mechanism for producing complexity. If anyone has substantial evidence for the evolution of complexity or wants to engage in a technical discussion of the issue, please write through the email link below.
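A search of the kind this paragraph proposes could start as simply as scanning for long stop-free stretches. The sketch below is my own toy, not an established pipeline; real work would also check promoters, TFIIB sites, and homology to the duplicated parent gene. It flags reading frames of at least 300 codons with no stop codon; on random sequence such stretches are vanishingly rare, which is the decay problem mentioned above.

```python
import random

STOP_CODONS = {"TAA", "TAG", "TGA"}

def long_open_stretches(seq, min_codons=300):
    # Return (start, end) spans of at least min_codons consecutive
    # non-stop codons, checked in all three forward reading frames.
    hits = []
    for frame in range(3):
        start, run = frame, 0
        for i in range(frame, len(seq) - 2, 3):
            if seq[i:i + 3] in STOP_CODONS:
                if run >= min_codons:
                    hits.append((start, i))
                start, run = i + 3, 0
            else:
                run += 1
        if run >= min_codons:
            hits.append((start, len(seq)))
    return hits

# On random sequence a stop codon appears every ~21 codons on
# average (3 of 64 codons are stops), so 300 stop-free codons almost
# never occur by chance; finding many in real genomes would matter.
random.seed(1)
genome = "".join(random.choice("ACGT") for _ in range(1_000_000))
print(long_open_stretches(genome))   # almost certainly []
```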
Related Resources 
Creation As Science by Hugh Ross
Reasons To Believe's third book in a series proposing a testable creation model takes on the origin and design of the universe. Previous books, Origins of Life: Biblical and Evolutionary Models Face Off and Who Was Adam?: A Creation Model Approach to the Origin of Man, examined the origin of life on earth and the origin of mankind, respectively. Creation As Science develops a biblical creation model and compares its predictions with those of a naturalistic model, young earth creationism, and theistic evolution. The model is divided into four main areas: the origin of the universe, the origin of the Solar System, the history of life on earth, and the origin and history of mankind.
The Edge of Evolution: The Search for the Limits of Darwinism by Michael Behe
Darwin's Black Box author Michael Behe takes on the limits of evolution through an examination of specific genetic examples. Behe finds that mutation and natural selection are capable of generating only trivial examples of evolutionary change. Although he concludes that descent with modification has occurred throughout biological history, he argues that the molecular devices found throughout nature cannot be accounted for through natural selection and mutation. Behe's book claims to develop a framework for testing intelligent design by defining the principles by which Darwinian evolution can be distinguished from design.
References 
1. Schneider, T.D. 2002. Dissecting Dembski's "Complex Specified Information".
2. Dembski, W.A. 2001. No Free Lunch. Rowman & Littlefield.
3. Schneider, T.D. 2002. ev: Evolution of Biological Information.
4. Li, W.-H. 1997. Molecular Evolution. Sinauer Associates. p. 269.
"Last Modified March 18, 2004