Papers
Neo-Darwinian history of life
The neo-Darwinian theory of evolution says that all
life on Earth arose from a common ancestor via
random mutations to genes that survived natural selection.
Organisms with beneficial mutations produced more
offspring, while those with deleterious mutations produced
fewer (or no) offspring. Some beneficial mutations provided
new adaptations to new or changed environments, thereby
producing new kinds of organisms.
In order to get extra new information that was needed
to build more complex organisms, existing genes must have
been duplicated and then mutated into something useful.1
This error-ridden process was not very efficient and it left
behind in our genomes lots of ‘genetic fossil junk’:2
• mutations that were neutral and not selected out,
• duplicated genes that didn’t make it,
• functional genes that mutated beyond their
usefulness.
As a result, our ‘[evolutionary] history is woven into
the fabric of modern [life] inscribed in its coded characters.’3
This code, of course, is on the DNA molecule in the form
of a long string of letters consisting of four base molecules,
commonly represented by the initials T, A, G and C. The
code was thought to be written in a straightforward manner,
like the letters and words on this page. Neo-Darwinian
molecular taxonomists routinely use this ‘junk DNA’ as a
‘molecular clock’—a silent record of mutations that have
been undisturbed by natural selection for millions of years
because it does not do anything. They have constructed
elaborate evolutionary histories for all different kinds of
life from it. How wrong this has proved to be!
The ENCODE Project
When the Human Genome Project published its first
draft of the human genome in 2003, they already knew
certain things in advance. These included:
• Coding segments (genes that coded for proteins) were
a minor component of the total amount of DNA in each
cell. It was embarrassing to find that we have only about
as many genes as mice (about 25,000), which constitute
only about 3% of our entire genome. The remaining
97% was of largely unknown function (probably the
‘junk’ referred to above).
• Genes were known to be functional segments of DNA
(exons) interspersed with non-functional segments
(introns) of unknown purpose. When the gene is copied
(transcribed into RNA), the introns are spliced out and the
exons are joined up to produce a functional protein-coding
messenger RNA, which is then translated into protein.
• Copying (transcription) of the gene began at a specially
marked START position, and ended at a special STOP
sign.
• Gene switches (the molecules involved are collectively
called transcription factors) were located on the
chromosome adjacent to the START end of the gene.
• Transcription proceeds one way, from the START end
to the STOP end.
• Genes were scattered throughout the chromosomes,
somewhat like beads on a string, although some areas
were gene-rich and others gene-poor.
• DNA is a double helix molecule, somewhat like a
coiled zipper. Each strand of the DNA zipper is the
complement of the other—as on a clothing zipper,
one side has a lump that fits into a cavity on the other
strand. Only one side is called the ‘sense’ strand, and
the complementary strand is called the ‘anti-sense’
strand. Protein production usually only comes from
copying the sense strand. The anti-sense strand provides
a template for copying the sense strand in a way that
a photographic negative is used to produce a positive
print. Some exceptions to this rule were known (in some
cases, anti-sense strands were used to make protein).
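The splice-and-join step described in the list above can be sketched in code. A minimal toy in Python (chosen purely for illustration; the sequence and exon coordinates are invented and far shorter than any real gene):

```python
# Toy illustration of splicing: the exon segments are kept and joined,
# and the introns (everything between them) are discarded.
# Coordinates are half-open [start, end) and purely hypothetical.

def splice(pre_mrna, exons):
    """Join the exon segments of a pre-mRNA, dropping the introns."""
    return "".join(pre_mrna[start:end] for start, end in exons)

pre_mrna = "AUGGCU" + "GUAAGU" + "CCAUAG"  # exon 1 + intron + exon 2
exons = [(0, 6), (12, 18)]                 # positions of the two exons

mature = splice(pre_mrna, exons)
print(mature)  # AUGGCUCCAUAG
```

In the cell this cutting and joining is performed on the RNA transcript by the spliceosome; the sketch only shows the bookkeeping.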
Astonishing DNA complexity demolishes neo-Darwinism

Alex Williams

The traditional understanding of DNA has recently been transformed beyond recognition. DNA does not, as
we thought, carry a linear, one-dimensional, one-way, sequential code—like the lines of letters and words on
this page. And the 97% in humans that does not carry protein-coding genes is not, as many people thought,
fossilized ‘junk’ left over from our evolutionary ancestors. DNA information is overlapping, multi-layered and
multi-dimensional; it reads both backwards and forwards; and the ‘junk’ is far more functional than the protein
code, so there is no fossilized history of evolution. No human engineer has ever even imagined, let alone
designed, an information storage device anything like it. Moreover, the vast majority of its content is
meta-information—information about how to use information. Meta-information cannot arise by chance because
it only makes sense in the context of the information it relates to. Finally, 95% of its functional information shows
no sign of having been naturally selected; on the contrary, it is rapidly degenerating! That means Darwin was
wrong—natural selection of natural variation does not explain the variety of life on Earth. The best explanation
is what the Bible tells us: we were created—as evidenced by the marvels of DNA—but then we fell and now
endure the curse of ‘bondage to decay’ by mutations.

112 JOURNAL OF CREATION 21(3) 2007

This whole structure of understanding was turned on
its head by a project called ENCODE, which recently
reported an intensive study of the transcripts (copies
of RNA produced from the DNA) of just 1% of the
human genome.4,5 Their findings include the following
inferences:
• About 93% of the genome is transcribed (not 3%,
as expected). Further study with more wide-ranging
methods may raise this figure to 100%. Because much
energy and coordination is required for transcription
this means that probably the whole genome is used by
the cell and there is no such thing as ‘junk DNA’.
• Exons are not gene-specific but are modules that can
be joined to many different RNA transcripts. One exon
(i.e. a protein-making portion of one gene) can be used
in combination with up to 33 different genes located
on as many as 14 different chromosomes. This means
that one exon can specify one part shared in common
by many different proteins.
• There is no ‘beads on a string’ linear arrangement of
genes, but rather an interleaved structure of overlapping
segments, with typically five, seven or more transcripts
coming from just one segment of code.
• Not just one strand, but both strands (sense and
anti-sense) of the DNA are fully transcribed.
• Transcription proceeds not just one way but both
backwards and forwards.
• Transcription factors can be tens or hundreds of
thousands of base-pairs away from the gene that they
control, and even on different chromosomes.
• There is not just one START site, but many, in each
particular gene region.
• There is not just one transcription triggering (switching)
system for each region, but many.
The authors concluded:
‘An interleaved genomic organization poses
important mechanistic challenges for the cell. One
involves the [use of] the same DNA molecules for
multiple functions. The overlap of functionally
important sequence motifs must be resolved in time
and space for this organization to work properly.
Another challenge is the need to compartmentalize
RNA or mask RNAs that could potentially form
long double-stranded regions, to prevent RNA–RNA
interactions that could prompt apoptosis
[programmed cell death].’
The problem of using the same code to produce
many different functional transcripts means that DNA
cannot be endlessly mutable, as neo-Darwinists assume.
Most mutations are deleterious, so mutations in such a
complex structure would quickly destroy many functions
at once.
Their concern for the safety of so many RNA molecules
in such a small space is also well founded. RNA is a long
single-strand molecule not unlike a long piece of sticky
tape—it will stick to any nearby surface, including itself!
Unless properly coordinated, it will quickly tie itself up
into a sticky mess.
These results are so astonishing, so shocking, that it is
going to take an awful lot more work to untangle what is
really going on in cells.
Functional junk?
The ENCODE project did confirm that genes still form
the primary information needed by the cell—the
protein-producing code—even though much greater complexity has
now been uncovered. Genes found in the ENCODE project
differ by only about 2% from the existing catalogue.
The discovery of multiple overlapping
transcripts in every part of the DNA was astonishing in itself,
but the extent of the overlaps is huge compared to the
size of a typical gene. On average, the transcripts are 10 to
50 times the size of a typical gene region, overlapping on
both sides. And as many as 20% of transcripts extend to
more than 100 times the size of a typical gene region. This
would be like photocopying a page in a book and having
to get information from 10, 50 or even 100 other pages in
order to use the information on that one page.
The non-protein-coding regions (previously thought
to be junk) are now called untranslated regions (UTRs)
because while they are transcribed into RNA, they are
not translated into protein. Not only has the ENCODE
project elevated UTRs out of the ‘junk’ category, but it now
appears that they are far more active than the translated
regions (the genes), as measured by the number of DNA
bases appearing in RNA transcripts. Genic regions are
transcribed on average in five different overlapping and
interleaved ways, while UTRs are transcribed on average
in seven different overlapping and interleaved ways. Since
there are about 33 times as many bases in UTRs as in
genic regions, that makes the ‘junk’ about 50 times more
active than the genes.
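The ‘about 50 times’ figure follows from simple arithmetic on the numbers quoted above. A quick check (treating all three quoted figures as rough averages, so the result is approximate):

```python
# Rough activity ratio of 'junk' (UTR) DNA versus genic DNA,
# using the averages quoted in the text.
utr_to_gene_base_ratio = 33    # ~33x as many bases in UTRs as in genes
utr_transcripts_per_base = 7   # average overlapping transcripts in UTRs
gene_transcripts_per_base = 5  # average overlapping transcripts in genes

activity_ratio = (utr_to_gene_base_ratio * utr_transcripts_per_base
                  / gene_transcripts_per_base)
print(round(activity_ratio, 1))  # 46.2, i.e. roughly 50
```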
Transcription activity can best be predicted by just
one factor—the way that the DNA is packaged into
chromosomes. The DNA is coiled around protein globules
called histones, then coiled again into a rope-like structure,
then super-coiled in two stages around scaffold proteins
to produce the thick chromosomes that we see under the
microscope (the resulting DNA/protein complex is called
chromatin). This suggests that DNA information normally
exists in a form similar to a closed book—all the coiling
prevents the coded information from coming into contact
with the translation machinery. When the cell wants some
information it opens a particular page, ‘photocopies’ the
information, then closes the book again. Other recent work6
shows that this is physically accomplished as follows:
• The chromosomes in each cell are stored in the
membrane-bound nucleus. The nuclear membrane
has about 2,000 pores in it, through which molecules
need ‘permission’ to pass in and out. The required
chromosome is brought near to one of these nuclear
pores.
• The section of DNA to be transcribed is placed in front
of the pore.
• The supercoil is unwound to expose the transcription
region.
• The histone coils are twisted to expose
the required copying site.
• The double-helix of the DNA is
unzipped to expose the coded
information.
• The DNA is grasped into a loop by the
enzymes that do the copying, and this
loop is copied onto an RNA transcript.
The transcript is then checked for
accuracy (and is corrected or degraded
and recycled if it is faulty). Accurate
RNA transcripts are then specially
tagged for export through the pore
and are carried to wherever they are
needed in the cell.
• The ‘book’ of DNA information
is then closed by a reversal of the
coiling process and movement of the
chromosome away from the nuclear
pore region.
This astonishing discovery that
the so-called ‘junk’ regions are far more
functionally active than the gene regions
suggests that probably none of the human
genome is inactive junk. Junk is, by
definition, useless (or at least, presently
unused). But UTRs are being actively
used right now. That means they are not fossils of bygone
evolutionary ages—they are being used right now because
they are needed right now! If other animals have similar
DNA sequences, then it means they have needs similar to
ours. This is sound logic based upon observable
biology—as opposed to the fanciful mutational suppositions
of neo-Darwinism.7
The molecular taxonomists, who have been drawing
up evolutionary histories (‘phylogenies’) for nearly every
kind of life, are going to have to undo all their years of
‘junk DNA’-based historical reconstructions and wait for
the full implications to emerge before they try again. One
of the supposedly ‘knock-down’ arguments that humans
have a common ancestor with chimpanzees is shared
‘non-functional’ DNA coding. That argument is now out
the window.
Multiple codes
A major outcome of the studies so far is that there are
multiple information codes operating in living cells. The
protein code is the simplest, and has been studied for half
a century. But a number of other codes are now known, at
least by inference.
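The protein code mentioned above maps DNA triplets (codons) to amino acids. A minimal sketch in Python, using a deliberately tiny subset of the standard codon table (real cells decode all 64 codons; the input sequence is invented for illustration):

```python
# Translate a short open reading frame using a tiny subset of the
# standard genetic code (real cells use all 64 codons).
CODON_TABLE = {
    "ATG": "Met",   # also the start codon
    "GCT": "Ala",
    "TGG": "Trp",
    "TAA": "STOP",  # one of the three stop codons
}

def translate(dna):
    """Read DNA three letters at a time until a stop codon."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        amino_acid = CODON_TABLE[dna[i:i + 3]]
        if amino_acid == "STOP":
            break
        protein.append(amino_acid)
    return "-".join(protein)

print(translate("ATGGCTTGGTAA"))  # Met-Ala-Trp
```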
Cell memory code. DNA is a very long, thin molecule.
If you unwound the DNA from just one human cell it would
be about 2 metres long! To squash this into a tiny cell
nucleus, the DNA is wound up in four separate layers of
chromatin structure (as described earlier). The first level
of this chromatin structure carries a ‘histone code’ that
contains information about the cell’s history (i.e. it is a cell
memory).8,9 The DNA is coiled twice around a group of
8 histone molecules, and a 9th histone pins this structure
into place to form what is called a nucleosome. These
nucleosomes can carry various chemical modifications that
either allow, or prevent, the expression of the DNA wrapped
around them. Every time a cell divides into two new cells,
its DNA double-helix splits into two single strands, which
then each produce a new double-strand. But nucleosomes
are not duplicated like the DNA-strands. Rather, they are
distributed between either one or the other of the two new
DNA double strands, and the empty spaces are filled by new
nucleosomes. Cell division is therefore an opportunity for
changes in the nucleosomal composition of a specific DNA
region. Changes can also happen during the lifetime of a
cell due to chemical reactions allowing inter-conversions
between the different nucleosome types. The memory
effect of these changes can be that a latent capacity that was
dormant comes to life, or, conversely, a previously active
capacity shuts down.
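The ‘2 metres’ figure quoted above can be checked with back-of-envelope arithmetic, assuming about 3 billion base pairs per haploid genome, a diploid cell, and the standard 0.34 nm rise between stacked base pairs in B-form DNA:

```python
# Back-of-envelope length of the DNA in one human cell.
base_pairs_haploid = 3.0e9      # ~3 billion bp per haploid set
rise_per_base_pair_m = 0.34e-9  # 0.34 nm between stacked base pairs
ploidy = 2                      # a diploid cell carries two copies

total_length_m = ploidy * base_pairs_haploid * rise_per_base_pair_m
print(round(total_length_m, 2))  # ~2.04 m
```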
Differentiation code. In humans, there are about 300
different cell types in our bodies that make up the different
tissue types (nerves, blood, muscle, liver, spleen, eyes
etc). All of these cells contain the same DNA, so how
does each cell know how to become a nerve cell rather
than a blood cell? The required information is written in
code down the side of the DNA double-helix in the form
of different molecules attached to the nucleotides that form
the ‘rungs’ in the ‘ladder’ of the helix.10 This code silences
developmental genes in embryonic stem cells, but preserves
their potential to become activated during embryogenesis.
Astonishing complexity of DNA. When the genetic code was first discovered, it was
thought that only protein information was coded in gene regions. Genes make up only
about 3% of the human genome. Francis Crick described the remaining 97% as ‘junk’.
But recent discoveries show that so much information is packed into, on and around
the DNA molecule that it is the most complex and sophisticated information storage
system ever seen by mankind. No one ever imagined such a thing before, and we are
still trying to understand the nature and depth of its information content.
By Alex Williams
The embryo itself is largely defined by its DNA sequence,
but its subsequent development can be altered in response to
lineage-specific transcriptional programs and environmental
cues, and is epigenetically maintained.11
Replication code. The replication code was discovered
by addressing the question of how cells maintain their
normal metabolic activity (which continually uses the
DNA as source information) when it comes time for cell
division. The key problem is that a large proportion of the
whole genome is required for the normal operation of the
cell—probably at least 50% in unspecialized body cells
and up to 70–80% in complex liver and brain cells—and,
of course, the whole genome is required during replication.
This creates a huge logistic problem—how to avoid clashes
between the transcription machinery (which needs to
continually copy information for ongoing use in the cell)
and the replication machinery (which needs to unzip the
whole of the DNA double-helix and replicate a ‘zipped’
copy back onto each of the separated strands).
The cell’s solution to this logistics nightmare is truly
astonishing.12 Replication does not begin at any one point,
but at thousands of different points. But of these thousands
of potential start points, only a subset are used in any one
cell cycle—different subsets are used at different times and
places. A full understanding is yet to emerge because the
system is so complex; however, some progress has been
made:
• The large set of potential replication start sites is not
essential, but optional. In early embryogenesis, for
example, before any transcription begins, the whole
genome replicates numerous times without any
reference to the special set of potential start sites.
• The pattern of replication in the late embryo and adult
is tissue-specific. This suggests that cells in a particular
tissue cooperate by coordinating replication so that
while part of the DNA in one cell is being replicated,
the corresponding part in a neighbouring cell is being
transcribed. Transcripts can thus be shared so that
normal functions can be maintained throughout the
tissue while different parts of the DNA are being
replicated.
• DNA that is transcribed early in the cell division cycle
is also replicated in the early stage (but the transcription
and replication machines are carefully kept apart).
The early transcribed DNA is that which is needed
most often in cell function. The correlation between
transcription and replication in this early phase allows
the cell to minimize the ‘downtime’ in transcription
of the most urgent supplies while replication takes
place.
• There is a ‘pecking order’ of control. Preparation for
replication may take place at thousands of different
locations, but once replication does begin at a particular
site, it suppresses replication at nearby sites so that
only one copy of the DNA is made. If transcription
happens to occur nearby, replication is suppressed until
transcription is completed. This clearly demonstrates
that keeping the cell alive and functioning properly
takes precedence over cell division.
• There is a built-in error correction system called the
‘cell-cycle checkpoints’. If replication proceeds without
any problems, correction is not needed. However, if too
many replication events occur at once, the potential for
conflict between transcription and replication increases,
and/or it may indicate that some replicators have
stalled because of errors. Once the threshold number is
exceeded, the checkpoint system is activated, the whole
process is slowed down, and errors are corrected. If too
much damage occurs, the daughter cells will be mutant,
or the cell’s self-destruct mechanism (the apoptosome)
will be activated to dismantle the cell and recycle its
components.
• An obvious benefit of the replication initiation pattern
never being the same from one cell division to the next
is that it prevents accumulation of any errors that are
not corrected.
The exact location of the replication code is yet to
be pinpointed, but because it involves transcription factors
gaining access to transcription sites, and this is known to
be controlled by chromatin structure, then the code itself is
probably written into the chromatin structure.
Undiscovered codes?
Given that we now have at least four known codes, it
seems reasonable to infer that at least three other major
activities in cells have a yet-undiscovered coded basis:
Regulatory code(s). At least some, and perhaps all, of
the untranslated regions are involved in gene regulation in
one form or another. According to Kirschner and Gerhart’s
facilitated variation theory,13 regulatory information is
organized into modules that they liken to Lego® blocks. That
is, they have strong internal integrity (hard to break), but they
are easily pulled apart (during meiosis) and rearranged into
new combinations (at fertilization). This built-in capacity
for variation, they claim, is essential for life to persist (i.e.
survive stress in any one generation) and evolve (down the
generations). It therefore seems reasonable to expect to find
some code associated with module structure, the rules by
which rearrangements can occur, and the constraints that
must apply in order to maintain normal metabolic functions
in the face of rearranged regulatory circuits.
Transcription code. The most common activity
associated with DNA is transcription. But how does a
transcription signal know which version of a transcript is
required? For any given gene, there are numerous different
transcript-starting sites, a number of different overlap
options, and numerous different signal molecules that can
trigger a transcription request, but only one transcription
machine can operate on a given segment of DNA at any
one time.
Nerve code. Nerve cells carry information internally via
electrical impulses, but then communicate with one another
by converting electrical impulses into neurotransmitter
chemicals that then diffuse across the gap (synapse) between
them. There must be a code involved in the conversion of
information from electrical to chemical form.
Meta-information: an impossible conundrum
for evolution
The astonishing complexity of the dynamic information
storage capacity of the DNA/chromosome system is, in
itself, a marvel of engineering design. Such a magnificent
solution to such a monster logistics problem could surely
only come from a Master Designer. But the nature of the
majority of this information poses an impossible conundrum
for neo-Darwinists.
Proteins are the work-horse molecules of biology. But
protein-coding genes make up only a tiny proportion of
all the information that we have been describing above.
The vast majority of information in the human genome
is not primary code for proteins, but meta-information—
information about information—the instructions that a cell
needs for using the proteins to make, maintain and reproduce
functional human beings.
Neo-Darwinists say that all this information arose
by random mutations, but this is not possible. Random
events are, by definition, independent of one another.
But meta-information is, by definition, totally dependent
upon the information to which it relates. It would be quite
nonsensical to take the cooking instructions for making a
cake and apply them to the assembly of, say, a child’s plastic
toy (if nothing else, the baking stage would reduce the toy
to a mangled mess). Cake-cooking instructions only have
meaning when applied to cake-making ingredients. So
too, the logistics solution to the cell division problem is
only relevant to the problem of cell division. If we applied
the logistics solution to the problem of mate attraction via
pheromones (scent) in moths it would not work. All the vast
amount of meta-information in the human genome only has
meaning when applied to the problem of using the human
genome to make, maintain and reproduce human beings.
Even if we granted that the first biological information
came into existence by a random process in an ‘RNA-world’
scenario, the meta-information needed to use that
information could not possibly come into existence by
the same random (independent) process because
meta-information is inextricably dependent upon the information
that it relates to.
There is thus no possible random (mutation) solution to
this conundrum. Can natural selection save the day? No.
There are at least 100 (and probably many more) bits of
meta-information in the human genome for every one bit of
primary (protein-coding gene) information. An organism
that has to manufacture, maintain and drag around with it
a mountain of useless mutations while waiting for a chance
correlation of relevance to occur so that something useful
can happen, is an organism that natural selection is going
to select against, not favour! Moreover, an organism that
can survive long enough to accumulate a mountain of
useless mutations is an organism that does not need useless
mutations—it must already have all the information it needs
to survive!
What kind of organism already has all the information it
needs to survive? There is only one answer—an organism
that was created in the beginning with all that it needs to
survive.
What kind of information is this?
These results present us with a spectacle never
before encountered in science—an information
structure so complex that it defies description. How
can we possibly understand it?
I believe there are at least two things that
we can reasonably conclude about it at this time.
First, it is most decidedly not the one-dimensional
linear sequence of characters that neo-Darwinists
need to provide the endlessly mutable source of
universal evolution. While it is certainly variable,
its extraordinarily complex structure cannot
possibly be endlessly mutable because a certain
amount of invariance is required to maintain
complex structure, otherwise it quickly degrades
into error catastrophe. The experimentally verified
universally deleterious nature of mutations supports
this conclusion.
The second thing we can conclude is that
according to Kirschner and Gerhart’s theory of
facilitated variation,14 DNA contains regulatory-based
modules which they liken to Lego® blocks.
That is, they have very strong internal coherence
and integrity (difficult to break) but are easily pulled apart
(during meiosis) and reassembled (during fertilization) to
produce a built-in capacity for variation. To understand
it properly, we need to locate the Lego® blocks and the
boundaries between them.

Meta-information. The computer memory chip (A) can hold over 2 billion bits of
information in binary code (B) but it requires a computer (C) with its operating
system and application software (meta-information) to do anything useful with
the information on the chip. Likewise, the protein information coded on the
genes in DNA needs the meta-information in the ‘junk’ regions together with
the machinery of the cell to do anything useful with the protein code.
Image by Alex Williams
Natural selection—irrelevant!
The most surprising result of the ENCODE project,
according to its authors, is that 95% of the functional
transcripts (genic and UTR transcripts with at least one
known function) show no sign of selection pressure (i.e.
they are not noticeably conserved and are mutating at the
average rate). Why were they surprised? Because man
is supposed to have evolved from ape-like ancestors via
mutation and natural selection. But if 95% of the human
functional information shows no sign of natural selection
then it means that natural selection has not been a significant
contributor to our ancestry.15
While this result surprised the neo-Darwinists, it is
perfectly in line with current research in human genetics. In
his ground-breaking exposé of current knowledge, Genetic
Entropy & The Mystery of the Genome,16 geneticist Dr John
Sanford of Cornell University showed that deleterious
mutations are accumulating at an alarming rate in the human
population and that both natural selection and even the worst
possible nightmare scenario of eugenics are powerless to stop
it. This is because the vast majority are deleterious single
nucleotide mutations that have a minuscule effect on fitness,
so natural selection cannot detect them amongst the ‘noise’
and thus is unable to delete them. Also, there is so much
other information contributing to
fitness, and life always has many
different ways of achieving its
goals. Natural selection is also
powerless to delete numerous
mutations simultaneously,
because their effects on fitness
are complex, interactive and
often mutually interfering.
Natural selection only works
in simple cases that are easy
to explain in textbooks. One
organism with one (big) defect
will fail to reproduce. Alternatively,
one big strong male will outcompete
his rivals and fertilize
more females, thus improving the
overall genetic well-being of the
species. These kinds of scenarios
only have a minor impact on the
human population. Most of us are
just mediocre, but most of us do
find partners and procreate.
If the average number of
deleterious mutations per person
per generation is much less than
1, then we have some hope that natural selection might
favour the strong over the weak and keep our species
genetically healthy. But in fact, the average number of
deleterious mutations per person per generation is very
much greater than 1—probably at least 100 per person
per generation, likely about 300 and perhaps much more.
This means that everyone is a mutant many times over and
natural selection is powerless to stop it because deleting
the weakest (whatever that might mean) will do nothing
to prevent everyone else reproducing, resulting in an
inexorable degeneration of the whole human population.
Even the most horrible eugenics program could not stop it.
This problem caused one author to write a paper entitled
‘Why are we not dead 100 times over?’
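The accumulation argument above can be put in numbers. A minimal sketch, assuming the quoted rate of about 100 new deleterious mutations per person per generation and, as argued above, no effective selective removal:

```python
# Average accumulated deleterious mutations per person after g
# generations, if ~100 new ones arise each generation and natural
# selection removes essentially none of them (the scenario argued
# in the text; the rate is the lower of the quoted estimates).
NEW_DELETERIOUS_PER_GENERATION = 100

def accumulated_load(generations):
    """Mean mutational load after the given number of generations."""
    return NEW_DELETERIOUS_PER_GENERATION * generations

for g in (1, 10, 100):
    print(g, accumulated_load(g))  # grows without bound: 100, 1000, 10000
```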
Let’s now put this information together with Haldane’s
dilemma.17 Famous geneticist J.B.S. Haldane calculated
that it would take about 300 generations for a favourable
mutation to become fixed in a population (every member
having a double copy of it). He calculated that in the
approximately 6 million years since our supposed hominid
ancestor split from the chimpanzee line, only about 1000
(<2000 according to ReMine18) such mutations could
become fixed. This is certainly not nearly enough to turn
an ape into a human. But most importantly, we now know
that there are about 125 million single nucleotide differences
between humans and chimps, resulting from about 40
million mutational events. This means that somewhere
between 39,998,000 and 124,998,000 deleterious changes
have occurred since the split with our common ancestor.
That means we have overwhelmingly degenerated since the
supposed split, which makes a mockery of the whole
mutation/selection theory of origin.
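The subtraction behind the figures above can be laid out explicitly, using the numbers quoted in the text: about 40 million mutational events since the supposed split, about 125 million single-nucleotide differences, and at most about 2,000 selectively fixed beneficial mutations (the Haldane/ReMine bound):

```python
# Bounds on deleterious changes since the supposed human/chimp
# split, using the figures quoted in the text.
mutational_events = 40_000_000        # ~40 million mutational events
nucleotide_differences = 125_000_000  # ~125 million single-nucleotide differences
max_fixed_beneficial = 2_000          # upper bound from Haldane/ReMine

min_deleterious = mutational_events - max_fixed_beneficial
max_deleterious = nucleotide_differences - max_fixed_beneficial
print(min_deleterious, max_deleterious)  # 39998000 124998000
```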
Information about information. The most basic meta-information needed to use coded information
is instructions on how to read the code. In a computer, the binary code (Panel A) is divided into
‘bytes’ made up of 8 ‘bits’ (0,1), which are then translated into English via the ASCII code. In
protein-coding genes (Panel B), the nucleotides (T,A,G,C) are read in triplets, and these triplets
are translated into an amino acid sequence via the genetic code.
Image by Alex Williams
Conclusion
The discovery that virtually all our DNA is functional
right now demolishes the neo-Darwinian argument that it
contains mostly junk which constitutes a unique fossilized
history of our genetic evolution. The most logical
explanation of shared DNA between different kinds of
organisms is now shared function, not shared ancestry.
The discovery that DNA contains astonishingly
complex information structures, beyond our capacity to even
imagine, let alone comprehend, means that neo-Darwinism’s
total dependence upon random mutations in linear code as
a source of new information is in error. We would expect
random mutations to extremely complex information
structures to be deleterious, and this is exactly the universal
result of many years of genetic research. Kirschner and
Gerhart’s theory that natural variation is largely the result
of random rearrangements of specially structured gene
modules14 is a far more reasonable explanation.
The discovery that the vast majority of the information
stored in DNA is not primary protein-coding information but
secondary meta-information, demolishes the neo-Darwinian
argument that it arose by some random (independent)
process. Meta-information is inextricably dependent upon
the information it refers to so an independent origin is
impossible.
The discovery that natural selection has been an
insignificant factor in our genetic history demolishes
Darwin&#8217;s theory that all life on Earth arose from a common
ancestor by means of the natural selection of natural
variation. This discovery is also completely consistent with
the genetic evidence that the human genome is degenerating
at an alarming rate.
Neo-Darwinian theory was a mirage built from ideology
upon a few general facts (natural selection, genes and
mutations). But now that we have glimpsed just some of
the vast molecular detail of how life actually works, the
mirage has evaporated.
References
1. This theory makes numerous predictions that are all falsified by molecular biology data. The lack of any alternative source of the required new information thus constitutes a fatal weakness in neo-Darwinian theory. See Bergman, J., Does gene duplication provide the engine for evolution? Journal of Creation 20(1):99–104, 2006; <Creation - Creation Ministries International images/pdfs/tj/j20_1/j20_1_99-104.pdf>.
2. Carroll, S., The Making of the Fittest: DNA and the Ultimate Forensic Evidence of Evolution, Norton, New York, 2006.
3. Dawkins, R., The Ancestor’s Tale: A Pilgrimage to the Dawn of Evolution, Houghton Mifflin, Boston, MA, 2004, p. 20.
4. Birney, E. et al., Identification and analysis of functional elements in 1% of the human genome by the ENCODE pilot project, Nature 447:799–816, 2007.
5. Kapranov, P., Willingham, A.T. and Gingeras, T.R., Genome-wide transcription and the implications for genomic organization, Nature Reviews Genetics 8:413–423, 2007.
6. Akhtar, A. and Gasser, S.M., The nuclear envelope and transcriptional control, Nature Reviews Genetics 8:507–517, 2007.
7. Mutations have been intensively researched in medical laboratories over many years now and almost all of them are deleterious. The only exceptions appear to be those that disable an existing mechanism such that some specialized result is beneficial in some specialized conditions (e.g. sickle-cell anemia (a serious disease) reduces the severity of malaria in malaria-prone regions by killing the malaria parasite in the sickled cells).
8. Latent memory of cells comes to life, 17 May 2007, <www.physorg.com/news98623491.html>.
9. Segal, E., Fondufe-Mittendorf, Y., Chen, L., Thåström, A., Field, Y., Moore, I.K., Wang, J.-P.Z. and Widom, J., A genomic code for nucleosome positioning, Nature 442(7104):772–778, 17 August 2006.
10. Mikkelsen, T.S. et al., Genome-wide maps of chromatin state in pluripotent and lineage-committed cells, Nature 448:553–560, 2007.
11. Bernstein, B.E. et al., A bivalent chromatin structure marks key developmental genes in embryonic stem cells, Cell 125:315–326, 2006.
12. Aladjem, M.I., Replication in context: dynamic regulation of DNA replication patterns in metazoans, Nature Reviews Genetics 8:588–600, 2007.
13. Kirschner, M.W. and Gerhart, J.C., The Plausibility of Life: Resolving Darwin’s Dilemma, Yale University Press, New Haven, CT, 2005.
14. Kirschner and Gerhart, ref. 13. See also Williams, A.R., Facilitated Variation: A new paradigm emerges in biology, Journal of Creation, in press.
15. Natural selection will have played some part in our ancestry because grossly deleterious mutations will have been weeded out, and during times of famine, plague and war, populations are usually depleted of some genetic components more than others (e.g. young soldiers killed in war; the young and the old dying in famine; sickle-cell anemia alleviating the severity of malaria).
16. Sanford, J.C., Genetic Entropy & The Mystery of the Genome, Elim Publishing, New York, 2005.
17. Batten, B., Haldane’s dilemma has not been solved, Journal of Creation 19(1):20–21, 2005; <www.creationontheweb.com/content/view/4473>.
18. ReMine, W., Haldane’s Dilemma, <saintpaulscience.com/Haldane.htm>, 3 August 2007.
Alex Williams received a B.Sc. in botany from the University
of New England, an M.Sc.(Hons.) in radioecology from
Macquarie University, and is an elected member of the
Australian Institute of Biology. He has diplomas in Christian
studies from Tabor College and Bible College of South
Australia (in missiology), and a Licentiate in Theology
(with distinction) from the Australian College of Theology.
During 20 years in environmental research, he became the
Australian representative to the United Nations in his field.
He then spent seven years in mission work and is now an
honorary botanist at the Western Australian Herbarium in
Perth. He has published in many different areas of science
 
Natural selection weeds out the weak. If you and I were out in the wild and a wolf chased us, and you could run faster, I'd be eaten and you'd pass on your beneficial traits, including running fast, while my slow genes became the wolf's dinner.

Your view is based on what the Bible says, not based on science.

That's why you see deep sea fish with eye sockets and eye buds but no functioning eyes: they evolved from fish with working eyes, but in the deep sea they no longer needed them and so evolved their current trait of no working eyes.

I agree with your view on how natural selection works, but how does it work biologically? How does it know we need a heart, lungs, blood, veins, eyes, a brain, and the many other things we need to have a productive life?

What is the chance of a non-intelligent process knowing these things? Are we gonna reason it just happened by chance?

Wait, how do you know the fish that had no eyes did not need them? How do you know it was not just the result of a harmful mutation?

It does not "know"; that's what you don't get. The animals with beneficial mutations survive better; that is the mechanism by which it progresses. In animals with three-chambered hearts, oxygenated and deoxygenated blood mix in the ventricle. That's very inefficient. So some animals received small mutations (variations in the expression of some protein, or whatever) that resulted in a small, partial separation of the ventricle. We see this in reptiles. This makes the circulatory system more efficient because it prevents the mixing of blood. These animals have a competitive advantage. This gradually develops into a fully separated ventricle, which is even more efficient and advantageous.

This is not a knowledgeable progression. It's just natural selection; that's what you don't get. The chance expression of one protein more than another can lead to a competitive advantage within a certain family of animals. Those animals gradually overtake the entire population, and therefore natural selection has produced evolution.

Look at my post before this one.
 
Darwin himself said his theory of evolution was absurd ... that the human eye alone is so complex it could never evolve naturally ... Somehow Darwinists continue to find a way to ignore this fact ... If Darwin were alive today he would consider believers in evolution morons ...

Could we have a source? Because Darwin says otherwise in The Origin of Species.

Eyes are a particularly fascinating point of evolution. They're hardly proof for the existence of an intelligent designer any more than your own left arm is. Now what Darwin did to rebut this argument about eyes is particularly interesting. He surveyed existing species with functional and useful eyes and tried to see if he could string them into a hypothetical sequence that would produce the complex eyes we have today. And he did it.

You can start with simple eyespots that detect light in flatworms. From there, you can see how it would fold in, making a cup that protects this eyespot and better localizes the light source. Limpets have this sort of eye. In the chambered nautilus, you see the cup's opening narrowing, which produces an improved image. In ragworms, the cup is covered by a transparent cover; in abalones, fluid in the eye forms a lens to help focus light. In mammals, nearby muscles help with focus and with moving the lens. And so on and so forth until you reach the current eye we ourselves have.

If that wasn't proof enough, a pair of Swedish scientists named Dan-Eric Nilsson and Susanne Pelger made a mathematical model of the evolution of eyes, similar to the path I just described. Starting with the basic eyespot, they allowed the tissues to deform themselves randomly, limiting each change to about 1% of size or thickness per step. To imitate natural selection, the model accepted only the mutations that benefited the organism, since any that didn't would be selected against.

Well, the model worked, much like the progression that Darwin described. It went through 1,829 tiny adaptive steps. And how long did it take? Nilsson and Pelger calculated it, and it would take, using conservative numbers, less than 400,000 years. That's more than enough time for an eye to evolve in organisms, especially considering the earliest eyes started to arise more than 550 million years ago.
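The accept-only-improvements procedure described above is ordinary hill-climbing. Here is a toy sketch of that procedure only; the two traits and the "acuity" score are invented stand-ins, not the actual optical model or parameters from the Nilsson and Pelger paper:

```python
import random

def acuity(aperture, depth):
    """Toy 'optical quality' score: deeper cups and narrower apertures
    localize light better (a crude stand-in for spatial resolution)."""
    return depth / aperture

def evolve_eye(steps=2000, seed=1):
    rng = random.Random(seed)
    aperture, depth = 1.0, 1.0   # flat eyespot: wide opening, no cup
    best = acuity(aperture, depth)
    accepted = 0
    for _ in range(steps):
        # propose a random change of at most ~1% in each trait
        new_aperture = aperture * (1 + rng.uniform(-0.01, 0.01))
        new_depth = depth * (1 + rng.uniform(-0.01, 0.01))
        score = acuity(new_aperture, new_depth)
        if score > best:  # 'selection': keep only improvements
            aperture, depth, best = new_aperture, new_depth, score
            accepted += 1
    return best, accepted
```

Run long enough, the score only ever increases, which is the sense in which the model "worked" step by tiny step.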


So, how is this proof of an intelligent designer again?

- Darwin -

Source - The Origin of Species-

Quote - “To suppose that the eye with all its inimitable contrivances for adjusting the focus to different distances, for admitting different amounts of light, and for the correction of spherical and chromatic aberration, could have been formed by natural selection, seems, I freely confess, absurd in the highest degree.”

That is a direct quote from the lips of Mr. Darwin himself...

Is that enough to convince you ... or are you only going to admit to the things Darwin said that fit your agenda? I'm guessing the latter.

The rumor is he asked Jesus for forgiveness on his death bed by the way.....So I guess it would be safe to say that he very likely died a Christian...WOW
 
The rumor is he asked Jesus for forgiveness on his death bed by the way.....So I guess it would be safe to say that he very likely died a Christian...WOW

I am not positive he died a Christian, because I read the comments of his daughter, who was supposedly present at his death. I hope you're right.
 

Look at my post before this one.

Do you mean the extremely biased creationist copy pasta you posted?
 
The traditional understanding of DNA has recently been transformed beyond recognition. DNA does not, as
we thought, carry a linear, one-dimensional, one-way, sequential code—like the lines of letters and words on
this page. And the 97% in humans that does not carry protein-coding genes is not, as many people thought,
fossilized ‘junk’ left over from our evolutionary ancestors. DNA information is overlapping, multi-layered and
multi-dimensional; it reads both backwards and forwards; and the ‘junk’ is far more functional than the protein
code, so there is no fossilized history of evolution. No human engineer has ever even imagined, let alone
designed, an information storage device anything like it. Moreover, the vast majority of its content is
meta-information—information about how to use information. Meta-information cannot arise by chance because
it only makes sense in the context of the information it relates to. Finally, 95% of its functional information shows
no sign of having been naturally selected; on the contrary, it is rapidly degenerating! That means Darwin was
wrong—natural selection of natural variation does not explain the variety of life on Earth.


I would like to address that part specifically. For one, we know exactly how DNA stores information: sequences of bases code for an amino acid chain. Your article correctly identifies start and stop codons, congrats. But your article is extremely misleading. It points out facts, and it is correct a lot of the time, but it is misleading you the entire time. You claim that 98% of the human genome is “junk DNA”, and you're wrong. Your main argument rests on scientists referring to this as junk DNA, but it is not junk. The correct term is noncoding DNA, because these sequences aren't junk: they control the expression of genes, include ribosomal RNA, and are sometimes several duplicates of the same gene. Meta-information, you would call it. Your article correctly states that too. The fact that 98% of the DNA doesn't directly code for protein has nothing to do with intelligent design. But the really interesting part is the last part. How exactly is human DNA, or any organism's DNA, “deteriorating”? Furthermore, explain what exactly you mean by deteriorating. Will humankind degenerate into a mutated mess? There is no such thing as DNA “deterioration”. An entire population does not simply genetically deteriorate.
 

Do you mean the extremely biased creationist copy pasta you posted?


For a list of the participants in the ENCODE Project, go here.

Genome.gov | ENCODE Participants and Projects

Here is the debate on the findings of the ENCODE Project. Your guy gives opinions and does not go into detail about what he thinks of the findings.

The Great Dothan Debate

Do Humans Have an Evolutionary Origin?

Published: 6 April 2009 (GMT+10)

On Nov 27, 2007, in front of a packed house, CMI’s Dr. Robert Carter debated Mr. Rick Pierson in Dothan, Alabama, on the subject “Do Humans Have an Evolutionary Origin?”

Each debater gave introductory comments, then a rebuttal of the other’s comments. They then asked three open-ended cross-examination questions each before wrapping up with closing comments.

Carter’s opening statement

Carter opened by frankly stating that evolution is impossible, and that one must believe in the miracle of the big bang and the miracle of spontaneous generation in order to hold to evolution. He made the case that, while evolution depends on life being simple, life is complex at all levels. After pointing out the “chicken and egg” problem of the necessity of the DNA repair and copying mechanism to be coded into the DNA before DNA can exist in the cell, he stated that “complexity” is the Achilles’ heel of evolution. He then brought up the idea of “information” and stated that information is the nail in the coffin of evolutionary theory. He split the definition of evolution, disagreeing that it simply means “change”, for the creationist believes in change and yet does not believe in evolution. In fact, it was a creationist, Edward Blyth, who first came up with the idea of natural selection. (See Don’t fall for the bait and switch.) He concluded with an appeal to the audience to put everything they were about to hear into the context of these introductory comments, for this is a debate about worldview, not scientific data.

Pierson’s opening statement

Pierson started by listing what he considered the eight best evidences for human evolutionary origins, but he mentioned several of these only briefly and did not go into much detail. He focused on three of these eight as his main arguments. His first argument was the existence of “pseudogenes”, their number, and the fact that humans, chimps, and gorillas share several “lesions” that render an otherwise functional gene inoperable. His second argument was that of the reputed fusion of two ancestral chimp chromosomes to produce human chromosome 2. His third argument was for the retention of ancestral embryonic structures. If embryos from diverse organisms all follow the same development pattern, he claimed this would be evidence for common ancestry. He brought up the existence of “pharyngeal slits” (“gill slits”), and the presence of “aortic arches” (supposedly harking back to our fish ancestors) in the human embryo. He mentioned the fossil record of hominids, but ran out of time before saying any more on the subject.

Carter’s rebuttal to Pierson’s opening statement

Carter rebutted each of Pierson’s main arguments. He pointed out that the 99% similarity between human and chimp is an outdated idea (see “What about the similarities between monkey and human DNA?” in Genetics Questions and Answers). He then said that the idea of pseudogenes1 is also outdated. He discussed “Haldane’s Dilemma” (see Haldane’s dilemma has not been solved) and how it led to the idea of “Junk DNA” (see Vestigial Organs Questions and Answers), but that the ENCODE Project (see Astonishing DNA complexity update) demolished the idea of Junk DNA and makes Haldane’s Dilemma much worse. ENCODE showed that 99% of the human genome is functional, contrary to old expectations (based on Junk DNA theory) that 97% was non-functional, with massive amounts of RNA transcription occurring along those supposedly junk sections of DNA. He went on to point out that the pseudogene argument is akin to the old vestigial organ argument (Vestigial Organs Questions and Answers). As an argument from silence (one of the classic logical fallacies), it is not a sound scientific statement.

Carter went on to discuss the fusion hypothesis for the origin of human chromosome 2. There is a diversity of opinion within the creationist community about whether or not it actually happened, yet it proves nothing for the evolutionist. They claim common descent because one of our chromosomes looks like two of the ape chromosomes. But they would also claim common descent if we had the exact same number of chromosomes. Also, if it were true, it was a near-extinction event for humanity.

On the subject of homology, Carter brought up Haeckel’s fraudulent drawings (see Ernst Haeckel: Evangelist for evolution and apostle of deceit) and compared them to those from a recent paper.2,3 He claimed that an embryologist can tell the difference between the various organisms, but a lot depends on the developmental state of the embryos. He asserted that there are no gill slits in the human embryo and that the pharyngeal pouches perform no respiratory function. He showed that the “post-anal tail” is a misnomer and why it only appears to be a tail, as embryologists know (see Embryonic Recapitulation and Similarities Questions and Answers).

Carter broached the subject of hominids, but only managed to state they were fully human (see Anthropology and Apemen Questions and Answers) before time ran out.

Pierson’s rebuttal to Carter’s opening statement

Pierson dismissed Carter’s statement that the evolutionist must believe in miracles, specifically the two Carter brought up. He claimed the big bang was not evolution, and neither was the abiogenic origin of life. He claimed that evolution was simply “biological change” [Carter warned about this deceptive definition in his opening comments].

Regarding the origin of life, Pierson claimed the RNA World hypothesis is a suitable explanation, using RNA’s centrality to life and the experimental formation of ribosome ligases from random chemical reactions to back up his claim. He also correctly outlined the difference between spontaneous generation and abiogenesis [see Origin of Life Questions and Answers].

He went on to discuss the big bang, giving a list of evidences, including the expanding universe, red shifts, experimental verification of the predicted amounts of helium and hydrogen in the universe, and the presence of the cosmic microwave background radiation (CMB). He also introduced the inflationary big bang model, claiming it solved three problems in the older model. [Apparently, he is unaware of the groundbreaking work by two CMI scientists on the subject. See “What are some of the problems with the ‘big bang’ hypothesis?” in Astronomy and Astrophysics Questions and Answers.]

He then attempted to discount Carter’s analogy of computer parts randomly coming together to form a functional computer, claiming that a single replicating molecule can be both hardware and software. [For a detailed rebuttal, see Information Theory Questions and Answers.]

In an attempt to discredit Christianity, he briefly brought up ethics, referring to the Old Testament and mentioning mass murder before running out of time.

Cross-examination questions

1. Carter: “Can you please give the audience a reasonable process whereby random natural events can produce the necessary but non-random information needed for evolution, including the evolution of new genes and new, complex biochemical pathways?”

Pierson’s answer to the origin of information dealt with gene duplication (giving trichromatic vision in old world monkeys as an example) and exon shuffling (e.g., serine proteases in the blood coagulation pathway). He claimed that a complex biological process like the production of the bacterial flagellum can be explained easily since the 24 core proteins are almost all slightly modified duplicates of one another (see Germ’s miniature motor has a clutch).

2. Pierson: He started his question with the claim that Carter had stated in a talk the previous evening that there were no transitional fossils between reptiles and mammals or between fish and amphibians. He referenced a book that mentioned the Tiktaalik fossil and other similar species, then asked why Carter did not mention them during his talk.

Carter replied that he did not say there were “no” transitional fossils [The majority of the audience had not been present at his talk the previous evening, so there was no way to verify either statement. There was also no way to see the context in which this statement was made.]. He used Pakicetus, once thought to be perfectly intermediate between land and aquatic mammals, as an example and showed how it was later determined to have no aquatic features when more of the skeleton was discovered. He showed a slide that illustrated the diversity within the ceratopsid dinosaur group and compared it to the fossil record of the rhinoceros. He showed another slide of an evolutionary tree and claimed it lacked the predicted links, that the ones used today were not used earlier, and that the transitional status of claimed intermediates is always debatable. Darwin wanted innumerable transitional fossils, but we only find a few, and there is no evidence to fill in the largest gaps, which are represented by the fewest number of fossils. This is contrary to Darwin and contrary to evolutionary predictions. He finished by using the various stony coral families as an additional example of species stasis. (See “Are there really missing links?” in Fossils Questions and Answers.) (For specific details about Tiktaalik and other similar species, see Panderichthys a fish with fingers? and the list of related article links at the bottom of that page.)

3. Carter: After referencing the ENCODE Project and saying it kills the idea of junk DNA, he asked, “With only 100,000 generations since the human-chimp split, would you please explain one of the following: a) the fixation of the 5 million indel mutations covering 90 million base pairs that separate our species; b) the fixation of the 35 million point mutations; or c) the fixation of the approximately 1,000 beneficial mutations that might have accumulated in our two separate lineages since we diverged?” He stated these were intractable problems and that there has not been enough time in evolutionary history to account for all these changes.
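The scale claimed in this question reduces to simple arithmetic; the numbers below are the ones quoted in the question itself, not independently verified figures:

```python
# Figures as quoted in Carter's question (not independently verified)
generations = 100_000          # claimed generations since the human-chimp split
point_mutations = 35_000_000   # point mutations said to require fixation
indel_bases = 90_000_000       # base pairs covered by the 5 million indels

fixed_per_generation = point_mutations / generations
# i.e. 350 point mutations would have to reach fixation every generation,
# on average, for the quoted numbers to hold
```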

Pierson struggled to give his answer. He started by claiming only 5–8% of the genome is under functional evolutionary constraint [This comes from a separate paper4 that compared the human, mouse, and rat genomes and is not part of the ENCODE results. It is also based on evolutionary assumptions of common descent and the idea that long stretches of DNA shared by diverse organisms must be functionally constrained.], and that the rest can accumulate mutations at will. He brought up the results of another study in which megabases were removed from the mouse genome with no ill effects to the mice [This was a tremendous surprise to evolutionists,5,6 who expected strongly conserved sequences to be highly functional.]. He stated only 1.5–2% of the human genome is protein coding, and that most of the genome is composed of repetitive sequences [this avoided the question that had been asked]. In saying this he showed that he was not up to date with the science (e.g., the ENCODE Project outlined earlier by Carter).

In an attempt to rebut the ‘fixation of mutations’ question, he mentioned Haldane’s Dilemma and claimed it was flawed. He also mentioned James [sic, Walter] ReMine’s book, The Biotic Message, and claimed the author made invalid assumptions [see Carter’s rebuttal below]. Thus, he claimed, Haldane’s Dilemma is no longer a problem.

As for the 35 million point mutations that have been fixed in the few generations since humans and chimps split, he said, “You can’t answer everything,” then noted that his lack of an answer looked bad. [Note: this is the crux of Haldane’s Dilemma and he did not have the ability to address the details.]

To address the assumed fixation of 1,000 beneficial mutations, he said something about different selective pressures in different environments, then split the 1,000 figure to include only 500 per lineage. [The careful observer would have noted he was only asked to answer one of the three possibilities and that he did not answer any!]

4. Pierson: This next question rambled a little before he got to the point. He started off with the claim that CMI takes Genesis as a science book instead of allegory [this is not an accurate portrayal of our position; see What we believe and The Bible and hermeneutics]. He then said that Genesis makes only a few predictions, including that man is made from the dust of the earth, which would mean man is made of silicon dioxide, and that men should have one less rib than women, which he (correctly) claimed is also false, and that if we are created in God’s image, and if you take this literally, then God looks like us, “with two eyes, and all this kind of stuff.” For his question, he asked why the two predictions the Bible makes about humans are not correct, and, if Carter didn’t think God looks like us, “anthropomorphic, with eyes, legs, and that kind of stuff,” then isn’t he “interpreting that part where it says we are created in God’s image, instead of taking it literally?”

Carter dismissed part of this as not being germane to the debate. Whether or not man is made in the image of God is philosophy and does not pertain to the debate at hand. Also, the definition of the word "image" is a religious question (see Made in the image of God).

He disagreed that the Bible taught men should have one less rib than women. He said even primitive man knew that if you lost your finger, your future children would still be born with 10 fingers [thus, Pierson's assertion is actually a straw man argument, one of the classic logical fallacies]. He also pointed out that the rib is the only bone that will re-grow when removed (see Regenerating ribs).

He went on to give additional examples of predictions that can be derived from Genesis that Pierson failed to mention. He discussed the evolutionary tale of human migration out of Africa, as tracked by mitochondrial DNA, and how this depends on the assumptions underlying the neutral model of evolution. He then claimed the data better support the Tower of Babel story, with a single migration of humanity, out of the Middle East, traveling in small people groups, from a small original population, into uninhabited territory, in the recent past. He said the three main mitochondrial lineages found throughout the world might very well correlate to the three daughters-in-law of Noah, and that the geographic localization of Y chromosome haplotypes does not make much sense evolutionarily, but easily reflects the fact that there was only one Y chromosome on the Ark.

5. Carter: This question dealt with a recent paper that claimed the earliest fossil apes walked upright and that modern apes lost the ability. The claim that apes walked upright 21 million years ago disagrees with 150 years of evolutionary storytelling and "throws it in the trash." His question: "Please give the audience, based on this new information, a plausible story on the origin of humans."

Pierson did not see why this was a problem, for it did not counter the pseudogene argument, nor did it deal with embryonic similarities. He dismissed the question by stating that walking is a behavior and that the question does not deal with DNA similarities. He said we already knew that Australopithecus afarensis walked upright 3–4 million years ago and that science progresses, sometimes making mistakes, and that just because something is published does not make it a fact. He saw no reason why pushing back walking a few million years was a problem and then noted that Carter does not believe in the dates to which he was referring anyway. [We suppose he failed to realize that the reputed date of upright walking is before the human-chimp divergence and, if it turns out to be true, the whole evolutionary story of the origin of man is up in the air.]

6. Pierson: The final question was another rambling statement and returned to the pseudogene argument. Pierson claimed that pseudogenes can no longer perform their original function [note that this is an assumption], due to frame shift mutations, premature stop codons, and initiator sequence mutations. He claimed Carter believed all pseudogenes to be functional [he doesn't], and that, if so, we have 20,000 pseudogenes in our genome that are functional and arose by mutation. He claimed this destroys all arguments based on information and that it is easy to add information to a genome by just throwing mutations at it. He also claimed this destroys all arguments about specified complexity, all arguments about information, because information is "trivially easy", and also the "isolated great gap between functional DNA sequences".

To address the pseudogene idea, Carter stated the old adage that "absence of evidence is not evidence of absence" and that just because we do not know the function does not mean they do not have a function. He said pseudogenes possibly have many functions, including the ability to translate (sic, "be transcribed into") RNA that can then bind to the gene in the target area and repress it. He gave an example of a supposed pseudogene that humans, chimps and gorillas share. He then mentioned a common rule of thumb in biology that "form follows function". If pseudogenes have a form, he argued, and this form is consistent among lineages that supposedly diverged 10 million years ago, the sequence in question certainly has a function (see Potentially decisive evidence against pseudogene shared mistakes).

Regarding the results of the ENCODE project that Pierson attempted to dismiss, Carter said the purpose of ENCODE was to test how much of the genome was functional and that they found massive amounts of transcription all over the genome. To make matters even worse, they found many places with overlapping RNA codes that do completely different things. The conclusion was that the bulk of what was considered to be junk DNA is now known not just to be functional, but polyfunctional. He stated that evolution and natural selection cannot handle polyfunctionality and that the ENCODE Project answers the question of pseudogenes.

Pierson's closing statement

Pierson returned to his main argument that pseudogenes prove the evolutionary origin of man. He said that transcription does not equate to biological function and that you can't say something has a function until you find it [This violates the general rule of biology that form follows function, as mentioned earlier]. He said the ENCODE Project found that a majority of DNA is transcribed, not that it has a function [We feel that this begs the question], and that only 5–8% of the genome is under functional constraint. He then backed up Carter's claim that the pseudogene shared between human, chimp, and gorilla discussed earlier may be an example of functional constraint, but said it is only one example and the rest of the genome is free to accumulate mutations since it is mostly repetitious. He claimed this refutes the specified complexity argument and the information argument, because these repetitious sequences have no meaning. He restated his belief that it is easy to add information to a genome.

He went on to discuss the efficiency of information storage in the DNA molecule and that human technology can store data more compactly and that computers can copy information much faster than the cell. Even though DNA is amazing, he claimed, "to put it up on a pedestal above what humans can do is not accurate." [Note: nobody was arguing this point, but humans have achieved this information storage technology using intelligence; evolutionists claim that the incredible information storage system of DNA came about without any intelligence.]

Returning once again to the pseudogene argument, he claimed shared mutations are a basic prediction of evolutionary theory [true], and that the opposing view that CMI presents is independent origins [not true, he is assuming no design and no function for pseudogenes]. He then went into a probability argument before bringing up a few additional examples, including an apparent shared mutation that prevents humans and great apes from synthesizing vitamin C (see Why the shared mutations in the Hominidae exon X GULO pseudogene are not evidence for common descent).

He took exception to Carter's claim that the 96–99% identity value between chimps and humans is invalid and cited several measures CMI has reported on their website [but much of what he said was the point CMI was trying to make in the article] (see Human/chimp DNA similarity, >98% Chimp/human DNA similarity? Not any more, and Decoding the dogma of DNA similarity).

He wrapped up his closing statement by attempting to restate several things Carter said in the debate and in the prior evening's talk, asking if evolution is racist and whether or not you are immoral if you believe it [see comments by Carter below]. He ended by saying that evolution simply describes the world; it says what is, not what should be.

Carter's closing statement

Carter began by tying up some loose ends. He returned to the ENCODE argument, saying that the fact that so much RNA transcription occurs is an indication of function, for evolution would predict some degree of efficiency and it would make much more sense if the useless transcription eventually got turned off. He agreed that the genome has much repetitive material, but that does not mean it is useless info. In fact, he said, repeats are probably structural.

He agreed that the human and chimp genomes are similar, but that there are many portions we cannot align and there are many genes we do not share. These genes cannot be explained by evolution because there is not enough time to evolve them. Getting back to duplication as an engine for evolution, he said duplicated genes should tend to be weeded out over time. They should be destroyed if they have no function. He also brought up the existence of evolutionary modeling programs (see From ape to man via genetic meltdown: a theory in crisis) and concluded that our genomes are doomed to extinction due to the accumulation of deleterious mutations that natural selection cannot weed out.

To address Pierson's claim that Carter said evolution is immoral, he countered by saying that there is no judge between right and wrong in evolutionary theory and so the proper term is amoral. He raised the example of Genghis Khan, an evolutionary success story by all measures since he is the ancestor of 1 out of every 200 people alive today,7 and pointed out that there is nothing to say that raping and pillaging is wrong, for there is no higher power in evolutionary theory to whom one can appeal (see Morality and Ethics Questions and Answers).

He claimed Haldane's Dilemma was not the consequence of errors in Haldane's work, which was actually used to generate the idea of junk DNA. He also said he has had personal conversations with Walter ReMine, who explained why what his detractors have said about his work is wrong (see The Biotic Message: Evolution versus Message Theory).

He noted that Pierson did not bring up the creationist position often [and it seems he misstated it when he did] and how he (Carter) tried to stay on the evolutionists' turf to make a point. He pointed out how Pierson tried to stay away from origins and how Darwinian evolution fails mathematically, then discussed the fact that evolution is really a smokescreen for the worldview battle that is raging behind it. In fact, Darwin and his friends rejoiced that they were attacking Christianity. He then gave his personal testimony about how creationist analysis helped him keep his faith while in his undergraduate training.

Briefly, he addressed the big bang, claiming inflation theory is a magic wand and that it defies all known laws of physics. He wrapped this up with this quote: "If you have to resort to unknown mechanisms to explain the most important part of your theory, one wonders how solid your theory is in the first place."

Regarding human evolution [the topic of the debate], he stated his belief that we came from recent origins, and gave a list of evidences from genetics.

In his final few minutes, Carter encouraged the audience to dig deeper into the subject and asked them not to turn away from God if they heard him calling.

Related articles
Splicing and dicing the human genome: Scientists begin to unravel the splicing code
CMI scientific blunder?
Related resources


The Great Dothan Creation/Evolution Debate DVD


The proponents of evolutionary theory are no longer able to dismiss creationists with a wave of their academic hand. There is a growing list of scientists who believe in Genesis creation that refutes the often repeated and unsubstantiated claim that 'virtually every scientist in the world believes the theory to be true' and that a person who does not believe in evolution cannot be a 'legitimate scientist'. This public debate, between skeptic and vocal public anticreationist Mr Rick Pierson and Dr Robert Carter, a scientist with Creation Ministries International in Atlanta, Georgia, took place in front of a packed gallery in Dothan, Alabama. (Junior High–Adult) 103 min.


References
1. Type "pseudogenes" into the search engine window on this site (creation.com).
2. Pennisi, E., 1997. Haeckel's embryos: fraud rediscovered. Science 277(5331):1435.
3. Richardson, M.K., et al., 1997. There is no highly conserved embryonic stage in the vertebrates: implications for current theories of evolution and development. Anat. Embryol. 196:91–106.
4. Cooper, G.M., 2004. Characterization of evolutionary rates and constraints in three mammalian genomes. Genome Res. 14:539–548.
5. Ahituv, N., et al., 2007. Deletion of ultraconserved elements yields viable mice. PLoS Biology 5(9):1906–1911.
6. Gross, L., 2007. Are "ultraconserved" genetic elements really indispensable? PLoS Biology 5(9):1839.
7. Zerjal, T., et al., 2003. The genetic legacy of the Mongols. Am. J. Hum. Genet. 72:717–721. See also "Genghis Khan a Prolific Lover, DNA Data Implies".


The Great Dothan Debate
 
The traditional understanding of DNA has recently been transformed beyond recognition. DNA does not, as we thought, carry a linear, one-dimensional, one-way, sequential code—like the lines of letters and words on this page. And the 97% in humans that does not carry protein-coding genes is not, as many people thought, fossilized 'junk' left over from our evolutionary ancestors. DNA information is overlapping, multi-layered and multi-dimensional; it reads both backwards and forwards; and the 'junk' is far more functional than the protein code, so there is no fossilized history of evolution. No human engineer has ever even imagined, let alone designed, an information storage device anything like it. Moreover, the vast majority of its content is meta-information—information about how to use information. Meta-information cannot arise by chance because it only makes sense in context of the information it relates to. Finally, 95% of its functional information shows no sign of having been naturally selected; on the contrary, it is rapidly degenerating! That means Darwin was wrong—natural selection of natural variation does not explain the variety of life on Earth.


I would like to address that part specifically. For one, we know exactly how DNA stores information. Sequences of bases code for an amino acid chain. Your article correctly identifies start and stop codons, congrats. Your article is extremely misleading. It points out facts, and it's correct a lot of the time, but it's misleading you the entire time. You claim that 98% of the human genome is "junk DNA", and you're wrong. The main argument is that scientists refer to this as junk DNA, but it's not. The correct term is noncoding DNA, because these sequences aren't junk. They control expression of genes, include ribosomal RNA, and sometimes are several duplicates of the same gene. Meta-information, you would call it. Your article correctly states that too. The fact that 98% of DNA doesn't directly code for protein has nothing to do with intelligent design. But the really interesting part is the last part. How exactly is human DNA, or any organism's DNA, "deteriorating"? Furthermore, explain what exactly you mean by deteriorating. Will humankind degenerate into a mutated mess? There is no such thing as DNA "deterioration". An entire population does not simply genetically deteriorate.

The mutation rate we see at this time matches the Bible's claim that man has been on the earth between 6,000 and 10,000 years.

If mutations continued at the current rate for each generation that has been observed, it would have taken man 6 billion years to diverge. Evolutionists can't seem to pin down when it happened, but they say they can trace us back to a certain period through our DNA. Would they not be able to pinpoint when we diverged from ape-like creatures? They say we diverged somewhere between 5 and 20 million years ago. It's not possible, as has been shown.

If we mutated at the rate evolutionists need to show man diverged from our nearest ancestor 5 to 20 million years ago, we would go extinct in just 100 thousand years, by the effects mutations have on any organism. That is just too many mutations for any organism to survive.

It was a stretch of the imagination, and even more of a stretch when we do the math. The more we learn about the information contained in our chromosomes, the harder it is for neo-Darwinism to stand up to scrutiny.
 
There is no such thing as DNA "deterioration". An entire population does not simply genetically deteriorate.

If a gene pool gets to small it most certainly can happen. I don't think you realize how damaging mutations can be.
 
Darwin himself said his theory of evolution was absurd ... that the human eye alone is so complex it could never evolve naturally ... Somehow Darwinists continue to find a way to ignore this fact ... If Darwin were alive today he would consider believers in evolution morons ...

Could we have a source? Because Darwin says otherwise in The Origin of Species.

Eyes are a particularly fascinating point of evolution. They're hardly proof for the existence of an intelligent designer any more than your own left arm is. Now what Darwin did to rebut this argument about eyes is particularly interesting. He surveyed existing species with functional and useful eyes and tried to see if he could string them into a hypothetical sequence that would produce the complex eyes we have today. And he did it.

You can start with simple eyespots that detect light in flatworms. From there, you can see how it would fold in, making a cup that protects this eyespot, and better localize the light source. Limpets have these sort of eyes. In chambered nautilus, you see the cup's opening narrowing which produces an improved image. In ragworms, the cup is covered by a transparent cover, in abalones, fluid in the eye forms a lens to help focus light. In mammals, nearby muscles help with focus and with moving the lens. And so on and so forth until you reach the current eye we ourselves have.

If that wasn't proof, there's a pair of Swedish scientists by the names of Dan-Eric Nilsson and Susanne Pelger who made a mathematical model concerning the evolution of eyes, similar to the path I just described. Starting with the basic eyespot, they allow the tissues to deform themselves randomly, limiting the change to about 1% of size or thickness at every step. To imitate natural selection, the model only accepted the mutations that would benefit the organism, as any that didn't would allow that organism to die off.

Well, the model worked, much like the progression that Darwin described. It went through 1,829 tiny adaptive steps. And how long did it take? Nilsson and Pelger calculated it, and it would take, using conservative numbers, less than 400,000 years. That's more than enough time for an eye to evolve in organisms, especially considering the earliest eyes started to arise more than 550 million years ago.
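For readers curious what such a model looks like, here is a minimal hill-climbing sketch in Python. It is emphatically not Nilsson and Pelger's simulation (their study tracked the optical performance of a deforming tissue sheet and converted accepted steps into generations); this toy only illustrates the loop the paragraph describes: propose a random change of at most about 1%, keep it only if it improves performance. The function name, the single `quality` variable, and the step count are all invented for illustration.

```python
import random

def toy_eye_walk(steps=5000, seed=1):
    """Toy accept-only-improvements loop, loosely in the spirit of the
    Nilsson & Pelger model described above (NOT their actual simulation).
    'quality' stands in for optical performance of the tissue."""
    rng = random.Random(seed)
    quality = 1.0
    accepted = 0
    for _ in range(steps):
        # random variation: change the trait by up to 1% either way
        proposal = quality * (1 + rng.uniform(-0.01, 0.01))
        # "selection": keep the change only if it improves performance
        if proposal > quality:
            quality = proposal
            accepted += 1
    return accepted, quality

print(toy_eye_walk())
```

Roughly half the proposals are improvements and get kept, so the trait climbs steadily; the point of the real model was to count such accepted steps (1,829 of them) and translate them into a number of generations.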


So, how is this proof of an intelligent designer again?

- Darwin -

Source - The Origin of Species-

Quote - "To suppose that the eye with all its inimitable contrivances for adjusting the focus to different distances, for admitting different amounts of light, and for the correction of spherical and chromatic aberration, could have been formed by natural selection, seems, I freely confess, absurd in the highest degree."

That is a direct quote from the lips of Mr. Darwin himself...

Is that enough to convince you .. or are you only going to admit to the things Darwin said that fit your agenda .. I'm guessing the latter.

The rumor is he asked Jesus for forgiveness on his death bed by the way.....So I guess it would be safe to say that he very likely died a Christian...WOW

LOL. Okay sure. Now stop lying and be honest and post the rest of what he said in that passage. No? Okay, I will.

To suppose that the eye with all its inimitable contrivances for adjusting the focus to different distances, for admitting different amounts of light, and for the correction of spherical and chromatic aberration, could have been formed by natural selection, seems, I freely confess, absurd in the highest degree.

Yet reason tells me, that if numerous gradations from a perfect and complex eye to one very imperfect and simple, each grade being useful to its possessor, can be shown to exist; if further, the eye does vary ever so slightly, and the variations be inherited, which is certainly the case; and if any variation or modification in the organ be ever useful to an animal under changing conditions of life, then the difficulty of believing that a perfect and complex eye could be formed by natural selection, though insuperable by our imagination, can hardly be considered real.

Is that enough to convince you? Or are you going to quote mine and take quotes out of context to make Darwin look like the dumbfuck creationist he certainly wasn't.
 
Last edited:
Look at my post before this one.

Do you mean the extremely biased creationist copy pasta you posted?


For a list of the participants in the ENCODE Project, go here.

Genome.gov | ENCODE Participants and Projects

Here is the debate on the findings of the ENCODE Project. Your guy gives opinions and does not go into detail about what he thinks of the findings.

The Great Dothan Debate

Do Humans Have an Evolutionary Origin?

Published: 6 April 2009(GMT+10)

On Nov 27, 2007, in front of a packed house, CMI's Dr. Robert Carter debated Mr. Rick Pierson in Dothan, Alabama, on the subject "Do Humans Have an Evolutionary Origin?"

Each debater gave introductory comments, then a rebuttal of the other's comments. They then asked three open-ended cross-examination questions each before wrapping up with closing comments.

Carter's opening statement

Carter opened by frankly stating that evolution is impossible, and that one must believe in the miracle of the big bang and the miracle of spontaneous generation in order to hold to evolution. He made the case that, while evolution depends on life being simple, life is complex at all levels. After pointing out the "chicken and egg" problem of the necessity of the DNA repair and copying mechanism to be coded into the DNA before DNA can exist in the cell, he stated that "complexity" is the Achilles' heel of evolution. He then brought up the idea of "information" and stated that information is the nail in the coffin of evolutionary theory. He split the definition of evolution, disagreeing that it simply means "change", for the creationist believes in change and yet does not believe in evolution. In fact, it was a creationist, Edward Blyth, who first came up with the idea of natural selection. (See Don't fall for the bait and switch.) He concluded with an appeal to the audience to put everything they were about to hear into the context of these introductory comments, for this is a debate about worldview, not scientific data.

Pierson's opening statement

Pierson started by listing what he considered the eight best evidences for human evolutionary origins, but he mentioned several of these only briefly and did not go into much detail. He focused on three of these eight as his main arguments. His first argument was the existence of "pseudogenes", their number, and the fact that humans, chimps, and gorillas share several "lesions" that render an otherwise functional gene inoperable. His second argument was that of the reputed fusion of two ancestral chimp chromosomes to produce human chromosome 2. His third argument was for the retention of ancestral embryonic structures. If embryos from diverse organisms all follow the same development pattern, he claimed this would be evidence for common ancestry. He brought up the existence of "pharyngeal slits" ("gill slits"), and the presence of "aortic arches" (supposedly harking back to our fish ancestors) in the human embryo. He mentioned the fossil record of hominids, but ran out of time before saying any more on the subject.

Carter's rebuttal to Pierson's opening statement

Carter rebutted each of Pierson's main arguments. He pointed out that the 99% similarity between human and chimp is an outdated idea (see "What about the similarities between monkey and human DNA?" in Genetics Questions and Answers). He then said that the idea of pseudogenes1 is also outdated. He discussed "Haldane's Dilemma" (see Haldane's dilemma has not been solved) and how it led to the idea of "junk DNA" (see Vestigial Organs Questions and Answers), but that the ENCODE Project (see Astonishing DNA complexity update) demolished the idea of junk DNA and makes Haldane's Dilemma much worse. ENCODE showed that 99% of the human genome is functional, contrary to old expectations (based on junk DNA theory) that 97% was non-functional, with massive amounts of RNA transcription occurring along those supposedly junk sections of DNA. He went on to point out that the pseudogene argument is akin to the old vestigial organ argument (Vestigial Organs Questions and Answers). As an argument from silence (one of the classic logical fallacies), it is not a sound scientific statement.

Carter went on to discuss the fusion hypothesis for the origin of human chromosome 2. There is a diversity of opinion within the creationist community about whether or not it actually happened, yet it proves nothing for the evolutionist. They claim common descent because one of our chromosomes looks like two of the ape chromosomes. But they would also claim common descent if we had the exact same number of chromosomes. Also, if it were true, it was a near-extinction event for humanity.

On the subject of homology, Carter brought up Haeckel's fraudulent drawings (see Ernst Haeckel: Evangelist for evolution and apostle of deceit) and compared them to those from a recent paper.2,3 He claimed that an embryologist can tell the difference between the various organisms, but a lot depends on the developmental state of the embryos. He asserted that there are no gill slits in the human embryo and that the pharyngeal pouches perform no respiratory function. He showed that the "post-anal tail" is a misnomer and why it only appears to be a tail, as embryologists know (see Embryonic Recapitulation and Similarities Questions and Answers).

Carter broached the subject of hominids, but only managed to state they were fully human (see Anthropology and Apemen Questions and Answers) before time ran out.

Pierson's rebuttal to Carter's opening statement

Pierson dismissed Carter's statement that the evolutionist must believe in miracles, specifically the two Carter brought up. He claimed the big bang was not evolution, neither was the abiogenic origin of life. He claimed that evolution was simply "biological change" [Carter warned about this deceptive definition in his opening comments].

Regarding the origin of life, Pierson claimed the RNA World hypothesis is a suitable explanation, using RNA's centrality to life and the experimental formation of ribozyme ligases from random chemical reactions to back up his claim. He also correctly outlined the difference between spontaneous generation and abiogenesis [see Origin of Life Questions and Answers].

He went on to discuss the big bang, giving a list of evidences, including the expanding universe, red shifts, experimental verification of the predicted amounts of helium and hydrogen in the universe, and the presence of the cosmic microwave background radiation (CMB). He also introduced the inflationary big bang model, claiming it solved three problems in the older model. [Apparently, he is unaware of the groundbreaking work by two CMI scientists on the subject. See "What are some of the problems with the 'big bang' hypothesis?" in Astronomy and Astrophysics Questions and Answers.]

He then attempted to discount Carter's analogy of computer parts randomly coming together to form a functional computer, claiming that a single replicating molecule can be both hardware and software. [For a detailed rebuttal, see Information Theory Questions and Answers.]

In an attempt to discredit Christianity, he briefly brought up ethics, referring to the Old Testament and mentioning mass murder before running out of time.

Cross-examination questions

1. Carter: "Can you please give the audience a reasonable process whereby random natural events can produce the necessary but non-random information needed for evolution, including the evolution of new genes and new, complex biochemical pathways?"

Pierson's answer to the origin of information dealt with gene duplication (giving trichromatic vision in old world monkeys as an example), and exon shuffling (e.g., serine proteases in the blood coagulation pathway). He claimed that a complex biological process like the production of the bacterial flagellum can be explained easily since the 24 core proteins are almost all slightly modified duplicates of one another (see Germ's miniature motor has a clutch).

2. Pierson: He started his question with the claim that Carter had stated in a talk the previous evening that there were no transitional fossils between reptiles and mammals or between fish and amphibians. He referenced a book that mentioned the Tiktaalik fossil and other similar species, then asked why Carter did not mention them during his talk.

Carter replied that he did not say there were "no" transitional fossils [The majority of the audience had not been present at his talk the previous evening, so there was no way to verify either statement. There was also no way to see the context in which this statement was made.]. He used Pakicetus, once thought to be perfectly intermediate between land and aquatic mammals, as an example and showed how it was determined later to have no aquatic features when more of the skeleton was discovered. He showed a slide that illustrated the diversity within the ceratopsid dinosaur group and compared it to the fossil record of the rhinoceros. He showed another slide of an evolutionary tree and claimed it lacked the predicted links, that the ones used today were not used earlier, and that the transitional status of claimed intermediates is always debatable. Darwin wanted innumerable transitional fossils, but we only find a few, and there is no evidence to fill in the largest gaps, which are represented by the fewest number of fossils. This is contrary to Darwin and contrary to evolutionary predictions. He finished by using the various families of stony corals as an additional example of species stasis. (See "Are there really missing links?" in Fossils Questions and Answers. For specific details about Tiktaalik and other similar species, see Panderichthys a fish with fingers? and the list of related article links at the bottom of that page.)

3. Carter: After referencing the ENCODE Project and saying it kills the idea of junk DNA, he asked, "With only 100,000 generations since the human-chimp split, would you please explain one of the following: a) the fixation of the 5 million indel mutations covering 90 million base pairs that separate our species; b) the fixation of the 35 million point mutations; or c) the fixation of the approximately 1,000 beneficial mutations that might have accumulated in our two separate lineages since we diverged?" He stated these were intractable problems and that there has not been enough time in evolutionary history to account for all these changes.
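The arithmetic behind this question can be sketched in a few lines. This is purely an illustration of the figures as quoted in the debate (not a verdict on either side's interpretation of them), assuming the differences accumulated along both lineages since the split:

```python
# Back-of-envelope arithmetic using only the figures quoted in the debate.
generations = 100_000          # quoted generations since the human-chimp split
point_mutations = 35_000_000   # quoted point-mutation differences
indels = 5_000_000             # quoted indel differences

lineages = 2  # differences accumulate along both the human and chimp branches

# Fixed differences each lineage must account for per generation
per_gen_points = point_mutations / (lineages * generations)
per_gen_indels = indels / (lineages * generations)

print(f"point mutations fixed per lineage per generation: {per_gen_points:.0f}")
print(f"indels fixed per lineage per generation: {per_gen_indels:.0f}")
# → 175 point mutations and 25 indels per lineage per generation
```

Whether fixation at such rates is plausible is precisely what the two debaters go on to dispute.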

Pierson struggled to give his answer. He started by claiming only 5–8% of the genome is under functional evolutionary constraint [This comes from a separate paper4 that compared the human, mouse, and rat genomes and is not part of the ENCODE results. It is also based on evolutionary assumptions of common descent and the idea that long stretches of DNA shared by diverse organisms must be functionally constrained.], and that the rest can accumulate mutations at will. He brought up the results of another study in which megabases were removed from the mouse genome, with no ill effects to the mice [This was a tremendous surprise to evolutionists,5,6 who expected strongly conserved sequences to be highly functional.]. He stated only 1.5–2% of the human genome is protein coding, and that most of the genome is composed of repetitive sequences [this avoided the question that had been asked]. In saying this he showed that he was not up to date with the science (e.g., the ENCODE Project outlined earlier by Carter).

In an attempt to rebut the 'fixation of mutations' question, he mentioned Haldane's Dilemma and claimed it was flawed. He also mentioned James [sic, Walter] ReMine's book, The Biotic Message, and claimed the author made invalid assumptions [see Carter's rebuttal below]. Thus, he claimed, Haldane's Dilemma is no longer a problem.

As far as the 35 million point mutations that have been fixed in the few generations since humans and chimps split, he said, "You can't answer everything," then noted that his lack of an answer looks bad. [Note: this is the crux of Haldane's Dilemma and he did not have the ability to address the details.]

To address the assumed fixation of 1,000 beneficial mutations, he said something about different selective pressures in different environments, then split the 1,000 figure to include only 500 per lineage. [The careful observer would have noted he was only asked to answer one of the three possibilities and that he did not answer any!]

4. Pierson: This next question rambled a little before he got to the point. He started off with the claim that CMI takes Genesis as a science book instead of allegory [this is not an accurate portrayal of our position, see What we believe and The Bible and hermeneutics]. He then said that Genesis makes only a few predictions, including that man is made from the dust of the earth, which would mean man is made of silicon dioxide, and that men should have one less rib than women, which he (correctly) claimed is also false, and that if we are created in God's image, and if you take this literally, then God looks like us, "with two eyes, and all this kind of stuff." For his question, he asked why the two predictions the Bible makes about humans are not correct, and, if Carter didn't think God looks like us, "anthropomorphic, with eyes, legs, and that kind of stuff," then isn't he "interpreting that part where it says we are created in God's image, instead of taking it literally?"

Carter dismissed part of this as not being germane to the debate. Whether or not man is made in the image of God is philosophy and does not pertain to the debate at hand. Also, the definition of the word "image" is a religious question (see Made in the image of God).

He disagreed that the Bible taught men should have one less rib than women. He said even primitive man knew that if you lost your finger, your future children would still be born with 10 fingers [thus, Pierson's assertion is actually a straw man argument, one of the classic logical fallacies]. He also pointed out that the rib is the only bone that will re-grow when removed (see Regenerating ribs).

He went on to give additional examples of predictions that can be derived from Genesis that Pierson failed to mention. He discussed the evolutionary tale of human migration out of Africa, as tracked by mitochondrial DNA, and how this depends on the assumptions underlying the neutral model of evolution. He then claimed the data better support the Tower of Babel story, with a single migration of humanity, out of the Middle East, traveling in small people groups, from a small original population, into uninhabited territory, in the recent past. He said the three main mitochondrial lineages found throughout the world might very well correlate to the three daughters-in-law of Noah, and that the geographic localization of Y chromosome haplotypes does not make much sense evolutionarily, but easily reflects the fact that there was only one Y chromosome on the Ark.

5. Carter: This question dealt with a recent paper that claimed the earliest fossil apes walked upright and that modern apes lost the ability. The claim that apes walked upright 21 million years ago disagrees with 150 years of evolutionary storytelling and "throws it in the trash." His question: "Please give the audience, based on this new information, a plausible story on the origin of humans."

Pierson did not see why this was a problem, for it did not counter the pseudogene argument, nor did it deal with embryonic similarities. He dismissed the question by stating that walking is a behavior and that the question does not deal with DNA similarities. He said we already knew that Australopithecus afarensis walked upright 3–4 million years ago and that science progresses, sometimes making mistakes, and that just because something is published does not make it a fact. He saw no reason why pushing back walking a few million years was a problem and then noted that Carter does not believe in the dates to which he was referring anyway. [We suppose he failed to realize that the reputed date of upright walking is before the human-chimp divergence and, if it turns out to be true, the whole evolutionary story of the origin of man is up in the air.]

6. Pierson: The final question was another rambling statement and returned to the pseudogene argument. Pierson claimed that pseudogenes can no longer perform their original function [note that this is an assumption], due to frame shift mutations, premature stop codons, and initiator sequence mutations. He claimed Carter believed all pseudogenes to be functional [he doesn't], and that, if so, we have 20,000 functional pseudogenes in our genome that arose by mutation. He claimed this destroys all arguments based on information and that it is easy to add information to a genome by just throwing mutations at it. He also claimed this destroys all arguments about specified complexity, all arguments about information, because information is "trivially easy", and also the "isolated great gap between functional DNA sequences".

To address the pseudogene idea, Carter stated the old adage that "absence of evidence is not evidence of absence" and that just because we do not know the function does not mean they do not have a function. He said pseudogenes possibly have many functions, including the ability to translate (sic, "be transcribed into") RNA that can then bind to the gene in the target area and repress it. He gave an example of a supposed pseudogene that humans, chimps and gorillas share. He then mentioned a common rule of thumb in biology that "form follows function". If pseudogenes have a form, he argued, and this form is consistent among lineages that supposedly diverged 10 million years ago, the sequence in question certainly has a function (see Potentially decisive evidence against pseudogene shared mistakes).

Regarding the results of the ENCODE project that Pierson attempted to dismiss, Carter said the purpose of ENCODE was to test how much of the genome was functional and that they found massive amounts of transcription all over the genome. To make matters even worse, they found many places with overlapping RNA codes that do completely different things. The conclusion was that the bulk of what was considered to be junk DNA is now known not just to be functional, but polyfunctional. He stated that evolution and natural selection cannot handle polyfunctionality and that the ENCODE Project answers the question of pseudogenes.

Pierson's closing statement

Pierson returned to his main argument that pseudogenes prove the evolutionary origin of man. He said that transcription does not equate to biological function and that you can't say something has a function until you find it [This violates the general rule of biology that form follows function, as mentioned earlier]. He said the ENCODE Project found that a majority of DNA is transcribed, not that it has a function [We feel that this begs the question], and repeated that only 5–8% of the genome is under functional constraint. He then backed up Carter's claim that the pseudogene shared between human, chimp, and gorilla discussed earlier may be an example of functional constraint, but said that it is only one example and the rest of the genome is free to accumulate mutations since it is mostly repetitious. He claimed this refutes the specified complexity argument and the information argument, because these repetitious sequences have no meaning. He restated his belief that it is easy to add information to a genome.

He went on to discuss the efficiency of information storage in the DNA molecule, claiming that human technology can store data more compactly and that computers can copy information much faster than the cell. Even though DNA is amazing, he claimed, "to put it up on a pedestal above what humans can do is not accurate." [Note: nobody was arguing this point, but humans have achieved this information storage technology using intelligence; evolutionists claim that the incredible information storage system of DNA came about without any intelligence.]

Returning once again to the pseudogene argument, he claimed shared mutations are a basic prediction of evolutionary theory [true], and that the opposing view that CMI presents is independent origins [not true, he is assuming no design and no function for pseudogenes]. He then went into a probability argument before bringing up a few additional examples, including an apparent shared mutation that prevents humans and great apes from synthesizing vitamin C (see Why the shared mutations in the Hominidae exon X GULO pseudogene are not evidence for common descent).

He took exception to Carter's claim that the 96–99% identity value between chimps and humans is invalid and cited several measures CMI has reported on their website [but much of what he said was the point CMI was trying to make in the article] (see Human/chimp DNA similarity, >98% Chimp/human DNA similarity? Not any more, and Decoding the dogma of DNA similarity).

He wrapped up his closing statement by attempting to restate several things Carter said in the debate and in the prior evening's talk, asking if evolution is racist and whether or not you are immoral if you believe it [see comments by Carter below]. He ended by saying that evolution simply describes the world; it says what is, not what should be.

Carter's closing statement

Carter began by tying up some loose ends. He returned to the ENCODE argument, saying that the fact that so much RNA transcription occurs is an indication of function, for evolution would predict some degree of efficiency and it would make much more sense if the useless transcription eventually got turned off. He agreed that the genome has much repetitive material, but that does not mean it is useless info. In fact, he said, repeats are probably structural.

He agreed that the human and chimp genomes are similar, but that there are many portions we cannot align and there are many genes we do not share. These genes cannot be explained by evolution because there is not enough time to evolve them. Getting back to duplication as an engine for evolution, he said duplicated genes should tend to be weeded out over time. They should be destroyed if they have no function. He also brought up the existence of evolutionary modeling programs (see From ape to man via genetic meltdown: a theory in crisis) and concluded that our genomes are doomed to extinction due to the accumulation of deleterious mutations that natural selection cannot weed out.

To address Pierson's claim that Carter said evolution is immoral, he countered by saying that there is no judge between right and wrong in evolutionary theory and so the proper term is amoral. He raised the example of Genghis Khan, an evolutionary success story by all measures since he is the ancestor of 1 out of every 200 people alive today,7 and pointed out that there is nothing to say that raping and pillaging is wrong, for there is no higher power in evolutionary theory to whom one can appeal (see Morality and Ethics Questions and Answers).

He claimed Haldane's Dilemma was not the consequence of errors in Haldane's work, which was actually used to generate the idea of junk DNA. He also said he had had a personal conversation with Walter ReMine, who explained that what his detractors have said about his work is wrong (see The Biotic Message: Evolution versus Message Theory).

He noted that Pierson did not bring up the creationist position often [and it seems he misstated it when he did] and how he (Carter) tried to stay on the evolutionists' turf to make a point. He pointed out how Pierson tried to stay away from origins and how Darwinian evolution fails mathematically, and then discussed the fact that evolution is really a smokescreen for the worldview battle that is raging behind it. In fact, Darwin and his friends rejoiced that they were attacking Christianity. He then gave his personal testimony about how creationist analysis helped him keep his faith during his undergraduate training.

Briefly, he addressed the big bang, claiming inflation theory is a magic wand and that it defies all known laws of physics. He wrapped this up with this quote: "If you have to resort to unknown mechanisms to explain the most important part of your theory, one wonders how solid your theory is in the first place."

Regarding human evolution [the topic of the debate], he stated his belief that we came from recent origins, and gave a list of evidences from genetics.

In his final few minutes, Carter encouraged the audience to dig deeper into the subject and asked them not to turn away from God if they heard him calling.

Related articles
Splicing and dicing the human genome: Scientists begin to unravel the splicing code
CMI scientific blunder?










Related resources


The Great Dothan Creation/Evolution Debate DVD


The proponents of evolutionary theory are no longer able to dismiss creationists with a wave of their academic hand. There is a growing list of scientists who believe in Genesis creation that refutes the often repeated and unsubstantiated claim that 'virtually every scientist in the world believes the theory to be true' and that a person who does not believe in evolution cannot be a 'legitimate scientist'. This public debate, between skeptic and vocal public anticreationist Mr Rick Pierson and Dr Robert Carter, a scientist with Creation Ministries International in Atlanta, Georgia, took place in front of a packed gallery in Dothan, Alabama. (Junior High–Adult) 103 min.


References
1. Type pseudogenes into the search engine window on this site (creation.com).
2. Pennisi, E. 1997. Haeckel's embryos: fraud rediscovered. Science 277(5331):1435.
3. Richardson et al. 1997. There is no highly conserved embryonic stage in the vertebrates: implications for current theories of evolution and development. Anat. Embryol. 196:91–106.
4. Cooper, G.M. 2004. Characterization of evolutionary rates and constraints in three mammalian genomes. Genome Res. 14:539–548.
5. Ahituv, N., et al. 2007. Deletion of ultraconserved elements yields viable mice. PLoS Biology 5(9):1906–1911.
6. Gross, L. 2007. Are "ultraconserved" genetic elements really indispensable? PLoS Biology 5(9):1839.
7. Zerjal, T., et al. 2003. The genetic legacy of the Mongols. Am. J. Hum. Genet. 72:717–721. See also "Genghis Khan a Prolific Lover, DNA Data Implies".


The Great Dothan Debate

Oh. Is this more creationist copy pasta? Thought you didn't do that. Remember: summarize and provide the link. That's all you gotta do.
 
The traditional understanding of DNA has recently been transformed beyond recognition. DNA does not, as we thought, carry a linear, one-dimensional, one-way, sequential code—like the lines of letters and words on this page. And the 97% in humans that does not carry protein-coding genes is not, as many people thought, fossilized 'junk' left over from our evolutionary ancestors. DNA information is overlapping, multi-layered and multi-dimensional; it reads both backwards and forwards; and the 'junk' is far more functional than the protein code, so there is no fossilized history of evolution. No human engineer has ever even imagined, let alone designed, an information storage device anything like it. Moreover, the vast majority of its content is meta-information—information about how to use information. Meta-information cannot arise by chance because it only makes sense in context of the information it relates to. Finally, 95% of its functional information shows no sign of having been naturally selected; on the contrary, it is rapidly degenerating! That means Darwin was wrong—natural selection of natural variation does not explain the variety of life on Earth.


I would like to address that part specifically. For one, we know exactly how DNA stores information: sequences of bases code for an amino acid chain. Your article correctly identifies start and stop codons, congrats. But your article is extremely misleading. It points out facts, and it's correct a lot of the time, but it's misleading you the entire time. You claim that 98% of the human genome is "junk DNA", and you're wrong. The main argument is that scientists refer to this as junk DNA, but it's not. The correct term is noncoding DNA, because these sequences aren't junk. They control the expression of genes, include ribosomal RNA, and are sometimes several duplicates of the same gene. Meta-information, you would call it. Your article correctly states that too. The fact that 98% of the genome doesn't directly code for protein has nothing to do with intelligent design. But the really interesting part is the last part. How exactly is human DNA, or any organism's DNA, "deteriorating"? Furthermore, explain what exactly you mean by deteriorating. Will humankind degenerate into a mutated mess? There is no such thing as DNA "deterioration". An entire population does not simply genetically deteriorate.
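The codon mechanism this post describes (bases read three at a time, with dedicated start and stop codons) can be sketched with a toy translator. The dictionary below is only a small, illustrative slice of the real 64-codon table:

```python
# Toy illustration: DNA is read in three-letter codons, each mapping to an
# amino acid, with ATG as the start codon and TAA/TAG/TGA as stop codons.
# Only a handful of the 64 codons are included here.
CODONS = {
    "ATG": "Met",  # start codon (also codes for methionine)
    "GCT": "Ala", "GAA": "Glu", "TGG": "Trp",
    "TAA": None, "TAG": None, "TGA": None,  # stop codons
}

def translate(dna: str) -> list[str]:
    """Translate a reading frame, stopping at the first stop codon."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        aa = CODONS.get(dna[i:i + 3], "???")  # "???" marks codons outside our subset
        if aa is None:  # stop codon reached
            break
        protein.append(aa)
    return protein

print(translate("ATGGCTGAATGGTAA"))  # → ['Met', 'Ala', 'Glu', 'Trp']
```

This only models the protein-coding 1.5–2% of the genome; the regulatory roles of noncoding DNA mentioned above are a separate layer on top of this code.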

If a gene pool gets too small, it most certainly can happen. I don't think you realize how damaging mutations can be.

I don't think you realize what you're talking about. I think I understand pretty well how damaging mutations can be. Mutations don't make gene pools smaller. Doesn't happen.

The mutation rate we see at this time matches the Bible's claim that man has been on the earth between 6,000 and 10,000 years.

If mutations continued at the current rate observed for each generation, it would have taken man 6 billion years to diverge. Evolutionists can't seem to pin down when it happened, but they say they can trace us back to a certain period through our DNA. Would they not be able to pinpoint when we diverged from ape-like creatures? They say we diverged somewhere between 5 and 20 million years ago. It's not possible, as has been shown.

If we mutated at the rate the evolutionists need to show man diverged from our nearest ancestor 5 to 20 million years ago, we would go extinct in just 100 thousand years, given the effects mutations have on any organism. That is just too many mutations for any organism to survive.

It was a stretch of the imagination, and even more of a stretch when we do the math. The more we learn about the information contained in our chromosomes, the harder it is for neo-Darwinism to stand up to scrutiny.

I have so many things to say about this it's incredible. For one, where's your source? What do you define as the mutation rate? Are you including phenomena like gene flow and genetic drift?

Or are you doing a very simple calculation assuming one organism at one time receiving one mutation at a time? Mutations are passed along generations by the thousands. I would love to know how you think mutations don't add enough information for humans to form. Where did you get your 6 billion year figure from? Creationist websites?

How's this:
[attached image: drugresistan.jpg]

That's a single cancer cell with about 20 extra chromosomes and billions of extra bases. Tell me again how mutations can't add base pairs to DNA.
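For reference, the back-of-envelope calculation mainstream population genetics uses here is the neutral-theory one, in which the substitution rate per lineage roughly equals the per-generation mutation rate. A sketch, with assumed round-number parameters (about 70 de novo mutations per genome per generation is a commonly cited human estimate, and 20 years per generation is a convention; neither figure comes from this thread):

```python
# Neutral-theory sketch of the divergence time implied by the mutation counts
# discussed above. Parameter values are illustrative assumptions.
diffs = 35_000_000   # point-mutation differences between the two genomes
mu = 70              # new mutations per genome per generation (assumed)
gen_time = 20        # years per generation (assumed)

# Under neutrality, the substitution rate per lineage equals the mutation
# rate, and differences accumulate along BOTH lineages since the split.
per_lineage = diffs / 2
generations = per_lineage / mu
years = generations * gen_time
print(f"{generations:.0f} generations ≈ {years / 1e6:.0f} million years")
# → 250000 generations ≈ 5 million years
```

Varying the assumed parameters moves this estimate by a factor of a few, not by a factor of a thousand, which is why neither "6 billion years" nor extinction in "100 thousand years" falls out of the standard arithmetic.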

I will produce the information that has already been produced. But in the meantime, tell me what you think causes mutations?

Then explain how you have purebred animals? How come a person knows, when they breed two of the same breed, what they will get for offspring? If the gene pool did not get smaller, you would only have mutts. Now that we know all the information contained in the chromosomes is being used, where do you say this new information comes from?

Look, we can get new information, but it's from the loss or rearranging of existing information. If we get enough beneficial information you can get change. And we believe there are beneficial mutations, but not nearly enough. Since beneficial mutations are so rare, they can't produce what evolutionists claim.

Besides, this part of the conversation is pointless, because we don't agree on this issue, so I will move back and show you again why it's not possible.
 

Dr. Spetner - Major Points

1. Adaptive mutations are stimulated by the environment. This is a contradiction of the basic tenets of evolution.
2. Animal embryos develop according to a dual process involving the influence of their genetic program and their environment.
3. Cumulative selection has never been demonstrated, whereby a series of positive mutations, each of which must survive, will lead to a new organism
4. Low mutation rates are a problem for evolution due to a proofreading process that corrects most of the errors in transcription.
5. Information must be added incrementally. New information cannot be accounted for. Losses of information can be accounted for during mutations.


There is too much information to break it down into a summary to understand the mutation rates, so get your reading glasses on.
 
You need to visit the website to see the graphs.

Attention cbirch.

Mutations: evolution's engine becomes evolution's end!

by Alex Williams


In neo-Darwinian theory, mutations are uniquely biological events that provide the engine of natural variation for all the diversity of life. However, recent discoveries show that mutation is the purely physical result of the universal mechanical damage that interferes with all molecular machinery. Life's error correction, avoidance and repair mechanisms themselves suffer the same damage and decay. The consequence is that all multicellular life on earth is undergoing inexorable genome decay. Mutation rates are so high that they are clearly evident within a single human lifetime, and all individuals suffer, so natural selection is powerless to weed them out. The effects are mostly so small that natural selection cannot 'see' them anyway, even if it could remove their carriers. Our reproductive cells are not immune, as previously thought, but are just as prone to damage as our body cells. Irrespective of whether creationists or evolutionists do the calculations, somewhere between a few thousand and a few million mutations are enough to drive a human lineage to extinction, and this is likely to occur over a time scale of only tens to hundreds of thousands of years. This is far short of the supposed evolutionary time scales.

--------------------------------------------------------------------------------

Mutations destroy





Ever since Hugo de Vries discovered mutations in the 1890s they have been given a central role in evolutionary theory. De Vries was so enamoured with mutations that he developed an anti-Darwinian saltationist theory of evolution via mutation alone.1 But as more became known, mutations of large effect were found to be universally lethal, so only mutations of small effect could be credibly considered as of value to evolution, and de Vries' saltationist theory waned. When the Neo-Darwinian Synthesis emerged in the 1930s and 1940s, mutations were said to provide the natural variations that natural selection worked on to produce all new forms of life.

However, directly contradicting mutation's central role in life's diversity, we have seen growing experimental evidence that mutations destroy life. In medical circles, mutations are universally regarded as deleterious. They are a fundamental cause of ageing,2,3 cancer4,5 and infectious diseases.6

Even among evolutionary apologists who search for examples of mutations that are beneficial, the best they can do is to cite damaging mutations that have beneficial side effects (e.g. sickle-cell trait,7 a 32-base-pair deletion in a human chromosome that confers HIV resistance to homozygotes and delays AIDS onset in heterozygotes,8 CCR5-delta32 mutation,9 animal melanism,10 and stickleback pelvic spine suppression11). Such results are not at all surprising in the light of the discovery that DNA undergoes up to a million damage and repair events per cell per day.12

Mutation physics

Neo-Darwinian theory represents mutations as uniquely biological events that constitute the 'engine' of biological variation. However, now that we can see life working in molecular detail, it becomes obvious that mutations are not uniquely biological events; they are purely physical events.








Life works via the constant (often lightning-fast) movement of molecular machinery in cells. Cells are totally filled with solids and liquids; there are no free spaces. The molecular machines and the cell architecture and internal structures are made up of long-chain organic polymers (e.g. proteins, DNA, RNA, carbohydrates, lipids) while the liquid is mostly water. All forms of movement are subject to the laws of motion, yet the consequences of this simple physical fact have been almost universally ignored in biology.

Newton's first law of motion says that a physical body will remain at rest, or continue to move at a constant velocity, unless an external force acts upon it. Think of a message molecule that is sent from one part of a cell to another. Since the cell is full of other molecules, with no empty spaces, the message molecule will soon hit other molecules and either slow down or stop altogether. This is the universal problem known as friction.

Friction events can result from many causes, but can be crudely divided into two types: one is referred to as ploughing and the other is shearing. Ploughing involves the physical displacement of materials to facilitate the motion of an object, while shearing arises from the disruption of adhesive interactions between adjacent surfaces.13

Molecular machines in cells owe a great deal of their structure to hydrogen bonds, but these are rather weak and fairly easily broken. For example, most proteins are long, strongly-bonded chains of amino acids, but these long chains are coiled up into 3-dimensional machine components, and the 3-dimensional structures are held together by hydrogen bonds.14 When such structures suffer mechanical impacts, the transfer of momentum can distort or break the hydrogen bonds and critically damage the molecule's function.

The inside of a cell has a density and viscosity somewhat similar to yogurt (figure 1). The stewed fruit (dark colour) added to the yogurt during manufacture can be seen swirling out into the white yogurt. The fruit has not continued to disperse throughout the yogurt. It was completely stopped by the initial friction. This is like what happens in a cell: any movement is quickly dampened by friction forces of all kinds coming from all directions.




Figure 1. A transparent carton of fruit yogurt illustrates how friction in the viscous fluid stopped the motion initiated by mixing the fruit (dark colour) with the yogurt (white colour).

How do cells cope with this friction? In at least five different ways. First, there are motor proteins available all over the cell that attach to mobile molecules and carry them along the filaments and tubules that make up the cytoskeleton of the cell. Second, these motor proteins are continually re-energized after friction collisions by energy inputs packaged in the form of ATP molecules. Third, there are 'address labels' attached to mobile molecules to ensure they are delivered to the correct destination (friction effects continually divert mobile molecules from their course). Fourth, thin films of water cover all the molecular components of cells and provide both a protective layer and a lubricant that reduces the frequency and severity of friction collisions. Fifth, there is a wide range of maintenance and repair mechanisms available to repair the damage that friction causes.

The friction problem, and the damage that results from it, is orders of magnitude greater in cells than it is in larger mechanical systems. Biomolecules are very spiky objects with extremely rough and highly adhesive surfaces. They cannot be manufactured and honed to the smoothness that we achieve in our vehicle engine components such as pistons and flywheel pivots, nor can ball-bearings be inserted to reduce the surface contact area, such as we do in wheel axles. As a biological example, consider the rotary motor that drives the bacterial flagellum. The major wear surfaces are on the rotor (attached to the flagellum) and the stator (the housing for the rotor, attached to the cell wall). The stator consists of 22 molecules, set in 11 pairs. The wear rate is so great that the average residence time for a stator molecule in the stator is only about 30 seconds.15 The cell's maintenance system keeps a pool of about 200 stator molecules in reserve to cope with this huge turnover rate.

Finding suitable lubricants to overcome friction is a major focus in the nanotechnology industry. A special technique called 'friction force microscopy' has been developed to quantitatively evaluate potential lubricants.16

This shows that the laws of physics, operating among the viscous components of the cell, both predict and explain the high rate of molecular damage that we observe in DNA. Between 50% and 80% of the DNA in a cell is continually consulted for the information necessary for everyday metabolism. This consultation requires numerous steps that each involve physical deformation of the DNA: moving around within the nucleus, winding and unwinding of the chromatin structures, unzipping the double-helix, binding and unbinding of the transcription machinery, re-zipping the double-helix, rewinding the chromatin structures and shuffling around within the nucleus. Each step of motion is powered by ATP discharges and inevitably causes mechanical damage among the components. While most of this damage is repaired, the repair mechanisms are not 100% perfect because they suffer mechanical damage themselves.17

Mutations rapidly destroy

Within neo-Darwinian theory, natural selection is supposed to be the guardian of our genomes because it weeds out unwanted deleterious mutations and favours beneficial ones. Not so, according to genetics expert Professor John Sanford.18 Natural selection can only weed out mutations that have a significant negative effect upon fitness (number of offspring produced). But such 'fitness' is affected by a huge variety of factors, and the vast majority of mutations have too small an effect for natural selection to be able to detect and remove them.

Furthermore, if the average mutation rate per person per generation is around 1 or more, then everyone is a mutant and no amount of selection can stop degeneration of the whole population. As it turns out, the mutation rate in the human population is very much greater than 1. Sanford estimates at least 100, probably about 300, and possibly more.

All multicellular life suffers

Two recent reviews of the mutation literature not only confirm Sanford's claims, but extend them to all multi-cellular life.

In a review of the distribution of fitness effects (DFE) of mutations,19 the authors are unable to give any examples of beneficial mutations for humans. In their calculations regarding the rate of deleterious mutations (MD) and neutral mutations (MN), they use the equalities MD = 1 - MN and MN = 1 - MD, which both imply that the rate of beneficial mutations is zero. They do give a few non-zero values for beneficial mutation rates in some experimental organisms, but qualify these results by noting the interference of other variables.

In a review of mutation rate variations in eukaryotes,20 the authors admit that all multicellular organisms are undergoing inexorable genome decay from mutations because natural selection cannot remove the damage.21 Their Box 2 and Table 1 list deleterious mutation rates for a wide range of multicellular organisms, noting they are all underestimates, with the possible exception of those for the fruit fly Drosophila melanogaster with a value of 1.2. The value given for humans is '~3'.

Thus, all multicellular life on earth is undergoing inexorable genome decay because the deleterious mutation rates are so high, the effects of most individual mutations are so small, there are no compensatory beneficial mutations, and natural selection is ineffective in removing the damage.

The wheels have come off the neo-Darwinian juggernaut!

How long to extinction?

How long could multicellular life survive in the face of universal genetic degradation? This is a very important question, and I will attempt to answer it by using several different lines of evidence.

Human ageing and cancer

We have recently discovered that there is a common biology in cancer and ageing: both are the result of accumulating molecular damage in cells.22 This confirms the arguments outlined above, that for purely physical reasons molecular machinery suffers extremely high damage rates, clearly evident within the lifespan of a single human. Every cell has a built-in time clock to limit this damage and minimize the chance of it becoming cancerous. At every cell division, each telomere (the caps on both ends of a chromosome that stop the double-helix from unravelling) is shortened by a small amount, until they reach the Hayflick Limit, discovered in 1965 to be a little over 50 cell divisions. The cells then stop dividing and they are dismantled and their parts are recycled.

By adding the enzyme telomerase, the telomere shortening problem can be circumvented, but that then exposes the cell to a greater risk of becoming cancerous because of accumulating damage elsewhere in the cell. The overall balance between protection from damage and the need for longevity determines fitness (reproductive success) and life span.23 The body's normal reaction to increasing genome damage is to kill off the damaged cells via programmed senescence (of which the telomere clock with its Hayflick limit is but one part). But cells become malignant (cancerous) when mutation disables the senescence mechanism itself, which then enables the damaged cells to proliferate without limit.22 The Hayflick limit of around 50 cell divisions for humans seems to provide the optimum balance.

Fifty human generations of 20 years each gives us only 1,000 years as a timescale over which a human lineage would begin to experience a significant mutation load in its genome. This is alarmingly rapid compared with the supposed evolutionary time scale of millions and billions of years.

Reproductive cells




Figure 2. Schematic representation of human life expectancy (solid line), male fertility (dotted line), and risk of fetal abnormality with mother's age (dashed line). Despite the protective Hayflick limit on cell divisions and life expectancy, very significant molecular damage accumulates in humans even during the most productive years of life. Mutations do even more damage than the Hayflick limit and associated cancer rates suggest.

Ever since August Weismann published The Germ-Plasm: A Theory of Heredity24 in 1893, a discrete separation has been shown to exist between body cells (the soma) and germ-line cells (germplasm). Germ-line cells were thought to be more protected from mutation than other body cells. However, another recently discovered cause of ageing is that our stem cells grow old as a result of heritable DNA damage and degeneration of their supporting niches (the special 'nest' areas in most organs and tissues of the body where stem cells grow and are nurtured and protected). The telomere shortening mechanism, intended to reduce cancer incidence, appears to also induce the unwanted side-effect of a decline in the replicative capacity of certain stem-cell types with advancing age. This decreased regenerative capacity has led to a 'stem-cell hypothesis' for human age-associated degenerative conditions.25

Human fertility problems suggest that the decline in niche protection of stem cells also applies to our gametes (eggs and sperm). For males, fertility (as measured by sperm count, sperm vigor and chance of conception) begins to decline significantly by age 40, and the rate of certain paternal-associated birth defects increases rapidly during the 30s (figure 2).26 For females, the chance of birth defects increases rapidly from around the mid-30s, particularly because of chromosome abnormalities (figure 2). In the middle of the most productive part of our lives, our bodies are therefore showing clear evidence of decline through accumulation of molecular damage in our genomes.

Do germ-line cells really suffer less damage?

When DNA was discovered to be the carrier of inheritance, Weismann's germ-plasm theory gave rise to the 'immortal strand hypothesis'. When the DNA of an embryonic stem cell replicates itself, it was thought that the 'old' strand would remain with the self-renewing 'mother' stem cell, while the newly constructed daughter strand proceeds down the path of differentiation into a body cell. In this way, the 'old' strand would remain error free, because it has not suffered any copying errors, and thus becomes effectively immortal.

However, a research team at the Howard Hughes Medical Institute recently tested this theory using the stem cells that produce blood, and found that they segregate their chromosomes randomly.27 That is, the 'immortal strand hypothesis' is wrong. If stem cells are not given this kind of preferential treatment then it is reasonable to conclude that germ-line cells are also subject to the same molecular damage as somatic cells. This is confirmed by the observation that human fertility exhibits damage long before age-related diseases take over.

A single human lifetime is enough to show very significant mutation damage, even in our reproductive cells.

Haldane's dilemma

The severe contradictions that these findings pose for neo-Darwinian theory corroborate what has become known as Haldane's dilemma. J.B.S. Haldane was one of the architects of neo-Darwinism who pioneered its application to population biology. He realized that it would take a long time for natural selection to fix an advantageous mutation in a population; fixation is when every member has two copies of an allele, having inherited it from both mother and father. He estimated that for vertebrates, about 300 generations would be required, on average, where the selective advantage is 10%. In humans, with a 20-year generation time and about 6 million years since our last common ancestor with the chimpanzee, only about 1,000 such advantageous mutations could have been fixed. Haldane believed that substitution of about 1,000 alleles would be enough to create a new species, but it is not nearly enough to explain the observed differences between us and our closest supposed relatives.

The measured difference between the human and chimpanzee genomes amounts to about 125 million nucleotides, which are thought to have arisen from about 40 million mutation events.28 If only 1000 of these mutations could have been naturally selected to produce the new (human) species, it means the other 39,999,000 mutations were deleterious, which is completely consistent with the reviews showing that the vast majority of mutations are deleterious. Consequently, we must have degenerated from the apes, which is an absurd conclusion.

According to Kirschner and Gerhart's facilitated variation theory,29 life consists of two main components: conserved core processes (the structure and machinery in cells) and modular regulatory processes (the signalling circuits and switches that operate the machinery and provide a built-in source of natural variation). The 40 million 'mutation' differences between humans and chimps are therefore much more reasonably explained as 40 million modular differences between the design of chimps and the design of humans.

Quantitative estimates of time to extinction

There are a number of different ways to estimate the time it would take for relentlessly accumulating mutations to send our species to extinction.

Binomial estimates

Some very rough estimates can be derived from the binomial distribution, which can predict the likelihood of multiple mutations accumulating in an essential genetic functional module. A binomial model of a mutating genome could consist of the cell's DNA being divided into N functional modules, of which Ne are essential; that is, the lineage fails to reproduce if any of the essential modules are disabled. For any given mutational event, p = 1/N is the probability of a given module being 'hit', q = 1 - p is the probability of it being 'missed', and p + q = 1.

What is the likely value of N? We can derive two estimates from the knowledge that there are about 25,000 genes, plus the discovery from the pilot study report of the ENCODE project that virtually the whole human genome is functional.30

For the first estimate, the average protein contains a few hundred amino acids and each amino acid requires three nucleotides of code, so the average gene would take up about 1,000 nucleotides of exon space (an exon is the protein-coding part of a gene). There are about 3 billion nucleotides in the whole human genome, so if we assume that the average protein represents an average functional unit then N = 3 million.

The second estimate comes from the ENCODE report that gene regions produce on average 5 RNA transcripts per nucleotide, and the untranslated regions produce on average 7 RNA transcripts per nucleotide. There are about 33 times as many nucleotides in the untranslated regions as in the genic regions. Assuming that transcript size is approximately equal in each region, then there are 25,000 x 5 = 125,000 gene transcripts and 25,000 x 33 x 7 = 5,775,000 untranslated transcripts, making N = 5,900,000 in total. Our two estimates of N are therefore 3 to 6 million in round figures.
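The arithmetic behind these two estimates can be checked in a few lines. This is a sketch using only the round figures quoted above, not any additional genomic data:

```python
# Check of the two estimates of N (number of functional modules), using
# only the round figures quoted in the text.

GENOME_NT = 3_000_000_000      # ~3 billion nucleotides in the human genome
AVG_MODULE_NT = 1_000          # ~1,000 nucleotides of exon space per average gene

# First estimate: average protein-sized functional units tiling the genome.
N_first = GENOME_NT // AVG_MODULE_NT
print(N_first)                 # 3000000

# Second estimate: transcript counts from the ENCODE figures quoted above.
GENES = 25_000
gene_transcripts = GENES * 5          # 5 transcripts per nucleotide in genic regions
untranslated = GENES * 33 * 7         # 33x more nucleotides, 7 transcripts each
N_second = gene_transcripts + untranslated
print(N_second)                # 5900000
```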

What is the likely value of Ne? Experiments with mice indicate that 85% of genes can be knocked out one at a time without lethal effects.31 This is due to the robustness and failure-tolerance through fallback processes built into the genomic designs. That means disabling any one of the remaining 15% of genes is fatal. Multiple mutations occur however, so the likely value of Ne when exposed to multiple mutations will be much higher than 15%. The maximum possible value is 100%. In a study of 2,823 human metabolic pathways, 96% produced disease conditions when disrupted by mutation,32 so if we take an average between this value and the minimum 15% then we get about 60% of functional units being essential.

How many random mutations are required on average to disable an essential functional module? In rare cases, a single mutation is enough to disable a person's ability to reproduce. A two-hit model is common in cancer. In a study of cell signalling networks, these two hits usually knocked out: (i) the programmed death system for dealing with damaged (cancerous) cells, and (ii) the normal controls on cell proliferation, so the damaged cancer cells can proliferate without limit. The proportion of cancer-associated genes was also found to increase with the number of linkages between genes. When a healthy gene is linked to more than 6 mutated genes, ~80% of all genes in the network are cancerous. Extrapolating from this, we find that by the time a normal gene is linked to about 10 mutated genes, then the whole network has become cancerous.33

Almost 70% of known human genes can be causal agents of cancer when mutated.34 Cancers can result from as little as a single mutation in a stem cell, or multiple mutations in somatic cells.35 The minimum possible value of 1 is known to be rare, so the more common occurrence of the 2-hit model makes it a reasonable best-estimate minimum. But it may require 10 modules to receive two hits each for the whole network to become dysfunctional.

The maximum number of hits required to disable a single module may be 100 or more, but if the average functional module only contains 1,000 nucleotides then this figure, at 10% of the whole, seems rather large. An order-of-magnitude average is perhaps more likely to be 10 random mutations per functional module.

To provide some context for these estimates, recent work shows that the cell-cycle checkpoint damage repair system is activated when 10 to 20 double-strand breaks accumulate in a cell undergoing division.36 That is, life will tolerate only 10 to 20 DNA breaks per cell before it starts repair work, whereas we are examining scenarios in which there are thousands and millions of damage events per cell. Our numbers are clearly up in a region where the cell&#8217;s repair mechanisms are working at their hardest.

What then is the likelihood of accumulating either 2 hits in 10 modules, or 10 hits in one module, in any one of either 15% or 60% of the 3 to 6 million functional modules? The binomial distribution in Microsoft Excel was used to make the following calculations, making the further assumption that the likelihood of the unit being a critical one must exceed 50% for extinction to be more likely than not in the next generation.

Assuming 60% essentiality, only one functional module needs to be disabled for the probability of its essential status to exceed 50%. For the 2-hit model, about 6,000 to 12,000 mutations are required to disable ten of the 3 to 6 million functional modules. For the 10-hit model, 3 to 6 million mutations are required to disable one functional module.

Assuming 15% essentiality, four modules need to be disabled before the probability of at least one of them being essential exceeds 50%. For the 2-hit model, 250,000 to 500,000 mutations are required to disable ten modules with four mutations each among the 3 to 6 million functional modules. For the 10-hit model, 3.7 to 7.5 million mutations are required to disable four functional modules.

If every individual produces 100 new mutations every generation (assuming a generation time of 20 years) and these mutations are spread among 3 to 6 million functional modules across the whole genome, then the average time to extinction is:
1,200 to 2,400 years for the 2-hits in 10 modules model and 60% essentiality
50,000 to 100,000 years for the 2-hits in 10 modules model and 15% essentiality
600,000 to 1,200,000 years for the 10-hit model and 60% essentiality
740,000 to 1,500,000 years for the 10-hit model and 15% essentiality.
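These four figures follow directly from the assumed rate of 100 new mutations per 20-year generation. A minimal sketch of the conversion (the mutation counts are the ones derived above):

```python
# Converting a required mutation count into an average time to extinction,
# under the assumption of 100 new mutations per person per 20-year
# generation (i.e. 5 mutations per year of lineage time).

MUTATIONS_PER_GENERATION = 100
YEARS_PER_GENERATION = 20

def years_to_accumulate(mutations):
    """Years for a lineage to accumulate the given number of mutations."""
    generations = mutations / MUTATIONS_PER_GENERATION
    return generations * YEARS_PER_GENERATION

print(years_to_accumulate(6_000))      # 1200.0  (2-hit model, 60% essentiality, N = 3 million)
print(years_to_accumulate(12_000))     # 2400.0  (2-hit model, 60% essentiality, N = 6 million)
print(years_to_accumulate(3_000_000))  # 600000.0 (10-hit model, 60% essentiality)
```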

Truncation selection

Evolutionary geneticist Dr James Crow argued that humans are probably protected by 'truncation selection'.26 Truncation occurs when natural selection preferentially deletes individuals with the highest mutation loads. Plant geneticist John Sanford put Crow's claims to the test by developing a computer simulation of truncation. His assumptions were: 100 individuals in the population, 100 mutations per person per generation, 4 offspring per female, 25% non-genetic random deaths per generation, and 50% selection against the most mutant offspring per generation. He assumed an average fitness loss per mutation of 1 in 10,000. His species became extinct in only 300 generations. With a generation time of 20 years this corresponds to 6,000 years.37
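Sanford's timescale can be sanity-checked without his simulation. This is only a back-of-envelope sketch under his stated per-mutation fitness loss, ignoring selection and reproduction entirely:

```python
# Back-of-envelope check on the truncation-selection timescale, not a
# reimplementation of Sanford's simulation: multiplicative fitness after
# g generations, with 100 new mutations per generation and an average
# fitness loss of 1 in 10,000 per mutation.

LOSS_PER_MUTATION = 1e-4
MUTATIONS_PER_GENERATION = 100

def mean_fitness(generations):
    """Multiplicative fitness remaining after the given number of generations."""
    return (1 - LOSS_PER_MUTATION) ** (MUTATIONS_PER_GENERATION * generations)

print(round(mean_fitness(300), 2))  # 0.05 -- fitness has largely collapsed by generation 300
```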

Sanford's assumptions are somewhat unrealistic, but there are other ways to approach the problem. Mutations are pure chance events that follow a Poisson distribution, and this behaves like the normal curve when the average expected value is greater than about 30.38 In a Poisson distribution, the variance is equal to the average expected value, and the standard deviation is the square root of the variance. When the expected average value is 100, the standard deviation will be 10. The normal curve now tells us the following:
Half the people will suffer about 100 mutations or more, and half the people will suffer about 100 mutations or less.
About 84% of people will suffer 110 mutations or less, and so the remaining 16% of people will suffer 110 or more mutations. Alternatively, about 16% of people will suffer 90 or less.
About 97.7% of the population will experience 120 mutations or less, and the remaining 2.3% will suffer 120 mutations or more. Alternatively, 2.3% will suffer 80 or less.
About 99.9% of the population will suffer 130 mutations or less, and the remaining 0.1% will suffer 130 or more mutations. Alternatively, 0.1% will suffer 70 or less.
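These percentages come from the standard normal cumulative distribution function; a short sketch reproduces them:

```python
import math

# The normal-curve fractions quoted above, for a Poisson distribution with
# mean 100 (standard deviation sqrt(100) = 10), using the normal approximation.

MEAN, SIGMA = 100, 10

def fraction_at_or_below(x):
    """Normal CDF: approximate fraction of people with x mutations or fewer."""
    return 0.5 * (1 + math.erf((x - MEAN) / (SIGMA * math.sqrt(2))))

for x in (100, 110, 120, 130):
    print(x, round(fraction_at_or_below(x), 3))
# 100 0.5
# 110 0.841
# 120 0.977
# 130 0.999
```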

If we remove the most mutant (those above 130 mutations per person per generation) then we will only remove 0.1% of the population and it will make virtually no difference. If we removed the most mutant 50% of the population that would not solve the problem either, for two reasons. First, the great majority of the remaining people still suffer between 70 and 100 mutations per person per generation, far above the value of 1 that ensures inexorable decline. Second, removing half the population each generation would send it extinct in a few dozen generations.




Table 1. Estimated number of generations and years to extinction for populations of various sizes, when fitness declines by 1.5% in each generation.

Synergistic epistasis and population size

None of the above models include the effect of synergistic epistasis (if one gene is mutated, its impact is ameliorated by the coordinated activity of other genes) or of population size. We can include these by using Crow's estimate that the fitness of the human race is currently degenerating at a rate of about 1 to 2% per generation. If we use an average value of 1.5% then only 98.5% of the next generation will produce reproductively viable offspring. The next generation after that will only have 98.5% of those survivors able to produce reproductively viable offspring, and so on.

For any given stable population size N, the size of the next generation that can produce reproductively viable offspring will be 98.5% of N, and for any given number of generations G, the number of survivors able to produce reproductively viable offspring will be (0.985)^G of N.

Table 1 shows the approximate numbers of generations after which the population degenerates to extinction (only one individual is left, so breeding cannot continue). No population can sustain a continual loss of viability of 1.5%.
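Figures of the kind shown in Table 1 can be derived from this geometric decline. A sketch under the assumption (stated in the text) that extinction means the viable population falls to a single individual:

```python
import math

# Generations to extinction under a constant 1.5% loss of reproductive
# viability per generation: solve N * 0.985**G = 1 for G, i.e.
# G = ln(1/N) / ln(0.985). Population sizes below are illustrative.

SURVIVAL = 0.985
YEARS_PER_GENERATION = 20

def generations_to_extinction(n):
    """Generations until a population of size n declines to one individual."""
    return math.log(1 / n) / math.log(SURVIVAL)

for n in (1_000, 1_000_000, 7_000_000_000):
    g = generations_to_extinction(n)
    print(n, round(g), round(g * YEARS_PER_GENERATION))
```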








The above model assumes that right from the beginning there will be 1.5% loss of fitness each generation. However, the binomial simulations earlier showed that individuals can tolerate somewhere between a few thousand to a few million mutations before the damage critically interferes with their ability to reproduce. This means that synergistic epistasis is a real phenomenon: life is robust in the face of mutational assault. Instead of the immediate loss of 1.5% every generation, the general population would remain apparently healthy for a much longer time before the damage became apparent.

However, the rate at which mutations accumulate will remain the same because the cause remains the same: mechanical damage. This means that most people will be apparently healthy, but then approach the threshold of dysfunction over a much shorter period, creating a population crash rather than a slow decline.

Either way, however, the time scales will be approximately the same because the rate of damage accumulation remains approximately the same.

Summary

Mutations are not uniquely biological events that provide an engine of natural variation for natural selection to work upon and produce all the variety of life. Mutation is the purely physical result of the all-pervading mechanical damage that accompanies all molecular machinery. As a consequence, all multicellular life on earth is undergoing inexorable genome decay because the deleterious mutation rates are so high, the effects of the individual mutations are so small, there are no compensatory beneficial mutations and natural selection is ineffective in removing the damage.

So much damage occurs that it is clearly evident within a single human lifetime. Our reproductive cells are not immune, as previously thought, but are just as prone to mechanical damage as our body cells. Somewhere between a few thousand and a few million mutations are enough to drive a human lineage to extinction, and this is likely to occur over a time scale of only tens to hundreds of thousands of years. This is far short of the supposed evolutionary time scales. Like rust eating away the steel in a bridge, mutations are eating away our genomes and there is nothing we can do to stop them.

Evolution's engine, when properly understood, becomes evolution's end.


Cbirch.

Mutations are certainly real. They have profound effects on our lives. And, according to the neo-Darwinian evolutionists, mutations are the raw material for evolution.

But is that possible? Can mutations produce real evolutionary changes? Don't make any mistakes here. Mutations are real; they're something we observe; they do make changes in traits. But the question remains: do they produce evolutionary changes? Do they really produce new traits? Do they really help to explain that postulated change from molecules to man, or fish to philosopher?

The answer seems to be: "Mutations, yes. Evolution, no." In the last analysis, mutations really don't help evolutionary theory at all. There are three major problems or limits (and many minor ones) that prevent scientific extrapolation from mutational change to evolutionary change.

(1) Mathematical challenges. Problem number one is mathematical. I won't dwell on this one, because it's written up in many books and widely acknowledged by evolutionists themselves as a serious problem for their theory.

Fortunately, mutations are very rare. They occur on an average of perhaps once in every ten million duplications of a DNA molecule (10^7, a one followed by seven zeroes). That's fairly rare. On the other hand, it's not that rare. Our bodies contain nearly 100 trillion cells (10^14). So the odds are quite good that we have a couple of cells with a mutated form of almost any gene. A test tube can hold millions of bacteria, so, again, the odds are quite good that there will be mutant forms among them.

The mathematical problem for evolution comes when you want a series of related mutations. The odds of getting two mutations that are related to one another are the product of the separate probabilities: one in 10^7 × 10^7, or 10^14. That's a one followed by 14 zeroes, a hundred trillion! Any two mutations might produce no more than a fly with a wavy edge on a bent wing. That's a long way from producing a truly new structure, and certainly a long way from changing a fly into some new kind of organism. You need more mutations for that. So, what are the odds of getting three mutations in a row? That's one in a billion trillion (10^21). Suddenly, the ocean isn't big enough to hold enough bacteria to make it likely for you to find a bacterium with three simultaneous or sequential related mutations.

What about trying for four related mutations? One in 10^28. Suddenly, the earth isn't big enough to hold enough organisms to make that very likely. And we're talking about only four mutations. It would take many more than that to change a fish into a philosopher, or even a fish into a frog. Four mutations don't even make a start toward any real evolution. But already at this point some evolutionists have given up the classic idea of evolution, because it just plainly doesn't work.
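The multiplication behind these odds can be written out directly. The one-in-10^7 rate per DNA duplication is the article's own figure, and the independence of the mutations is implicit in its argument; this is only a sketch of that arithmetic:

```python
# Compound odds of a series of related mutations, assuming the article's
# one-in-10^7 rate per DNA duplication and treating the events as independent.
MUTATION_RATE = 1e-7

def odds_of_related_mutations(n):
    """Probability that n independent related mutations occur together."""
    return MUTATION_RATE ** n

for n in range(1, 5):
    print(f"{n} related mutation(s): about 1 in 10^{7 * n}")
```

The key point is that each additional related mutation multiplies the improbability by another factor of ten million.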

It was at this level (just four related mutations) that microbiologists gave up on the idea that mutations could explain why some bacteria are resistant to four different antibiotics at the same time. The odds against the mutation explanation were simply too great, so they began to look for another mechanism—and they found it. First of all, using cultures that are routinely kept for long periods of time, they found out that bacteria were resistant to antibiotics even before commercial antibiotics were "invented." Genetic variability was "built right into" the bacteria. Did the nonresistant varieties get resistant by mutation? No. Resistant forms were already present. Furthermore, certain bacteria have little rings of DNA, called plasmids, that they trade around among themselves, and they passed on their resistance to antibiotics in that way. It wasn't mutation and asexual reproduction at all, just ordinary recombination and variation within kind.

 
Cbirch, I hate to hit you with all of this again, but keep reading.

DNA Mutation Rates and Evolution

Sean D. Pitman M.D.
© August 2001 (Updated August 2008)







Table of Contents

Mitochondrial DNA Mutation Rates

Nuclear DNA Mutation Rates

Detrimental Mutation Rates and the Degeneration of Mankind

The Living Dead

The Y-Chromosome Extinction?


Mitochondrial DNA Mutation Rates




Mitochondria are "organelles" within living cells that are responsible for making the currency of energy called ATP (Adenosine Triphosphate), which all cells need to function. Mitochondria carry their own separate DNA (mtDNA) that is independent of the nuclear DNA of the same cell. Human mtDNA is composed of 37 genes totaling about 16,000 base pairs. This mtDNA also mutates at a much faster rate than nuclear DNA (nucDNA) does. Human mtDNA has been completely mapped and all the coding regions are known (As well as the proteins or RNA for which they code). Some of the mtDNA does not code for anything (thought to make these sections immune from "natural selection pressure"), and are known as the "control regions". One particular region appears to mutate faster than any other region (1.8 times faster), because the variation among humans is greatest here.4 When the cell divides, each cell takes some of the mitochondria with it. The mitochondria replicate themselves independently within the cell. Beyond this, it has been generally assumed that mitochondria are always passed on from the mother to the offspring without being involved with genetic shuffling and recombining of mtDNA with the mtDNA of the father. Recently, however, this notion has been challenged. As it turns out, many cases of paternally derived mtDNA have been detected in modern families of humans as well as other species. Consider the findings of an interesting study published by Schwartz and Vissing in the 2002 issue of the New England Journal of Medicine:



"Mammalian mitochondrial DNA (mtDNA) is thought to be strictly maternally inherited. Sperm mitochondria disappear in early embryogenesis by selective destruction, inactivation, or simple dilution by the vast surplus of oocyte mitochondria. . . The underlying mechanism responsible for the elimination of sperm mtDNA in normal embryos is not well understood. We speculate that the process in some cases may be defective, allowing sperm mitochondria to survive and giving those with a selective advantage the possibility of prevailing in certain tissues. . . Very small amounts of paternally inherited mtDNA have been detected by the polymerase chain reaction (PCR) in mice after several generations of interspecific backcrosses. Studies of such hybrids and of mouse oocytes microinjected with sperm support the hypothesis that sperm mitochondria are targeted for destruction by nuclear-encoded proteins. We report the case of a 28-year-old man with mitochondrial myopathy due to a novel 2-bp mtDNA deletion in the ND2 gene (also known as MTND2), which encodes a subunit of the enzyme complex I of the mitochondrial respiratory chain. We determined that the mtDNA harboring the mutation was paternal in origin and accounted for 90 percent of the patient's muscle mtDNA."47



So, what does such a finding mean with regard to mtDNA mutation rates and molecular clocks? Well, consider the following comments by Morris and Lightowlers published in a 2000 edition of The Lancet:



Mitochondrial DNA (mtDNA) is generally assumed to be inherited exclusively from the mother. . . . Several recent papers, however, have suggested that elements of mtDNA may sometimes be inherited from the father. This hypothesis is based on evidence that mtDNA may undergo recombination. If this does occur, maternal mtDNA in the egg must cross over with homologous sequences in a different DNA molecule; paternal mtDNA seems the most likely candidate. . . . If mtDNA can recombine, irrespective of the mechanism, there are important implications for mtDNA evolution and for phylogenetic studies that use mtDNA. 48



Before this evidence of paternal inheritance was discovered, it was assumed that mtDNA was strictly maternally inherited, and therefore that offspring would receive exact copies of their mother's mitochondria unless a mutational error occurred. This error rate in the non-coding portion of mitochondrial DNA had long been thought to be about one mutation every 300 to 600 generations, or every 6,000 to 12,000 years for humans.

The Berkeley biochemists who developed the theory, Allan Wilson, Rebecca Cann, and Mark Stoneking, made several apparently reasonable assumptions. Since there were no DNA changes due to genetic recombination events (ie: with paternal DNA - now known to be a wrong assumption), they assumed that all changes in the mtDNA were the result of mutations over time and that these mutations occurred at a constant rate. On the basis of these assumptions, the researchers believed they had access to something like a "molecular clock." Because mtDNA is thought to mutate faster than nuclear DNA (nucDNA), it was thought that the faster mutation rate of mtDNA would make for more accurate time keeping than nucDNA.

The original 1987 study involved mtDNA from 136 women from many parts of the world, of various racial backgrounds. The analysis seemed to support the idea of a single ancestral mtDNA molecule from a woman living in sub-Saharan Africa about 200,000 years ago. Later, more detailed studies seemed to confirm this conclusion. Unfortunately, though, there was an undetected bias in the computer program, as well as in the researchers themselves. The researchers used a computer program designed to reveal a "maximum parsimony" phylogeny: the family tree with the least number of mutational changes. This was based on the assumption that evolution would have taken the most direct and efficient path (which is not necessarily true, or even likely). The computer program was also biased by the order of data entry, favoring the information entered first. This problem was recognized when the computer gave different results depending on the order in which the data were entered. Now, after thousands of computer runs with the data entered in random order, it appears that the "African origin" for modern humans does not hold a statistical significance over other possibilities.26

The problems with these studies were so bad that Henry Gee, a member of the editorial staff of the journal Nature, harshly described the studies as "garbage." Considering the number of sequences involved (136 mtDNA sequences), Gee calculated that the total number of potentially correct parsimonious trees is somewhere in excess of one billion.25 Geneticist Alan Templeton (Washington University) suggests that low-level mixing among early human populations may have scrambled the DNA sequences sufficiently that the question of the origin of modern humans and a date for "Eve" can never be settled by mtDNA.22 In a letter to Science, Mark Stoneking (one of the original researchers) acknowledged that the theory of an "African Eve" has been invalidated.23

Another interesting aspect of the "molecular clock" theory is the way in which the mutation rate itself was determined. Contrary to what many might think, the mutation rate was not initially determined by any sort of direct analysis, but from supposed phylogenetic relationships between humans and chimps. In other words, the mutation rate was calculated on the assumption that the theory in question was already true. This is circular: any result based on that assumption will automatically be consistent with the assumption, like a self-fulfilling prophecy. If one truly wishes independent confirmation of a theory, one cannot calibrate the confirming test by the theory, or any part of the theory, being tested. And yet this is exactly what was done by scientists such as Sarich, one of the pioneers of the molecular-clock idea. Sarich began by calculating the mutation rates of various species "...whose divergence could be reliably dated from fossils." He then applied that calibration to the chimpanzee-human split, dating that split at five to seven million years ago. Using Sarich's mutation calibrations, Wilson and Cann applied them to their mtDNA studies, comparing "...the ratio of mitochondrial DNA divergence among humans to that between humans and chimpanzees."24 By this method, they calculated that the common ancestor of all modern humans, the "African Eve", lived about 200,000 years ago.

Obviously, then, dates calculated from this mtDNA analysis must match the presupposed evolutionary time scale, since the calculation is based on that presupposition. Such circularity is inconsistent with good scientific method and leaves the method worthless as far as independent predictive value is concerned. The "mitochondrial clock" was and is basically a theory within a theory, with no predictive power outside the theory of evolution. It is surprising that scientists did not catch this inherent flaw earlier. Interestingly enough, the flaw went undetected for many years, and perhaps would have remained undetected much longer if a more direct mutation-rate analysis had not been done.

Eventually, scientists who study historical families and their genetic histories started questioning the mutation rates that were based on evolutionary phylogenetic assumptions. These scientists were "stunned" to find that the mutation rate was in fact much higher than previously thought: about 20 times higher, at around one mutation every 25 to 40 generations (about 500 to 800 years for humans). It seems that in this section of the control region, which has about 610 base pairs, humans typically differ from one another by about 18 mutations.3 By simple mathematics, it follows that modern humans share a common ancestor some 300 generations back in time. If one assumes a typical generation time of about 20 years, this places the date of the common ancestor at around 6,000 years before present. But how could this be?! Thomas Parsons seems just as mystified. Consider his comments published in April 1997 in the journal Nature Genetics:





"The rate and pattern of sequence substitutions in the mitochondrial DNA (mtDNA) control region (CR) is of central importance to studies of human evolution and to forensic identity testing. Here, we report a direct measurement of the intergenerational substitution rate in the human CR. We compared DNA sequences of two CR hypervariable segments from close maternal relatives, from 134 independent mtDNA lineages spanning 327 generational events. Ten substitutions were observed, resulting in an empirical rate of 1/33 generations, or 2.5/site/Myr. This is roughly twenty-fold higher than estimates derived from phylogenetic analyses. This disparity cannot be accounted for simply by substitutions at mutational hot spots, suggesting additional factors that produce the discrepancy between very near-term and long-term apparent rates of sequence divergence. The data also indicate that extremely rapid segregation of CR sequence variants between generations is common in humans, with a very small mtDNA bottleneck. These results have implications for forensic applications and studies of human evolution.

The observed substitution rate reported here is very high compared to rates inferred from evolutionary studies. A wide range of CR substitution rates have been derived from phylogenetic studies, spanning roughly 0.025-0.26/site/Myr, including confidence intervals. A study yielding one of the faster estimates gave the substitution rate of the CR hypervariable regions as 0.118 +- 0.031/site/Myr. Assuming a generation time of 20 years, this corresponds to ~1/600 generations and an age for the mtDNA MRCA of 133,000 y.a. Thus, our observation of the substitution rate, 2.5/site/Myr, is roughly 20-fold higher than would be predicted from phylogenetic analyses. Using our empirical rate to calibrate the mtDNA molecular clock would result in an age of the mtDNA MRCA of only ~6,500 y.a., clearly incompatible with the known age of modern humans. Even acknowledging that the MRCA of mtDNA may be younger than the MRCA of modern humans, it remains implausible to explain the known geographic distribution of mtDNA sequence variation by human migration that occurred only in the last ~6,500 years." 27





The calculation is done in the following way: Let us consider two randomly chosen human beings. Assuming all human beings initially have identical mitochondrial DNA, after 33 generations, two such random human families will probably differ by two mutations, since there will be two separate lines of inheritance and probably one mutation along each line. After 66 generations, two randomly chosen humans will differ by about four mutations. After 100 generations, they will differ by about six mutations. After 300 generations, they will differ by about 18 mutations, which is about the observed value.
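The accumulation described above can be sketched numerically. The one-mutation-per-33-generations figure and the 20-year generation time are the article's own assumptions; the factor of two reflects the two independent lines of descent:

```python
# Expected mtDNA control-region differences between two random people,
# assuming (as the text does) about one mutation per 33 generations per
# maternal line of descent.
MUTATION_INTERVAL = 33   # generations per mutation along one lineage
GENERATION_YEARS = 20    # assumed human generation time

def expected_differences(generations):
    # Two separate lines of inheritance accumulate mutations independently,
    # so differences grow at twice the single-lineage rate.
    return 2 * generations / MUTATION_INTERVAL

for gens in (33, 66, 100, 300):
    print(gens, "generations:", round(expected_differences(gens)),
          "differences,", gens * GENERATION_YEARS, "years")
```

At 300 generations the expected count reaches the observed ~18 differences, which is where the ~6,000-year figure in the text comes from.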

These experiments are quite concerning to evolutionists, who previously calculated that the "mitochondrial Eve" (whose mitochondria are thought to be ancestral to those of all living humans) lived about 100,000 to 200,000 years ago in Africa.1 The new calculations, based on the above experiments, would make her a relatively young ~6,500 years old. Now the previous notion that modern humans are up to 10,000 generations old has to be reevaluated; at least the mtDNA basis for that assumption does, and it has been.2
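The roughly 20-fold rescaling can be checked against the figures quoted from Parsons et al. above (0.118 vs. 2.5 substitutions/site/Myr, and the ~133,000-year phylogenetic MRCA age); this is just that ratio worked out:

```python
# Rescaling the mtDNA MRCA age by the measured-vs-phylogenetic rate ratio,
# using the figures quoted from Parsons et al. in the text.
phylo_rate = 0.118      # substitutions/site/Myr (phylogenetic estimate)
measured_rate = 2.5     # substitutions/site/Myr (pedigree measurement)

speedup = measured_rate / phylo_rate   # roughly 21-fold
mrca_age = 133_000 / speedup           # ~6,300 years, i.e. "~6,500 y.a."
print(round(speedup, 1), round(mrca_age))
```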

More recent direct mtDNA mutation rate studies also seem to confirm the earlier findings by Parsons and others. In a 2001 article published in the American Journal of Human Genetics, Evelyne Heyer et al. presented their findings on the mtDNA mutation rate in deep-rooted French-Canadian pedigrees.



Their findings "Confirm[ed] earlier findings of much greater mutation rates in families than those based on phylogenetic comparisons. . . For the HVI sequences, we obtained 220 generations or 6,600 years, and for the HVII sequences 275 generations or 8,250 years. Although each of these values is associated with a large variance, they both point to ~7,000-8,000 years and, therefore, to the early Neolithic as the time of expansion [mostly northern European in origin] . . . Our overall CR mutation-rate estimate of 11.6 per site per million generations . . . is higher, but not significantly different, than the value of 6.3 reported in recent the recent pedigree study of comparable size . . . In another study (Soodyall et al. 1997), no mutations were detected in 108 transmissions. On the other hand, two substitutions were observed in 81 transmissions by Howell et al. (1996), and nine substitutions were observed in 327 transmissions by Parsons et al. (1997). Combining all these data (1,729 transmissions) results in the mutation rate of 15.5 (Cl 10.3-22.1). Taking into account only those from deep-rooting pedigrees (1,321 transmissions) (Soodyall et al. 1997; Sigurdardottir et al. 2000; the present study) leads to the value of 7.9. The latter, by avoiding experimental problems with heteroplasmy, may provide a more realistic approximation of the overall mutation rate." 44



Also, consider an even more recent paper, published in a 2003 issue of the Annals of Human Genetics by B. Bonne-Tamir et al., in which the authors presented the results of their study of "Maternal and Paternal Lineages" from a small, isolated Samaritan community. In this paper they concluded:



"Compared with the results obtained by others on mtDNA mutation rates, our upper limit estimate of the mutation rate of 1/61 mutations per generation is in close agreement with those previously published." 45 [compared with the rate determined by Parsons of 1/33 generations, a rate of 1/61 is no more than double]



One more paper, published in September 2000 in the journal Scientist by Denver et al., is also worth noting. These scientists reported on the mtDNA mutation rates of nematode worms and found that these worms' molecular clocks actually run about "100 times faster than previously thought" [emphasis added].46



"Extrapolating the results directly to humans is not possible, say the scientists. But their results do support recent controversial studies suggesting that the human molecular clock also runs 100 times faster than is usually thought. This may mean that estimates of divergence between chimpanzees and humans, and the emergence of modern man, happened much more recently than currently believed, says the team. 'Our work appears to support human analyses, which have suggested a very high rate,' says Kelley Thomas of the University of Missouri. 'This work is relevant to humans,' says Doug Turnbill of the institute for Human Genetics and Newcastle University, UK. 'If the human mutation rate is faster than thought, it would have a lot of impact in looking at human disease and forensics, as well as the evolutionary rate of humans.' . . .

Mutation rates of mtDNA in humans are usually estimated by comparing sequences of DNA from people and other animals. 'This is kind of analysis that was used to determine that the African origin of modern humans was about 200,000 years ago,' says Thomas. 'The problem with this approach is that you are looking at both the mutation rate and the effects of natural selection,' he says. The technique would also miss multiple mutations in the same stretch of mtDNA, says Paul Sharp of the Institute of Genetics at Nottingham University, UK.

More recent studies have looked at the mtDNA of people who are distantly related but share a female ancestor. This approach has revealed higher mtDNA mutation rates. But the results have not been accepted by many scientists [emphasis added].

Knowing the exact rate of mutation in humans is very important for forensic science and studies of genetic disease, stresses Turnbill. Forensic identification often rests on comparing samples of DNA with samples from suspected relatives. Faster human molecular clocks could complicate established exact relationships, he says." 46



Obviously, then, these rates, based on more direct observations, are nowhere near those based on indirect evolutionary assumptions. This certainly does "complicate" things just a bit, now doesn't it? Isn't it strange, though, that many scientists are still loath to accept these results? The bias in favor of evolution, and of millions of years of assumed divergence time between creatures like apes and humans, is so strong that changing the minds of those who hold such positions may be all but impossible.

There are many other potential problems for phylogenies that rely on mtDNA sequence analysis and mutation rates. One problem is that mtDNA functions as a single genetic locus, much like a single gene does in nucDNA. Studies that work off a single genetic locus are more likely to be affected by random genetic changes than are studies that include more than one locus (the more the better). Therefore, single locus studies are less accurate in characterizing a population. Beyond this, the new evidence for paternal mtDNA mixing is quite problematic.16

Also, as briefly discussed above, the use of control regions as a "molecular clock" may not be as valid as was previously hoped. Some nucleotide regions mutate slowly, while others can mutate relatively rapidly.17 These mutational "hotspots" can mutate fairly rapidly even within a single lifetime and are, understandably, rather common in the aged.18 Of course, such "somatic" mutations arise in the mitochondria of various bodily tissues and, unless they involve gametes, are not passed on to the next generation. However, they would still affect phylogenetic interpretations. Scientists have tried to compensate for these problems, but the various methods have produced divergent results.19 Also, as discussed above, direct comparisons of modern sequences with historical sequences often yield very different results from those estimated by indirect methods based on present-day sequence differences. For another example, from a different species: direct comparisons of modern penguins with historically sequenced penguins have shown that their mtDNA mutation rates are 2 to 7 times faster than had previously been assumed through indirect methods.20 Certain of these problems have in fact led some scientists to stop using control-region sequences to reconstruct human phylogenies.21

Those scientists who continue trying to revise the molecular clock hypothesis have tried to slow the clock down by showing that some mtDNA regions mutate much more slowly than others. The problem here is that such regions are evidently affected by natural selection; in other words, they are not functionally neutral with regard to selection pressures.

For example, real-time experiments have shown that average mitochondrial genome mutation rates are around 6 × 10^-8 mut/site/mitochondrial generation, in line with various estimates of average bacterial mutation rates (compare with the nucDNA rate of 4.4 × 10^-8 mut/site/human generation). With an average generation time of 45 days, that's about 5 × 10^-6 mut/site/year, or 5 mut/site/Myr.

This is about twice as high as Parsons' rate of 2.5 mut/site/Myr and about 40 to 50 times higher than rates based on phylogenetic comparisons and evolutionary assumptions. And this is the average rate over the entire mitochondrial genome of 16,000 bp. One might reasonably think that all parts of the hypervariable regions (HVI & HVII) would have a higher-than-average rate of mutation if they were truly neutral with regard to functional selection pressures. Given this, those "slowly mutating sites" with rates as slow as 0.065 mut/site/Myr (Heyer et al., 2001) would seem to be maintained in a biased way by natural selection.

Again, such non-neutral changes are not necessarily the reflection of elapsed time since a common ancestor so much as they are the reflection of the different functional needs of different creatures in various environments.









Nuclear DNA Mutation Rates





As with mitochondrial DNA mutation rates, the mutation rates of nuclear DNA have often been calculated based on evolutionary scenarios rather than on direct measurement. By such methods, the average mutation rate for eukaryotes in general is estimated to be about 2.2 × 10^-9 mutations per base pair per year.29 With a 20-year average generation time for humans, this works out to around 4.4 × 10^-8 mutations per base pair per generation. Since most estimates of the size of the diploid human genome run around 6.3 billion base pairs, this mutation rate would give the average child around 277 mutational differences from his or her parents. That sounds like quite a high number, and it is in fact on the high end of the spectrum compared with studies looking specifically at human mutation rates versus eukaryotic mutation rates in general. A study by Nachman and Crowell estimated the average mutation rate specifically in humans by comparing control sequences in humans and chimpanzees. Using these sequence comparisons, "The average mutation rate was estimated to be ~2.5 × 10^-8 mutations per nucleotide site or 175 mutations per diploid genome per generation" [based on a higher diploid genome estimate of 7 billion base pairs].30
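Both per-generation estimates quoted above follow from simple multiplication; the rates and genome sizes are the article's figures, reproduced here only as a sketch of its arithmetic:

```python
# Per-child mutation counts implied by the two rate estimates quoted above
# (rates and genome sizes are the article's figures).
def mutations_per_generation(rate_per_bp_per_gen, diploid_genome_bp):
    return rate_per_bp_per_gen * diploid_genome_bp

# Eukaryote-average rate: 2.2e-9 per bp per year over a 20-year generation
eukaryote = mutations_per_generation(2.2e-9 * 20, 6.3e9)
# Nachman & Crowell's human-specific rate over a 7e9 bp diploid genome
human = mutations_per_generation(2.5e-8, 7.0e9)
print(round(eukaryote), round(human))
```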

These indirect mutation rate estimates might actually seem reasonable, given that they roughly match the error rates of DNA replication between the formation of a zygote in one generation and the formation of a zygote in the next. From fertilization to the formation of a woman's first functional gamete takes about 23 mitotic divisions.31 Men, on the other hand, contribute about twice as many germ-line mutations as women do.33 At least part of the reason is that male stem cells keep dividing, so the older a man is when he has children, the more mitotic divisions have occurred.

Now, consider that each diploid fertilized zygote contains around 6 billion base pairs of DNA (~3 billion from each gamete/parent, using a conservative round number).32 From cell division to cell division, the error rate for DNA polymerase combined with other repair enzymes is about 1 mistake in 1 billion base pairs copied.42 At this rate, there are about 6 mistakes with each diploid cell replication event. With a male/female average of 29 mitotic divisions before the production of the next generation, this works out to be about 175 mutations per generation.
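The replication-error arithmetic in this paragraph, along with the lower-bound scenario discussed next, can be sketched as follows (all figures are the article's):

```python
# Replication-error arithmetic from the passage above.
DIPLOID_BP = 6e9     # base pairs per fertilized diploid zygote
DIVISIONS = 29       # male/female average germ-line divisions per generation

errors_per_division = DIPLOID_BP * 1e-9           # ~6 per diploid replication
per_generation = errors_per_division * DIVISIONS  # ~174, i.e. "about 175"

# Lower-bound scenario: 1 error per 10 billion bp copied
low_estimate = DIPLOID_BP * 1e-10 * DIVISIONS     # ~17 per generation
print(per_generation, low_estimate)
```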

Of course, this is right in line with the mutation rates based on evolutionary scenarios. However, some estimates place the overall error rate as low as 1 mistake in 10 billion base pairs copied.43 At that rate, one would expect around 0.6 mistakes per replication event and only around 17 mutations per person per generation. So perhaps something else also influences the nuclear DNA mutation rate? As it turns out, replication errors are not the only source of DNA mutations. Damage to DNA can and does occur spontaneously: genome stability is continually challenged by a diverse array of mutagenic forces, including errors during DNA replication, environmental factors such as UV radiation, and endogenous mutagens such as the oxygen free radicals generated during oxidative metabolism. This damage must be detected and repaired on a constant basis. Such repair is not perfect, and it therefore likely pushes the actual mutation rate well above that estimated by the indirect methods discussed above.

In fact, the actual observed mutation rate is likely to be quite a bit higher than 1 × 10^-8 per site per generation: at least 10-fold higher by some estimates (see Link). Such high mutation rates are based on actually observed rates of functional mutations in the human genome and on directly observed mutation rates in pseudogenes, such as those found in C. elegans (see further discussion below).

Again, consider that the rate of 1 × 10^-9 per site per year referenced above is an indirect estimate, calibrated by evolutionary assumptions about the time since the MRCA of two species and by comparisons of sequences thought to be functionally neutral.


"Comparisons of pseudogenes and of synonymous sites between humans and chimpanzees have suggested mutation rates on the order of 10-8 per site per generation [ or about 10-9 per site per year] (e.g., KONDRASHOV and CROW 1993 Down; DRAKE et al. 1998 Down)." (see Link)



So, you see, mutation rate estimates based on this sort of evolutionary assumption become a self-fulfilling prophecy when used to estimate the time of the MRCA between humans and apes. On the other hand, more direct methods of measuring nuclear mutation rates in animals suggest that the actual rate is likely about ten times higher than estimates based on indirect methods and evolutionary assumptions. Consider the following excerpt from Denver et al., published in Nature in 2004:


"Alternative approaches in mammals, relying on phylogenetic comparisons of pseudogene loci and fourfold degenerate codon positions, suffer from uncertainties in the actual number of generations separating the compared species and the inability to exclude biases associated with natural selection. Here we provide a direct and unbiased estimate of the nuclear mutation rate and its molecular spectrum with a set of C. elegans mutation-accumulation lines that reveal a mutation rate about tenfold higher than previous indirect estimates and an excess of insertions over deletions." (see Link)



Such a high mutation rate, at least 2,000 mutations per person per generation, might be more of a problem for humans than it appears to be if it were not for the fact that much of the human genome is thought by most scientists to have no significant functional role, and can therefore sustain mutations without significant detrimental effects on the overall function of the organism. The amount of this non-functional DNA has been estimated by calculating the coding portion of DNA and subtracting it from the total genomic real estate. It seems the average coding portion of a human gene is around 1,350 base pairs in size - coding for a protein averaging 450 amino acids.38 Multiplying this number by the total number of genes should give a reasonable estimate of the coding genetic real estate.

However, there is some argument as to the total number of genes in the human genome. For many years it was thought that humans had between 70,000 and 140,000 genes. However, scientists working on the human genome project made a surprising discovery. When they finished the project in February of 2001, they estimated that the actual gene count was somewhere between 30,000 and 40,000 genes.39 But a year later, in February of 2002, at the annual meeting of the American Association for the Advancement of Science (publisher of Science), one of the presenters, Victor Velculescu, suggested that the real number of genes in the human genome may actually be closer to 70,000 after all. He and his colleagues at Johns Hopkins University in Baltimore, Maryland, have gone back to the lab to look for genes that the computer programs may have missed. Their technique, called serial analysis of gene expression (SAGE), works by tracking RNA molecules back to their DNA sources. After isolating RNA from various human tissues, the researchers copy it into DNA, from which they cut out a kind of genetic bar code of 10 to 20 base pairs. Velculescu proposes that the vast majority of these tags are unique to a single gene. If so, the tags can be compared to the human genome to find out whether they match up with genes discovered by the computer algorithms. Velculescu stated that only about half of the tags he used matched the genes identified earlier in the genome project. Therefore, he suggests that the human inventory of genes had been underestimated by about half.

The reason for the disparity may be that the standard computer programs were largely developed for the genomes of simple (prokaryotic) organisms, not for the more complex sequences found in the genomes of humans and other eukaryotes. "We're still not very good at predicting genes in eukaryotes," said Claire Fraser of The Institute for Genomic Research in Rockville, Maryland. "It is entirely possible that there could be more than 32,000 genes, and SAGE is an important approach to finding them… You absolutely have to go back into the lab and get away from the computer terminal." 40

So, it seems there is still some question as to exactly how many genes the human genome contains. This is especially true now that non-protein-coding portions of DNA, like many so-called pseudogenes and micro-RNAs, are being found to have significant functionality. But, for the sake of argument, let's go with a lower estimate of ~40,000 genes. With each gene averaging 1,350 base pairs in size, only around 108 million base pairs out of 6 billion base pairs (diploid) would code for anything. This is only around 1.8% of the total genome. Much of the rest of the human genome (at least 50%) is thought to be composed of a large amount of "repetitive DNA" made up of similar sequences occurring over and over.33,38 At least some of the other 48% of the genome is thought to provide structural integrity as well as to regulate the production of the coding sequences of DNA - when, where, and how much protein to make. However, exactly how much of the non-protein-coding genome is functional is not clearly understood, but it may be quite high (i.e., well over 50% with at least some functionality; see Link). In any case, for the purposes of this discussion a rough figure of 2% will be used as the amount of functional DNA in the human genome.
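The coding-fraction arithmetic above can be verified directly (assuming, per the text, ~40,000 genes averaging 1,350 coding base pairs each, counted on both copies of a 6-billion-bp diploid genome):

```python
genes = 40_000
avg_coding_bp = 1_350                  # average coding length per gene
diploid_genome_bp = 6_000_000_000

coding_bp = genes * avg_coding_bp * 2  # two copies of each gene (diploid)
fraction = coding_bp / diploid_genome_bp
print(f"{coding_bp:,} bp")             # 108,000,000 bp
print(f"{fraction:.1%}")               # 1.8%
```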





The Detrimental Mutation Rate

and the Genetic Deterioration of Mankind




Since mutations are the only possible source of novel genomic function in the evolution of living things, we should consider a few facts about them. Mutations are thought to be purely random events caused by errors of replication and maintenance over time. They occur in a fairly random fashion anywhere in the entire genome with each generation. Given this information, let's consider how these mutations would build up and what effect, if any, they would have on a human lineage.

Some researchers suggest a detrimental mutation rate (Ud) of 1 to 3 per person per generation, with at least some scientists (Nachman and Crowell, 2000) favoring 3 or more.30 Notice that these detrimental mutation rates are based on overall DNA mutation rate estimates that are indirectly determined from assumed evolutionary relationships. The actual mutation rates, as noted above, are likely to be much higher. In any case, even given these assumptions, since detrimental mutations outnumber beneficial mutations by at least 1,000 to 1, it seems the buildup of detrimental mutations in a population might lead toward extinction.34,36

Nachman and Crowell detail the perplexing situation at hand in the following conclusion from their paper on human mutation rates:





The high deleterious mutation rate in humans presents a paradox. If mutations interact multiplicatively, the genetic load associated with such a high U [detrimental mutation rate] would be intolerable in species with a low rate of reproduction [like humans and apes etc.] . . .

The reduction in fitness (i.e., the genetic load) due to deleterious mutations with multiplicative effects is given by 1 - e^(-U) (Kimura and Maruyama 1966). For U = 3, the average fitness is reduced to 0.05, or put differently, each female would need to produce 40 offspring for 2 to survive and maintain the population at constant size. This assumes that all mortality is due to selection, and so the actual number of offspring required to maintain a constant population size is probably higher.

The problem can be mitigated somewhat by soft selection or by selection early in development (e.g., in utero). However, many mutations are unconditionally deleterious and it is improbable that the reproductive potential on average for human females can approach 40 zygotes. This problem can be overcome if most deleterious mutations exhibit synergistic epistasis; that is, if each additional mutation leads to a larger decrease in relative fitness. In the extreme, this gives rise to truncation selection in which all individuals carrying more than a threshold number of mutations are eliminated from the population. While extreme truncation selection seems unrealistic [the death of all those with a detrimental mutational balance], the results presented here indicate that some form of positive epistasis among deleterious mutations is likely.30





Nachman and Crowell find the situation a very puzzling one. How does one get rid of all the bad mutations faster than they are produced? Does their hypothesis of "positive epistasis" adequately explain how detrimental mutations can be cleared faster than they are added to a population? If the functional effects of mutations combined in a multiplicative instead of an additive fashion, would fewer individuals die than before? As noted above, even if every detrimental mutation caused the death of its owner, the reproductive burden of the survivors would not diminish, but would remain the same.
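The load formula in the quoted passage is easy to check numerically: with multiplicative effects, mean fitness is e^(-U), so each couple needs roughly 2·e^U offspring for 2 to survive selection. A quick sketch:

```python
import math

# Genetic load under multiplicative effects: load = 1 - e^(-U).
for U in (1, 2, 3):
    fitness = math.exp(-U)              # mean fitness e^(-U)
    offspring_needed = 2 * math.exp(U)  # per couple, for 2 survivors
    print(U, round(fitness, 3), round(offspring_needed))
# For U = 3, fitness drops to ~0.05 and each couple needs ~40
# offspring, matching Nachman and Crowell's figure.
```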

For example, let's say that all those with at least three detrimental mutations die before reproducing. The population average would soon hover just above 3 deleterious mutations. Over 95% of each subsequent generation would have 3 or more deleterious mutations as compared with the original "neutral" population. The death rate would increase dramatically. In order to keep up, the reproductive rates of the surviving individuals would have to increase in proportion to the increased death rate. The same thing would eventually happen if the death line were drawn at 100, 500, 1,000, 10,000 or more deleterious mutations. The only difference would be the length of time it would take a given population to build up a lethal number of deleterious mutations in its gene pool beginning at a relatively "neutral" starting point. The population might survive fairly well for many generations without having to resort to huge increases in the reproduction rate. However, without getting rid of the accumulating deleterious mutations, the population would eventually find itself experiencing an exponential rise in its death rate as its average mutation count crossed the lethal line.
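This truncation-selection scenario can be sketched with a minimal Monte Carlo simulation. All parameters here (population size, threshold, Poisson-distributed new mutations) are illustrative assumptions, not figures from the text:

```python
import math
import random

random.seed(1)

def poisson(lam):
    """Sample a Poisson(lam) variate (Knuth's method; fine for small lam)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

Ud = 3           # new detrimental mutations per individual per generation
THRESHOLD = 100  # everyone above this mutation count dies
loads = [0] * 1000
death_rates = []
for generation in range(60):
    loads = [n + poisson(Ud) for n in loads]           # add new mutations
    survivors = [n for n in loads if n <= THRESHOLD]   # truncation selection
    death_rates.append(1 - len(survivors) / len(loads))
    # Survivors repopulate; offspring start with the parent's load.
    loads = [random.choice(survivors) for _ in range(1000)]

# Early on almost nobody dies; once the average load nears the
# threshold, the death rate climbs sharply.
print(round(death_rates[0], 3), round(death_rates[-1], 3))
```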

Since the theory of positive epistasis does not seem to help the situation much, some other process must be found to explain how to preferentially get rid of detrimental mutations from a population. Consider an excerpt from a fairly recent Scientific American article entitled, "Mutations Galore":





According to standard population genetics theory, the figure of three harmful mutations per person per generation implies that three people would have to die prematurely in each generation (or fail to reproduce) for each person who reproduced in order to eliminate the now absent deleterious mutations [75% death rate]. Humans do not reproduce fast enough to support such a huge death toll. As James F. Crow of the University of Wisconsin asked rhetorically, in a commentary in Nature on Eyre-Walker and Keightley's analysis: "Why aren't we extinct?"

Crow's answer is that sex, which shuffles genes around, allows detrimental mutations to be eliminated in bunches. The new findings thus support the idea that sex evolved because individuals who (thanks to sex) inherited several bad mutations rid the gene pool of all of them at once, by failing to survive or reproduce.

Yet natural selection has weakened in human populations with the advent of modern medicine, Crow notes. So he theorizes that harmful mutations may now be starting to accumulate at an even higher rate, with possibly worrisome consequences for health. Keightley is skeptical: he thinks that many mildly deleterious mutations have already become widespread in human populations through random events in evolution and that various adaptations, notably intelligence, have more than compensated. "I doubt that we'll have to pay a penalty as Crow seems to think," he remarks. "We've managed perfectly well up until now." 37



Well, the answer might be found in a combination of processes, where both sexual replication and natural selection play a role in keeping a slowly reproducing population from going extinct. For example, consider the following chart showing how deleterious mutations build up in a population that reproduces via asexual means: 49



[Chart from Rice49: generation-by-generation buildup of deleterious mutation classes in an asexual population; the mutation-free "Progenitor Class" (P) shrinks each generation.]



Notice how the most-fit "Progenitor Class" (P) loses numbers in each generation, while the classes carrying greater numbers of deleterious mutations build up more and more. In this article Rice notes that in asexual populations the only real way to overcome this buildup of detrimental mutations is to increase the reproductive rate substantially. But what about beneficial mutations? Rice comments, "Rare reverse and compensatory mutations can move deleterious mutations, via genetic hitchhiking, against the flow of genetic polarization. But this is a minor influence, analogous to water turbulence that occasionally transports a pebble a short distance upstream." 49 So, how do sexually reproducing populations overcome this problem?

When it comes to sexually reproducing populations, genetic recombination during the formation of gametes makes it possible to concentrate both good and bad mutations. For example, let's say we have two individuals, each with 2 detrimental mutations. Given sexual recombination between these two individuals, there is a decent chance that some of their offspring (1 chance in 32) will not inherit any detrimental mutations. But what happens when the rate of additional detrimental mutations is quite high - higher than 3?

To look into this a bit more, consider another example: a steady-state population of 5,000 individuals, each starting out with 7 detrimental mutations, and an average detrimental mutation rate of 3 per individual per generation. Given a reproductive rate of 4 offspring for each of the 2,500 couples (10,000 offspring), how many offspring in one generation will have the same or fewer detrimental mutations than the parent generation?



Mutations after Ud = 3      Expected offspring (of 10,000)

7                            901
6                            631
5                            378
4                            189
3                             76
2                             23
1                              5
0                              0.45

7 or fewer (total)          2202




This Poisson approximation shows that out of 10,000 offspring, only 2,202 would have the same number of detrimental mutations as the parent population or fewer. This leaves 7,798 with more detrimental mutations than the parent population.51 Of course, in order to maintain a steady-state population of 5,000, natural selection must cull 5,000 of these 10,000 offspring before they are able to reproduce. Given a preference, those with more detrimental mutations will be less fit by a certain degree and will be removed from the population before those that are more fit (fewer detrimental mutations). Given strong selection pressure, the second generation might be made up of ~2,200 more-fit individuals and only ~2,800 less-fit individuals, with the overall average showing a decline compared with the original parent generation. If selection pressure is strong enough that the majority of those with more than 7 detrimental mutations are removed from the population, the next generation will have only about 1,100 mating couples, compared to 2,500 in the original generation. With a reproductive rate of 4 per couple, only 4,400 offspring will be produced, compared to 10,000 originally. In order to keep up with this loss, the reproductive rate must be increased or the population will head toward extinction. In fact, given a detrimental mutation rate of Ud = 3 in a sexually reproducing population, the average number of offspring needed to keep up would be around 20 per breeding couple (2e^Ud / 2 = e^Ud ≈ 20). While this is about half that required for an asexual population (2e^Ud ≈ 40), it is still quite significant.
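The Poisson approximation above can be reproduced in a few lines, treating each offspring's total mutation count as Poisson-distributed with mean 7 + 3 = 10 (this simple Poisson model is an assumption chosen to match the numbers in the table):

```python
import math

def poisson_pmf(k, lam):
    """Probability of exactly k events under a Poisson(lam) distribution."""
    return math.exp(-lam) * lam**k / math.factorial(k)

MEAN_TOTAL = 7 + 3   # 7 inherited on average + Ud = 3 new mutations
OFFSPRING = 10_000

expected = {k: OFFSPRING * poisson_pmf(k, MEAN_TOTAL) for k in range(8)}
for k in range(7, -1, -1):
    print(k, round(expected[k], 2))
print("7 or fewer:", round(sum(expected.values())))  # ~2202 of 10,000
```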

In this light, consider that more recent estimates suggest the deleterious mutation rate is even higher. "Extrapolations from studies of humans and Drosophila (Mukai, 1979; Kondrashov, 1988; Crow, 1993) suggest that Ud > 5 is feasible." 49 However, the number of offspring required to compensate for a detrimental mutation rate of Ud = 5 soars to 148 per female per generation! And this is not the worst of it. Recent genetic studies have shown that much of what was once thought of as "junk DNA" is actually functional ( Link ). In fact, these studies suggest that the total amount of functional DNA in the human genome is not actually 2-3% as previously thought, but is upwards of 85-90% ( Link ). Consider also that what were once thought to be neutral mutations are now being discovered to be functional mutations governed by natural selection. In a 2007 paper published in the Indian Journal of Human Genetics, author Clyde Winters claims to have made a very interesting discovery.



It is often assumed that selection plays a limited role in the mtDNA control region. . . However, there is a selective constraint on mutation frequencies of an mtDNA site. Some of the East African transitions . . . are the most rapidly occurring nucleotide substitutions in the human mitochondrial genome. These transitions are often referred to as "hotspots." These hot spots of mutational activity suggest that positive selection influences mutation rates and not neutral selection which, theoretically, would manifest parallel mutations.53





Of course, this is not the only region in the human genome that was once thought to be limited to neutral mutations alone. Much of the genome is now known to be subject to differential selection.

So what? Why does this matter? It matters to this particular problem because the actual detrimental mutation rate would then be a significantly greater percentage of the total number of mutations experienced by the genome in each generation. As noted above, the total number of mutations per offspring per generation is at least 175. If the functional fraction of the genome were actually 50% (instead of just 2%), the likely detrimental mutation rate (Ud) would be well over 30 instead of the usual estimate of ~3 noted above. This would increase the reproductive rate needed to avoid genomic decay from ~20 offspring per woman per generation to well over 10 trillion offspring per woman per generation - obviously an impossible hurdle to overcome.
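The scaling claimed here follows directly from the e^Ud relationship used earlier (required offspring per breeding couple ≈ e^Ud for a sexual population); a quick check:

```python
import math

# Required offspring per breeding couple ~ e^Ud (sexual population).
for Ud in (3, 5, 30):
    print(f"Ud = {Ud:2d} -> {math.exp(Ud):,.0f} offspring per couple")
# Ud = 3 gives ~20, Ud = 5 gives ~148, and Ud = 30 gives ~1.07e13
# (over 10 trillion), matching the figures in the text.
```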

In short, the best available evidence overwhelmingly supports the theory that the human genome is in decay. The various forms of "positive epistasis" (see illustration by Rice below) 49 do not solve this problem.



[Illustration from Rice49: forms of positive (synergistic) epistasis among deleterious mutations.]





The Y-Chromosome Rapidly Headed for Extinction?



Also, what about the Y-chromosome in males? The Y-chromosome does not undergo significant sexual recombination. Are the males of slowly reproducing species, like humans, therefore headed for extinction at an even faster rate than females?



"The absence of recombination with a homologous partner means that it [the Y-chromosome] can never be 'repaired' by recombination. This has led to suggestions that the Y is destined for extinction - it will eventually dwindle to nothing. According to this model, its role in sex determination will eventually be taken on by genes elsewhere in the genome." 50



The author of the above-quoted article goes on to point out that several species, like the Armenian mole vole, are able to reproduce without the Y-chromosome. While this might explain where humans are headed, it is not at all clear how the Y-chromosome could have survived millions of years of evolution given its relative inability to combat high detrimental mutation rates.

DNA Mutation Rates and Evolution
 
