Conventional evolution is conceived as an incremental process of change. That has implications for the layout of self-perpetuating, self-replicating structures in configuration space.
The following are some notes on the object I refer to as the spongeam – a structure in
configuration space that is a necessary (but not sufficient) condition for conventional evolution. The spongeam is
a logical implication of evolution as it is currently understood.
Evolution is sometimes wrongly caricatured as an unguided chance process.
See for example the following statement by
IDist “niwrad” on the website Uncommon
Descent. In his UD post niwrad
expresses his dismay at theologian Alister McGrath’s acceptance of “theistic
evolution”. In this quote I have added emphases where I believe niwrad errs:
Unguided evolution (whatever meaning we give it, and all the more
so if we give it a meaning based on Darwinian evolution, which is what theistic
evolutionists and McGrath mean) is a
theory based on chance, on randomness, i.e. accidents. If “the universe is
not an accident” — as he [McGrath] rightly believes — how can evolution, an engine of accidents, be an
explanation of “how the world started”, with the same plausibility of
creationism and ID?
“Unguided evolution…a theory
based on chance… an engine of accidents” are all phrases that thoroughly misrepresent the true situation with
evolutionary theory as it stands today. The answer to niwrad’s last question
has been provided many times on this blog: In short, conventional evolution is
not unguided but is in fact a random walk process which takes place within a tight envelope of constraint. Therefore McGrath’s apparent belief in
theistic evolution is consistent with a belief that the universe is not an accident. Let me expand….
Even though I myself have reservations about evolution (as currently
understood) it is certainly not a purely chance process. It is ironic that
the very reason why I can’t easily dismiss evolution on the grounds of entropy
is precisely because it is not an unguided process; it is, in fact, a highly channelled
form of entropy: For just as life annexes and organises increasing amounts of
matter as it populates the world under the constraint of the information implicit
in its machinery and yet still keeps within the second law of thermodynamics,
so too would evolution. Conventional evolution can only work if “randomness” plays
out within a very small probabilistic envelope. (It is arguable – albeit only
arguable – that the high information implicit in this envelope is itself implicit
in physics.) It is ironic that not many in
the de facto ID community understand the conclusion of one of their very own
gurus, William Dembski, whose “Conservation of Information”, if applied to the evolution of life, can be written as:
Probability of life given our physical regime:
P(Life | Physical Regime) = p/q
Where:
p is the a priori probability of living
configurations; that is, the probability of selecting a member from the class
of living configurations assuming the principle of indifference over the space
of all possible configurations. The organised complexity of living things
clearly implies that p is going to be
extremely small.
q is the probability of selecting a physical
regime which enhances the chances of selecting life assuming the principle of
indifference over the space of cases consistent with this physical regime.
See this
paper for a “proof” of the above relation.
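To make the arithmetic of this relation concrete, here is a minimal sketch in Python (the numbers are purely illustrative assumptions, not estimates) of just how small q has to be if life is to have a realistic chance of coming about:

```python
# Illustrative arithmetic for P(Life | Physical Regime) = p / q.
# Both numbers below are hypothetical placeholders, not estimates.

p = 1e-100        # assumed a priori probability of hitting a living configuration
target_P = 0.5    # a "realistic" chance of life arising, for the sake of argument

# Rearranging P = p / q gives the q needed to reach that target:
q_required = p / target_P
print(f"q must be of the order of {q_required:.0e}")   # ~2e-100: an extremely tight envelope
```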
To give life a realistic probability of coming about, the value of q, which is a measure of the tightness
of the constricting envelope, must be very small. It is this constricting
envelope which I identify as an object I call the spongeam. In the points below
I develop the concept of the spongeam stage by stage on the assumption that
standard ideas of evolution apply:
- We know that there is a subset of configurations which are stable enough to self-perpetuate and self-replicate (I refer to these as self-perpetuating, self-replicating structures, or SPSRSs). One fairly compelling corollary of this definition of SPSRSs is that they must be highly organised to carry out their tasks and therefore constitute a very tiny set of possibilities when compared to the overwhelming size of the configuration space consistent with our physical regime.
- An important question is this: How does the set of SPSRSs populate configuration space? What does this pattern look like when viewed across configuration space?
- If conventional evolution has occurred then certain conditions must be true of this set… Viz: Because conventional evolution is stepping through configuration space in small steps, it follows that for an SPSRS set to be favourable to evolution that set must be fully connected, in as much as any one member must be no further than some typical maximum distance from at least one other member of the set; let’s call that distance s.
- If conventional evolution has been actualised in paleontological history it follows that we are separated from the first ancestors by some number H of steps of typical size s. That is, there must be at least one unbroken path between us and those antecedent structures.
- But even if this connected set exists there is another necessary condition of evolution: If we select any particular SPSRS and then amend it in m steps of typical size b, always ensuring that each step moves away from the selected SPSRS, then the high order of SPSRSs implies that most paths will lead to non-viable structures – that is, structures that are not SPSRSs. Thus, we have a second necessary condition for evolution: Viz: compared to the number of outgoing paths which lead to non-viability there must be a sufficient number of paths connecting an SPSRS to the rest of the set for it to have a realistic chance of stepping to another slightly different SPSRS. Therefore, the set of SPSRSs must not only be connected but the connections must be sufficiently strong to give a realistic chance of random evolutionary stepping across the spongeam (a toy illustration of these connectivity conditions follows this list).
- There is one aspect of this matter which I consistently miss out for simplicity’s sake; this is the fact that populations of SPSRSs become their own environment and therefore there is a feedback relation between SPSRSs and their environment. This feedback is likely to be non-linear and therefore gives rise to chaos, thus compounding the difficulties besetting human efforts to get an analytical handle on the spongeam.
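The following toy sketch (a deliberately crude model: configurations are short bit strings, the SPSRS set is picked at random, and all the parameters are hypothetical) illustrates the kind of connectivity test that points 3 to 5 describe:

```python
# Toy model of the "spongeam": configurations are short bit strings, SPSRSs are a
# small designated subset, and two SPSRSs are linked if they lie within s mutational
# steps (Hamming distance) of each other.  All parameters are illustrative assumptions.
from itertools import product
import random

L = 10                      # length of a configuration (bit string) - hypothetical
s = 2                       # maximum step size for an evolutionary "hop"
random.seed(0)

space = [''.join(bits) for bits in product('01', repeat=L)]     # whole configuration space
sps = set(random.sample(space, 40))                             # tiny assumed set of SPSRSs

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

# Adjacency: an SPSRS can reach another if it is no more than s steps away.
neighbours = {c: [d for d in sps if d != c and hamming(c, d) <= s] for c in sps}

# Breadth-first search to see whether the SPSRS set forms one connected component.
start = next(iter(sps))
seen, frontier = {start}, [start]
while frontier:
    frontier = [d for c in frontier for d in neighbours[c] if d not in seen]
    seen.update(frontier)

print(f"{len(seen)} of {len(sps)} SPSRSs reachable from an arbitrary start point")
```

Depending on how densely the SPSRS set is sprinkled through the space, the search will reach either all of it or only a fragment – the latter being the failure mode I return to below.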
Ignoring point 6, the foregoing creates in my mind’s eye a kind of
static, spongey-looking structure in configuration space; that is, a network
linking the set of SPSRSs into a connected set. Viz:
If the spongeam exists it might be the multidimensional equivalent of this 3D sponge.
If you imagine this structure to be far more attenuated than is shown
here, then the result would be an approximation to what I envisage to be the
kind of structure in configuration space which is a necessary condition of
evolution. The next question is; how does movement take place from one point
to another in the spongeam thereby giving us an evolutionary dynamic? The
answer to that question (and the answer I’ve assumed above) is that in standard
evolution the motion is one of random walk – that is, diffusional. But random walk or
not, it is clear that the tight constraint which the spongeam puts on the
random motions gives the lie to niwrad’s claim that evolution is unguided. (See
also atheist Chris Nedin who, ironically, is also loath to admit that
evolution is an inevitably guided process. The ulterior motive here may be an
unwillingness to accept that any workable model of evolution, as William
Dembski has shown, must be tapping into some source of information).
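As a crude illustration of what diffusional movement on such a structure looks like, here is a minimal sketch (the little network below is a hand-made stand-in for the spongeam, not a model of anything biological) of an unbiased random walk that is nevertheless confined to the connected set:

```python
# A minimal sketch of diffusion-like movement across a spongeam-like network.
# The network below is a small hand-made stand-in for the connected SPSRS set;
# its shape is purely illustrative.
import random

random.seed(1)

# Hypothetical adjacency: each SPSRS lists the SPSRSs reachable in one small step.
spongeam = {
    'A': ['B'], 'B': ['A', 'C'], 'C': ['B', 'D', 'E'],
    'D': ['C'], 'E': ['C', 'F'], 'F': ['E'],
}

position = 'A'
path = [position]
for _ in range(20):
    position = random.choice(spongeam[position])   # unbiased: every linked neighbour is equally likely
    path.append(position)

print(' -> '.join(path))   # a random walk confined to the network, never leaving it
```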
The spongeam is more fundamental than the fitness surface. The fitness
value is a kind of “field gradient” that permeates the network of the spongeam
and biases the random walk in places, but not necessarily everywhere (See here
for an example of an equation for biased random walk).
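I won’t reproduce the linked equation here, but the following sketch (my own illustrative weighting scheme, not the formula from the link) shows one standard way a fitness value can act as such a gradient, biasing which neighbour a walker steps to while leaving each step random:

```python
# Sketch of a fitness-biased random walk on a toy spongeam-like network.
# The network, the fitness values and the exponential weighting are all
# illustrative assumptions, not the equation referred to above.
import math
import random

random.seed(2)

spongeam = {'A': ['B'], 'B': ['A', 'C'], 'C': ['B', 'D'], 'D': ['C']}
fitness = {'A': 0.1, 'B': 0.3, 'C': 0.6, 'D': 1.0}   # a hypothetical "field" over the network
beta = 2.0                                           # bias strength; beta = 0 recovers pure diffusion

def step(current):
    options = spongeam[current]
    weights = [math.exp(beta * fitness[o]) for o in options]   # fitter neighbours are favoured
    return random.choices(options, weights=weights)[0]         # ...but the choice stays random

position = 'A'
path = [position]
for _ in range(15):
    position = step(position)
    path.append(position)

print(' -> '.join(path))   # the walk drifts towards D yet remains a random walk
```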
The above points, 1 to 6, are an enunciation of the very general
mathematical conditions which are logically necessary for conventional evolutionary
processes to take place. But I must voice my standard disclaimer here: My own bet is
that the minuscule set of viable self-perpetuating replicators is unable to populate
configuration space with a sufficient density to form a connected set which
facilitates standard evolution. Doubts on this point are the main reason why I
have branched out into my speculative Melencolia
I series.
What is the evidence for the spongeam? Since the spongeam is a necessary
condition for conventional evolution the evidence for it is as strong (or weak) as the
evidence for evolution. As I’m not a biologist I try to avoid commenting on the
strengths or weaknesses of the observations for evolution. But these
observations are nevertheless important:
In view of the analytical intractability of the spongeam, the human
alternative to analytically proving or disproving the existence of the spongeam
from first principles can only be found in observation.
One important point I would like to make here is that the existence of the
spongeam would mean that evolution works by using front loaded imperative information rather than the backloaded information of a declarative
computation. I explain the difference between frontloading and backloading
in this
paper.
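For readers who don’t follow the link, the toy contrast below (entirely my own schematic illustration, not taken from that paper) captures the flavour of the distinction: front-loaded imperative information is packed into the rules that drive a process forward, whereas back-loaded declarative information sits in a specification of the end result which a search must then satisfy:

```python
# Toy contrast between front-loaded (imperative) and back-loaded (declarative) information.
# Both "programs" below are schematic placeholders, not models of real biology.

# Front-loaded / imperative: the rules are supplied up front and the outcome simply unfolds.
def imperative_growth(seed, rule, steps):
    state = seed
    for _ in range(steps):
        state = rule(state)        # all the information is in the rule and the seed
    return state

print(imperative_growth('x', lambda s: s + 'x', 5))   # 'xxxxxx'

# Back-loaded / declarative: the target is stated and a search must find something satisfying it.
def declarative_search(candidates, target_test):
    for c in candidates:
        if target_test(c):         # the information is in the specification of the goal
            return c
    return None

print(declarative_search(['x', 'xx', 'xxxxxx'], lambda s: len(s) == 6))   # 'xxxxxx'
```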
Atheist Joe Felsenstein clearly does not accept evolution to be an “unguided engine of accidents” and
understands that it must be a front loaded process; this is evidenced in the
posts I did on his comments about the work of IDists Dembski, Ewert and Marks
(See here
and here).
Although Felsenstein may not think in terms of the spongeam he has nevertheless
shown it to be arguable that conventional physics is responsible for the depository
of information inherent in fitness surfaces. In the comments sections of the
posts I have just linked to, he writes:
If the laws of physics are what
is responsible for fitness surfaces having "a huge amount of
information" and being "very rare object[s]" then Dembski has
not proven a need for Intelligent Design to be involved. I have not of course proven
that ordinary physics and chemistry is responsible for the details of life --
the point is that Dembski has not proven that they aren't.
Biologists want to know whether
normal evolutionary processes account for the adaptations we see in life. If
they are told that our universe's laws of physics are special, biologists will
probably decide to leave that debate to cosmologists, or maybe to theologians.
The issue for biologists is
whether the physical laws we know, here, in our universe, account for evolution.
The issue of where those laws came from is irrelevant to that.
Given that Dembski and colleagues appear to be working with an ancillary non-immanent concept of intelligence, Joe Felsenstein’s comments above are true: Dembski and co, using their
explanatory filter epistemic, work within an Intelligence vs natural forces conceptual framework. This
framework is valid for ancillary intelligences
such as humans or aliens. Therefore, should Joe Felsenstein plausibly
demonstrate that imperative information in the laws of physics is efficacious
in the generation of life, the assumed position of many IDists becomes
problematical. These IDists have implicitly committed themselves to the view
that the imperative information in the laws of physics is classed as “natural”
and is therefore incapable of generating life. The motive behind this position is
that if physics should be unable to account for life (as they believe) they can
then invoke their explanatory filter to argue that life is the work of an ancillary
intelligence, or if you like an intelligence
of the gaps. However, for Felsenstein the imperative information in physics
is more likely to be due to some unknown physics
of the gaps.
It should be noted here that some IDists believe the imperative information
needed to generate life is provided in punctuated interventional installments
rather than in one grand slam act of Divine fiat or by the one-off creation of
the grand narrative of physics. See for example the posts here
and here.