William Dembski and Robert Marks’ latest published paper can be found by following a link from this Uncommon Descent post. Its content is very reminiscent of this paper which I considered here, here and here*. In this latest paper, however, Marks and Dembski shift the focus to what they call Bernoulli’s principle of insufficient reason (PrOIR), a principle that justifies their assumption of equal a priori probabilities over a space of possibilities. It is likely that they feel the need to justify this principle further because their conclusions regarding the “conservation of information” stand on this assumption. I raised this very issue in this discussion on UD where I said:
Yet another issue is this: when one reaches the boundary of a system with “white spaces” beyond its borders, how does one evaluate the probability of the system’s options and therefore the system’s information? Is probability meaningful in this contextless system? I am inclined to follow Dembski’s approach here of using equal a priori probabilities, but I am sure this approach is not beyond critique.
The choice of using Bernoulli’s PrOIR arises when one is faced with a set of possible outcomes for which there is no known reason why one outcome should be preferred over another; on this basis Bernoulli’s PrOIR then assigns equal probabilities to these outcomes. However, probability (as I proposed in my paper on probability) is a quasi-subjective variable and thus varies from observer to observer and also varies as an observer’s knowledge changes. In particular, probability estimates are liable to change if knowledge of the context in which the outcomes are embedded increases. For example, consider a six-sided die. Initially, if we know nothing about the die then using PrOIR we may assign an equal probability of 1/6 to each side. However, if, after, say, examining the die, we find it to be biased, or find that it repeatedly returns lopsided frequencies when it is thrown, then these larger contexts of observation have the effect of weighting the probabilities of the six possible outcomes. Thus given this outer context of observation we find that we can no longer impute equal probabilities to the six possible outcomes. But, and this is Dembski and Marks’ essential point, the cost of buying better than PrOIR results on the die is paid for by an outer system that itself has improbable lopsided probabilities. From whence comes this improbable skew on the outer system? This skew can only itself come from a wider context that itself is improbably skewed, and so on. According to Dembski and Marks it is not possible, in realistically sized systems, to destroy this improbable loading. Ergo, some kind of conservation law seems to be operating, a conservation law Dembski and Marks call the conservation of information.
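To make the bookkeeping concrete, here is a minimal sketch (in Python, and only as an illustration of the arithmetic as I read their papers) of how the loading of the die gets priced in bits. The bias figure of 1/2 for the six is purely hypothetical:

```python
from math import log2

def surprisal(p):
    """Self-information (in bits) of an outcome with probability p: -log2(p)."""
    return -log2(p)

# Under Bernoulli's PrOIR we know nothing about the die, so each face gets 1/6.
p_uniform = 1 / 6

# Suppose a wider context of observation reveals a bias: the six comes up half the time.
p_biased = 1 / 2

# Dembski and Marks' "active information" (as I understand their measure) is the gap
# between the information of the outcome under the null PrOIR assignment and under the
# biased assignment, i.e. log2(p_biased / p_uniform). Illustrative numbers only.
endogenous = surprisal(p_uniform)     # ~2.58 bits: cost of hitting the six "blind"
exogenous = surprisal(p_biased)       # 1.00 bit: cost of hitting the six with the loaded die
active_info = endogenous - exogenous  # ~1.58 bits supplied by whatever loaded the die

print(f"Endogenous information: {endogenous:.2f} bits")
print(f"Information with the loaded die: {exogenous:.2f} bits")
print(f"Active information bought by the bias: {active_info:.2f} bits")
```

The point is simply that the better-than-chance performance of the loaded die has a price in bits, and that price has to be paid by the outer system that did the loading.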
I believe Dembski and Marks’ essential thesis to be unassailable. Bernoulli’s PrOIR can be used to set up a null hypothesis that leads one to conclude that the extremely remote probabilities of living structures point to a kind of loaded die in favour of life. Thus evolution, if it has taken place, could not have happened without biasing its metaphorical die with what Dembski and Marks call “active information”. In fact, as I discussed in this post, even atheist evolutionists, in response to one of Dembski and Marks’ earlier papers, admit that evolution must be tapping into the resources of a physical regime loaded in favour of life, most likely by the laws of physics; but from whence came our particular physical laws? For these atheists, however, this is effectively a way of shelving the problem by projecting it onto the abstruse and esoteric realm of theoretical physics where it awaits solution by a later generation of scientists. Hiding up the mystery of the origins of living things in the ultimate genesis of the laws of physics allows atheists to get on with their day-to-day lives without any worries that some unknown intelligence could be lurking out there.
It is ironic that multiverse theory is a backhanded acknowledgement of Dembski & Marks’ essential thesis. Speculative multiverse theory assumes that in an all but infinite outer context the butter of probability is spread very thinly and evenly. Hence, because no configuration is given a probabilistic bias over any other, it seems unlikely that there is an intelligence out there fixing up the gambling game in favour of life. But if we are to conclude that we can shrug off life as “just one of those chance things”, probability theory requires the multiverse to be truly huge. These immense dimensions are required to offset (or “pay for”, in D&M’s terms) the minute probability of our own apparently “rigged” cosmic environment. That the multiverse must be so extravagantly large in order to make our highly improbable context likely to come up in the long run, so to speak, is eloquent testimony of just how active Dembski and Marks’ active information must be if our own cosmos should in fact prove to be a one-off case.
We must acknowledge of course that in the final analysis Dembski & Marks are anti-evolution ID theorists and therefore have their sights on “evilution”. They conclude that evolution, if it is to work in realistic time, must be assisted by a large dollop of active information; that is, the die must be loaded in its favour. As I see it there is no disagreeing with this general conclusion, but the irony is that just as the multiverse theorists acknowledge the concept of active information in a backhanded way, Dembski & Marks provide us with the general conditions of the scenario that evolution needs in order to succeed.
For Dembski & Marks’ paper, like all the papers on “active information” I am aware of, provides no killer argument against evolution. In fact, if anything, it points to the general criteria that must be fulfilled if evolution is to work. They tell us that if evolution has happened then somewhere there is a bias, an improbable skew hidden up in the system. The first port of call in the search for this bias is, of course, the laws of physics. But as a rule, anti-evolution ID theorists will try to tell us that simple equations can’t create information. Their argument, as I recall it, seems to run roughly along the following lines.
1. The laws of physics are to be associated with necessity.
2. Anything that is necessary has a probability of 1.
3. The information content of anything with probability 1 is zero.
4. Therefore equations have no information content.
5. Information is conserved. (I give this point a qualified acceptance.)
6. Therefore, from 4 and 5, it follows that equations can’t create information.
The main fault with the above argument is point 1. Christian anti-evolution ID theorists often seem to be subliminal dualists, and this inclines them to set the so-called “natural forces” over and against the superior creator God. These inferior “natural forces” they inappropriately refer to as “chance and necessity”. The so-called “necessity” here is identified with the laws of physics, or with succinct mathematical stipulations in general. But such mathematical stipulations are themselves an apparent contingency and, as far as we are concerned, do not classify as necessity; therefore they can embody information.
To see how laws can be the depository of information, it helps to understand that the quantity “information”, as Dembski and Marks use it (= −log[p]), can only pick up high improbabilities; it is not very sensitive to the exact details of the source of those improbabilities. This is because it lumps together a field of independent probabilities into a single quantity by summing the logarithms of those probabilities. Therefore “information” as an index cannot distinguish between single probabilities and complex fields of probability. In short, information is not a good measure of configuration. Thus a single item of probability, if it is sufficiently improbable, can embody as much information as a collection or configuration of probabilities. This means that although it is true that equations don’t create information, they may yet be the depository of information in their own right if they are themselves highly improbable items; and this can be the case if those equations have been selected from a huge space of possibility over which Bernoulli’s PrOIR is applied. Equations are far from what anti-evolutionist ID theorists call “necessity”. In fact irony is piled upon irony in that we need look no further than Dembski and Marks’ paper for evidence that relatively simple mathematical stipulations may be the depository of active information.
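As a toy illustration of this point (a sketch only, with made-up numbers), the code below compares the information content of one highly improbable “simple” item with that of a configuration of many independent coin tosses; the −log measure returns the same figure for both and so cannot tell them apart:

```python
from math import log2

def information_bits(probabilities):
    """Total self-information of a set of independent outcomes: sum of -log2(p)."""
    return sum(-log2(p) for p in probabilities)

# A single "simple" item drawn from a space of 2**100 equally likely possibilities
# (Bernoulli's PrOIR applied to that space) -- e.g. one particular equation picked
# from a huge space of possible equations. Purely illustrative numbers.
single_item = [2 ** -100]

# A complex configuration: 100 independent fair coin tosses, each with probability 1/2.
configuration = [0.5] * 100

print(information_bits(single_item))    # 100.0 bits
print(information_bits(configuration))  # 100.0 bits -- the index cannot distinguish the two
```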
In Dembski and Marks’ latest paper we read this:
Co-evolutionary optimization uses evolutionary tournament play among agents to evolve winning strategies and has been used impressively to identify winning strategies in both checkers [25] and chess [26]. Although co-evolution has been identified as a potential source for an evolutionary free lunch [69], strict application of Bernoulli’s PrOIR suggests otherwise.
Dembski and Marks then go on to rightly observe that such systems have a built-in metric of gaming success and that this “loads the die” with active information:
Claims for a possible “free lunch” in co-evolution [69] assume prior knowledge of what constitutes a win. Detailed analysis of co-evolution under the constraint of Bernoulli’s PrOIR remains an open problem.
Once again I agree with Dembski and Marks; the co-evolutionary game must be suitably loaded for complex game-winning strategies to evolve. But this is to miss the giveaway point here. It seems that all one need do is define some succinct game-playing rules, impose a constraint in the form of some relatively simple game-winning metric, and then, given these two constraints, the “thermal agitations” of randomness are able to find those complex game-winning strategies in polynomial time. A presumed corollary here is that the space of gaming strategies is reducibly complex. Thus it follows that the stipulation of the rules of the game and the selection metric are sufficient to define this reducibly complex space of game-winning strategies. These stipulations have the effect of so constraining the “search space” that the “random” agitations are no longer destructive but become the dynamic that seeks out the winning solutions.
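To show the flavour of what is being claimed here (and only the flavour; this is a toy of my own construction, not the checkers or chess systems Dembski and Marks cite, and the opponent bias is invented), the sketch below stipulates a trivially simple “game”, a simple winning metric, and then lets random mutation plus selection do the rest:

```python
import random

# A toy "game": rock-paper-scissors against an opponent with a fixed (hypothetical) bias.
MOVES = ["rock", "paper", "scissors"]
BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}
OPPONENT_BIAS = {"rock": 0.6, "paper": 0.3, "scissors": 0.1}  # made-up figures

def win_rate(strategy):
    """The simple 'game-winning metric': expected probability of winning a round."""
    return sum(strategy[m] * OPPONENT_BIAS[BEATS[m]] for m in MOVES)

def mutate(strategy, step=0.05):
    """Random 'thermal agitation': jiggle the strategy and renormalise it."""
    new = {m: max(1e-6, strategy[m] + random.uniform(-step, step)) for m in MOVES}
    total = sum(new.values())
    return {m: p / total for m, p in new.items()}

# Start from the PrOIR-style "know nothing" strategy: equal probabilities.
strategy = {m: 1 / 3 for m in MOVES}

for generation in range(2000):
    candidate = mutate(strategy)
    # Selection by the winning metric: keep whichever strategy scores better.
    if win_rate(candidate) > win_rate(strategy):
        strategy = candidate

print({m: round(p, 2) for m, p in strategy.items()})  # converges towards always playing paper
print(round(win_rate(strategy), 2))                   # approaches the opponent's rock bias, ~0.6
```

The point of the toy is merely this: the rules plus the metric, and nothing else, constrain the random search enough for it to home in on a winning strategy quickly.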
The foregoing is an example of an issue I have repeatedly raised here and have even raised on Uncommon Descent. The issue concerns the question of whether in principle it is possible to define complex structures (such as living things and game-playing strategies) using relatively simple reductionist specifications and then find these specified complex structures in polynomial time. The generation in polynomial time of complex fractals and pseudo-random sequences are examples of a polynomial-time map from reductionist mathematical specifications to complex outputs. In the checkers example the reductionist specifications are the gaming rules and the metric that selects successful strategies, and together these define a complex set of winning game strategies that can be arrived at in polynomial time using an algorithmic dynamic that employs random “thermal agitations”. Evolutionary algorithms that identify game-playing strategies are an example where simple mathematical stipulations map to complex outputs via a polynomial-time algorithm. Of course, the fact that evolutionary algorithms work for games doesn’t necessarily mean that the case is proved for biological evolution given the laws of physics, but the precedent set here certainly raises questions for me against the anti-evolutionist ID theorists’ tacit assumption that evolution in general is unworkable. But at the very least it seems that evolution does have an abstract “in principle” mathematical existence.
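The fractal and pseudo-random cases can be made concrete in a few lines. The sketch below iterates the logistic map, a one-line mathematical stipulation that nevertheless churns out an effectively random-looking, complex sequence in time proportional to the length of the output (the parameter values are just the standard illustrative ones):

```python
def logistic_sequence(x0, n, r=4.0):
    """Iterate the one-line rule x -> r*x*(1-x); a simple specification with complex output."""
    xs = []
    x = x0
    for _ in range(n):
        x = r * x * (1 - x)
        xs.append(x)
    return xs

# A "reductionist specification" of a few symbols generates an arbitrarily long,
# effectively pseudo-random bit string in time linear in its length.
sequence = logistic_sequence(x0=0.2, n=20)
bits = "".join("1" if x > 0.5 else "0" for x in sequence)
print(bits)
```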
It is over this matter that I have unfinished business with the anti-evolution ID theorists. I have failed to get satisfaction from Uncommon Descent on this matter. I agree with the general lesson that, barring appeal to very speculative multiverse scenarios, information can’t be created from nothing – certainly not in polynomial time. But where I have an issue with the ID pundits of Uncommon Descent is over the question of whether or not it is possible to deposit the high information we observe in living structures in reductionist specifications such as the laws of physics. I suggest that in principle simple mathematical stipulations can carry high information content for the simple reason that maps which tie simple algorithms to complex outputs via a polynomial-time link are very rare in the huge space of mathematical possibility, and thus have a low probability (assuming PrOIR) and therefore a high information content. The confusion that I see amongst Uncommon Descent contributors is the repeated conflation of high information content with complexity; as I have said, information as a measure lumps together a configuration’s improbability into one index via a logarithmic addition over its field of probabilities, and thus it cannot distinguish single high-improbability items from complex configurations of independent probabilities. Elemental physics may therefore be the depository of high information.
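The rarity claim can be given a rough back-of-the-envelope form (a sketch under my own simplifying assumption of counting binary specifications rather than physical laws): there are at most about 2^(k+1) specifications of k bits or fewer, but 2^n possible outputs of n bits, so under PrOIR over the outputs the chance of landing on an output that is reachable from a short specification is at most roughly 2^(k+1−n). A short law that nails one particular long output therefore carries on the order of n − k bits of information. The figures in the snippet below are hypothetical:

```python
from math import log2

def information_in_short_specification(output_bits, spec_bits):
    """Rough lower bound (in bits) on the information carried by a specification of
    spec_bits bits that singles out one particular output of output_bits bits,
    assuming PrOIR (equal probabilities) over all 2**output_bits outputs."""
    # At most 2**(spec_bits + 1) - 1 binary specifications are this short or shorter,
    # so the PrOIR probability of hitting any output they can pin down is bounded by:
    p_reachable = min(1.0, 2 ** (spec_bits + 1 - output_bits))
    return -log2(p_reachable)

# Hypothetical figures: a 1000-bit "configuration" pinned down by a 100-bit "law".
print(information_in_short_specification(output_bits=1000, spec_bits=100))  # ~899 bits
```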
As I look at the people at the top of the anti-evolution ID community, like Dembski and Marks, it seems to me that the implications of their work are not fully understood by the rank and file ID supporter. For that supporter, it seems, leans on his academic heroes, quite convinced that these heroes provide an intellectual bulwark against the creeping “atheist conspiracy”. The impressive array of academic qualifications that these top ID theorists possess is enough, it seems, to satisfy the average ID supporter and YEC that the battle against “evilution” has been won. The very human attributes of community identification, crowd allegiance, loyalty and trust loom large in this scientific debate.
* I originally (and perhaps wrongly) attributed this paper to Dembski; it has no author name although its URL has Marks’ name in the directory path.