Sunday, June 17, 2012

The Grand Logical Hiatus


 I was very interested in this post on Uncommon Descent. It quotes the response of William Dembski to the question Is information a primitive concept on a par with matter and energy? Part of Dembski’s reply is: 

I would agree that information is a fundamental entity and am happy to put myself in this company. Perhaps it’s easier to take this view nowadays than in previous generations. We are awash with information. This is an information age. Moreover, we all know about information going through multiple transformations and embodiments.
When you send an email, your fingers type at a keyboard, producing ASCII text. This is then transformed into some other symbol string so that it can be moved across the Internet without error (using error-correcting codes). Then, that information needs to be reconstituted at the other end. 
The same sorts of processes are going on in life. Information is transmitted from DNA to RNA to amino-acid sequences. It’s not just that we see alphanumeric-type items arranged sequentially in biology, but that we see transformation from one such sequence to another. Although it no longer surprises us, it should surprise us that there is such a thing as a genetic CODE. 
Think about it—to code something is to take a character string in one form and transform it into another character string, where it can be useful in a way it wasn’t before the transformation. Alan Turing, Claude Shannon, and others were dealing with and developing the mathematics for such codes in the 1940s, and then, lo, in the 1950s we find that such codes are in all our cells. This is remarkable.

Firstly, some preamble: I think of Dembski as a moderate and reasonable evangelical Christian whose ideas need some dispassionate consideration; there is a core idea in his message that deserves some space. However, it is unfortunate that this idea is lost in the impassioned and enraged melee that the origins question has become. Worse, Dembski himself is a much admired figurehead for a group of disaffected anti-academic-establishment buffs who are in the main vociferously against the story of evolution as it is told by academia. It’s not that I am myself closed to scepticism about evolution (I follow UD because many of its posts contain worthy material), but the deep emotions and vested interests on both sides of the argument have made calm and detached study of the subject all but impossible.

Just what is at the bottom of this polarization and passion? I suppose that’s easily answered in one word: God. Dembski has come to represent a particular brand of intelligent design that I refer to as homunculus intelligent design. This view of ID posits an intelligent designer as a causal agent that is to be distinguished over and against natural causes: the subtext is that this designer interrupts the normal flow of causality in order to impose on matter patterns that are thought to be otherwise very unlikely to arise. It is therefore no surprise that to the angry militant atheists the appellation “intelligent design” is a technical-sounding abstraction that puts a thin scientific gloss on the magical interventions of God.

Underlying this high-adrenaline row is a conceptual fault line based on a folk philosophy which sees a sharp distinction between “divine causes” and “natural causes”. In this paradigm God is another causal agent to be contrasted over and against the “natural” causal agents of the physical world. Given this conceptual dichotomy it is understandable enough that the respective antagonists will want to maximise the role of either physical causes or the role of God as an alternative causal agent. For the atheists evolution is a must because it apparently fills an explanatory gap, thus helping to squeeze out God. In contrast the homunculus ID community are anxious to make the most of the role of God’s intelligence and thus are engaged in a vociferous anti-evolution campaign in order to reinstate the gap and offer God’s intelligence as the alternative explanation. For both sides it’s an XOR: either “God’s intelligence did it” or “natural causes did it” – the two categories are treated as if they are mutually exclusive. In this context it is no surprise that to many an atheist, science and theology are seen to be naturally in conflict. Both sides of this conflict hold this dichotomized paradigm of “the supernatural vs. the natural” in their heads and therefore both are well set up for the hostilities we see.

But whatever the shortcomings of the “God vs. Nature” paradigm and the demerits of the homunculus intelligent design community’s philosophy, there is, I believe, at an abstract level, a startling core truth at the heart of Dembski’s thesis. This becomes clear even if we accept the general features of the established evolutionary picture. What follows is my own rendering of this thesis. 

If the general evolutionary picture is to work then it requires that the conditional probability of the evolution of life, given the age, size and physical laws of the cosmos, must be realistic. That is:

 Prob(Evolution of life|Physical regime) ~ realistic 
Expression 1 

…where “physical regime” may be sufficiently expressed in the physics we currently understand, or perhaps supplemented with some further physics we have yet to understand; the point here is that the general idea of an evolutionary understanding of life makes the assumption that the physical regime, whatever it may be, is sufficiently resourced to have a realistic probability of generating life via so-called “self-organization”.

The protagonists in our God vs. Naturalism row both hold the same philosophy (subliminally) that naturalism is a causal category which, naturally (so to speak), is in contradistinction to the divine causal category. Therefore both sides will be very interested in whether or not the value of the above probability is realistic; for this value will be highly pertinent to their decision on whether “God did it” or “Natural causes did it”.  

But it is hardly a profound observation to remark that the above expression is a conditional probability and as such leaves us with deep questions about the origin of the physical regime; who or what “did that”? In fact the above expression, even if the probability it returns is realistic, simply defers the question of origins to the mystery of the life-generating efficacy of our physical regime; this question concerns not just its origin but also the present-tense, continuous maintenance of this regime.
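
This deferral can be made explicit with the product rule of probability. Assuming, for simplicity of illustration, that our physical regime is the only candidate regime in play, the absolute probability of life factors as:

Prob(Evolution of life) = Prob(Evolution of life | Physical regime) × Prob(Physical regime)

…so even if the first factor on the right is realistic, nothing absolute follows until the second factor – the probability of the regime itself – is addressed.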

With the aim of arriving at an absolute probability for life we could attempt to pose this question:

Prob(physical regime) = ? 
 Expression 2 

As it stands this quantity is ill-defined; our physical regime is not known to be nested within a higher level physical regime with the potential of generating it with some probability. So, how can we load the above expression with coherent meaning? The obvious and popular approach is, of course, the multiverse; here we envisage a higher level physical regime with the potential of stochastically generating universes with different physical characteristics. Thus, our question is now a little better defined: 

Prob(physical regime 0 | physical regime 1) = ? 
Expression 3

…where our own physical regime is assigned the number “0” and the higher regime in which it is nested is assigned the number “1”. This numbering scheme immediately suggests a “turtles all the way down” regress whereby we imagine a succession of nested physical contexts of higher and higher level. However, we could bundle this whole regress into level 1, thus embodying all the mysteries of origins (and maintenance) in this single super system. (Although if pushed this labelling procedure may give rise to a kind of Gödel diagonalisation paradox, an issue which I will neglect here.)

One of the perceived advantages of this multiverse nesting trick is that speculation allows us to make the multiverse so immense that, even with a uniform distribution of probabilities, each and every configuration is likely to make an appearance somewhere in the huge ensemble of universes. The lack of an uneven weighting in the distribution of probability means that no special class of configurations (such as the configurations of life) is conferred with a favourable probability. Any posited bias in the probability distribution in favour of life is likely to be very unpopular amongst atheists: it would be like handing an ace card to theists!

But even if we accept the existence of an indifferent multiverse that confers no favorable probabilities this still leaves us with at least two enigmas: 
a) The multiverse itself must be the subject of some given and particular mathematical system (Max Tegmark’s mathematical universe tries to get round this by positing the reification of every possible mathematical system)
b) It seems most odd that we find ourselves in a universe whose order persists: in most mathematically possible scenarios any chance order soon decays into disorder. 
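
Enigma (b) can be illustrated with a toy model of my own devising (purely illustrative – the numbers are arbitrary): start with a maximally ordered bit string and subject it to a featureless chance dynamic of random bit flips. The initial order quickly washes out toward a 50/50 disorder, which is the generic fate of chance order in an unweighted dynamic:

```python
import random

random.seed(0)  # fix the seed so the toy run is reproducible

n = 1000
state = [0] * n          # a highly ordered initial configuration: all zeros

for _ in range(20_000):  # a "law-less" dynamic: flip bits entirely at random
    state[random.randrange(n)] ^= 1

ones = sum(state)
print(f"{ones} ones out of {n}")  # the count drifts toward ~50% ones
```

The point is only that a dynamic which confers no favourable probability on ordered configurations erodes order rather than preserves it; the persistence of our universe's order is exactly what is surprising.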

Another approach to the meaning of expression 2 above is as follows: We assume that any physical regime can be expressed as a collection of mathematical functions. We then use the Church–Turing thesis to conjecture that it is possible to express these functions as an algorithm. A relatively well defined question can then be proposed, if not easily answered: any algorithm can, presumably, be reduced to a binary pattern of 1s and 0s. Thus, one can then define the space of all possible algorithms as a space of possible binary patterns. What fraction of this set of possible algorithms will have a realistic probability of generating life in a reasonable time?* Because the configurations of life constitute a very small fraction of all possible configurations I suspect that, correspondingly, a very small fraction of the possible algorithms will generate life in a realistic time. (Unfortunately a proof of this conjecture is certainly not at my fingertips!) We now assume the philosophy of equal a-priori probabilities applies to the class of possible algorithms; that is, in the absence of any reason to think otherwise each possible case is assigned an equal probability. Therefore, because the class of physical regimes favouring the generation of living structures in a realistic time is likely to be an extremely small fraction of the space of all possible algorithms, it follows that living structures have a very low absolute probability. There is, admittedly, a fair amount of (reasonable) conjecture in this argument; but it does suggest that our physical regime is a very particular one.
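
The equal a-priori argument can be made concrete with a toy enumeration (again my own sketch; the string length and the "marker" pattern are arbitrary stand-ins for "life-generating", not anything from Dembski). Treat every binary string of length n as a candidate algorithm, deem a string "life-generating" only if it contains a particular fixed pattern, and count the special class under a uniform weighting:

```python
from itertools import product

n = 16
marker = (1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 1, 0)  # an arbitrary 12-bit "special" pattern

def contains(seq, pat):
    """Return True if pat occurs as a contiguous block within seq."""
    return any(seq[i:i + len(pat)] == pat for i in range(len(seq) - len(pat) + 1))

total = 0
hits = 0
for bits in product((0, 1), repeat=n):  # enumerate the whole space of 2**n strings
    total += 1
    if contains(bits, marker):
        hits += 1

# Under equal a-priori probabilities each string gets weight 1/2**n, so the
# probability of the special class is simply its fraction of the space.
print(f"{hits}/{total} = {hits / total:.6f}")
```

Even in this tiny space the special class is well under one percent of the whole; as the strings lengthen and the special pattern grows, the fraction collapses exponentially, which is the intuition behind the conjecture above.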

The take-home lesson from the foregoing is that given the configurations of life it is very difficult to avoid some kind of mathematical cost for these configurations: if these configurations are to be regarded as having no specially conferred probability we are forced to extend the size of the multiverse to phenomenal dimensions; but even then we are left with enigmatic questions about the givenness of the multiverse and the apparent stability of our own small corner of it. Alternatively, if we are only able to accept the existence of the universe we observe we are then faced with the extreme absolute improbability of the physical regime needed to generate living structures. We are therefore caught between two huge costs; that is, either the immense ontology of a multiverse or the immense improbability of very particular conditions. Which cost suits our theology (or anti-theology) best? When Dembski affirms that information is a primitive he is likely to have in mind the latter case: that is, if we are to reject speculative ideas about a multiverse then high improbability has to be accepted as a cosmic given. High improbability equates to high levels of information, and therefore information must be taken as an axiomatic given, as is also, for all practical purposes, mass-energy.
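
The equation of improbability with information is just Shannon's standard surprisal measure: an outcome of probability p carries -log2(p) bits, so the smaller the probability, the larger the information content. A one-line illustration:

```python
import math

def info_bits(p):
    """Shannon surprisal: bits of information carried by an outcome of probability p."""
    return -math.log2(p)

# A physical regime selected with probability 2**-100 under equal a-priori
# weighting carries 100 bits of "primitive" information.
print(info_bits(2 ** -100))  # 100.0
```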

As we have seen, and in fact as Dembski himself would likely tell us, life is the product of very peculiar and special conditions; that is, it is necessarily the product of a given or "primitive" improbability. This improbability is something that is all but logically impossible to expunge from our physical models. All our physical theories are ways of describing these axiomatic improbable conditions: the laws of science are a technique of succinctly embodying information about those conditions. This information is axiomatic in the sense that it will never attain the level of logical necessity; a grand logical hiatus will always be found in our science, a hiatus that will always surprise us and leave us in wonderment. As Dembski implies in the quote above, we may express the content of this logical hiatus in different ways and it may transform itself into different informational forms, but it never really goes away. Mystery and wonderment are not something that science is in the business of removing; it’s just that the object of our wonderment has changed with history, and in modern times the mystery has taken on a new form.

But although I myself am inclined to swing behind Dembski on the information question I am far less inclined to support the homunculus intelligent design school of thought. In part this is because I believe the “Natural causes did it vs. God did it” paradigm is a false dichotomy. Those who hold on to this dichotomy are, I submit, still clinging to a notion of the mystery of origins as it has been expressed in times past.

* If Wolfram's "computational equivalence" conjecture is right then all algorithms that generate complex output will, if given enough time, visit every computation. Hence with an eye on Wolfram I have added the qualification "realistic time".