Friday, June 25, 2010

Back of the Envelope Mathematical Model Number 2: Bayes' Theorem.


Bayes: a Man of the Cloth.

Some years ago I came across an argument for the existence of God that used Bayes' Theorem. In fact the theorem seems to have been associated with arguments for the existence of God from day one; for example see the end of the wiki page on Bayes' Theorem. Before I can consider this argument we first need to get a handle on the theorem.

A straightforward derivation of Bayes' Theorem can be given using a frequency understanding of probability. To this end we imagine we have a total of T items. Let these T items have properties A and B distributed over them. Let the number of items with property A be F(A) and the number of items with property B be F(B).

Now, there may be some items that possess both properties A and B. If we take the set of items that possess property B, then let the number of items in this set possessing property A be represented by F(A|B). If we now take the set of items that possess property A, then let the number of items in this set possessing property B be represented by F(B|A).

These quantities can be pictured on a Venn diagram in which the circles for A and B overlap, the overlap region containing the items that possess both properties.

One fairly obvious deduction is that these two counts are equal, since each simply counts the items possessing both A and B:

F(A|B) = F(B|A)

Thus, given these quantities, we now imagine that we throw the T items into a bag, agitate them, and then select an item at random. If we assume a priori equal probabilities, then using the Laplacian definition of probability we have the probabilities P(A), P(B), P(A|B) and P(B|A), and Bayes' Theorem follows as below:
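Using the Laplacian definition (probability = number of favourable cases divided by total number of cases), one way of setting out the intervening steps is as follows:

P(A) = F(A)/T, P(B) = F(B)/T, P(A|B) = F(A|B)/F(B), P(B|A) = F(B|A)/F(A)

Since F(A|B) = F(B|A), it follows that:

P(A|B) F(B) = P(B|A) F(A)

Dividing both sides by T converts the frequencies F(A) and F(B) into the probabilities P(A) and P(B):

P(A|B) P(B) = P(B|A) P(A)

which rearranges to give:

P(A|B) = P(B|A) P(A) / P(B)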



The latter expression is Bayes' Theorem. Notice the symmetry of the proof and, moreover, the symmetry of Bayes' equation, as one would expect given that A and B are interchangeable. I have arranged the derivation in order to bring out this symmetry.

Friday, June 18, 2010

Polarisation, Polarisation, Polarisation!

Some of us are non-plussed by an escalation in attempts to construct water-tight arguments

I have only just seen this post on Uncommon Descent, a post which came out in April. I am very gratified by the generally conciliatory tone of the post. The post was a response to Karl Giberson of Biologos, who the post claims has “offered an Olive Branch to ID”. Biologos is the Christian think tank founded by Francis Collins, an organisation that identifies itself with Theistic Evolution. In turn Thomas Cudworth, who wrote the post for Uncommon Descent, makes some very positive overtures. Moreover, he makes some significant statements:

William Dembski, for example, has said that ID does not in itself require gaps in natural causality, and therefore does not rule out a wholly natural process of evolutionary change, provided that the natural process is understood not as chance-driven but as designed. (I really wish Dembski would make his position here clearer - TVR)

ID is compatible with the view that new information was “input” into the biological realm at various points, e.g., at the origin of life, during the Cambrian explosion, at the origin of man, etc. But it does not require such a view.

All that ID insists upon is that the integrated complexity observed in biological systems is in important respects the product of design. The means by which the design was brought into nature – via a front-loaded scheme that operates naturalistically, via subtle indetectable steering, or via “spot miracles” to fill in the “gaps” – are not part of design theory per se.

It is precisely because ID focuses on design detection that it is compatible with many different views of how biological order emerged. Thus, ID is a “big tent”, which can embrace people who don’t accept macroevolution at all, and also those who accept chemical and biological evolution “from molecules to man”. What unites all ID proponents is not a particular account of origins, but a rejection of non-design in favor of design.

TE as such says nothing either way about the detectability of design, and ID as such says nothing either way about the occurrence or non-occurrence of evolution. Therefore, neither group needs, on definitional grounds, to deny the core belief of the other.
It is only those ID people who insist on rejecting evolution on principle, and only those TE people who insist that God’s design must not be detectable, that have no common ground.

In sum, I second Dr. Giberson’s proposal for a new exploration of what ID and TE have in common, rather than what separates them. I think the best TEs and the best ID people have a strong sense of design in nature that is not merely a subjective impression but which points beyond nature to a designer of nature.

Sounds good to me; it seems we have here a basis for reconciliation between Uncommon Descent and Biologos. So why can’t they get on? The answer, according to Cudworth: Polarisation, Polarisation, Polarisation!

But in between, there is an overlap zone, which I think that neither TEs nor ID people have fully explored, because of reckless past charges on both sides which have generated great mutual distrust.

And there is no better example of polarization and mutual distrust to be found than in this recent post on UD where it’s back to the belligerent business of evolutionists slugging it out with anti-evolutionists. Cudworth’s post, it seems, is already water under the bridge, soon forgotten.

Part of the deeper problem here may be the hold the philosophy of deism has on our culture. My theory is that both evolutionists and anti-evolutionists have to consciously resist a compulsion toward a default deism. This philosophy suggests that if the patterns displayed by the cosmos are mathematically describable then it follows that they are autonomous “mechanisms” in need of no maintenance and sustenance. Ironically even the most out and out anti-evolutionist seems to hold this view: In consequence (s)he has a pressing spiritual need to propose divine “interventions” in order to prove the presence of God in what (s)he otherwise perceives to be a self-sufficient machine world of mathematical pattern.

STOP PRESS 20/6/10
In a post headed “Theistic atheism –oops I meant theistic evolution” Denyse O’Leary quotes a fellow anti-evolutionist journalist:

To put it in terms of an equation, when atheists assure us that matter + evolution + 0 = all living things, and then theistic evolutionists answer, no, that matter + evolution + God = all living things, it will not take long for unbelievers to conclude that, therefore, God = 0.”

She goes on to say:

Right, exactly, that is the project of “theistic” evolution, so far as I can see. Helping theists get used to a world run by atheists and their values, while still hollering fer Jesus irrelevantly somewhere.

All good polarizing stuff that’s bound to further alienate the theistic evolutionists, thus illustrating my point. Moreover, we have a fine expression of the essentially deistic paradigm that some people are working to; namely the “equation”: matter + evolution + God = all living things. Note that this portrayal of the theistic evolutionist’s position conceives God’s work to be something different from the “natural forces” represented by “matter + evolution”. The Christian anti-evolutionist, who may still have a subliminal belief that “natural forces” have at least a nominal autonomy, reads “evolution + matter” as an upstart usurper of God’s sovereignty. (S)he thus wishes to minimize the “natural” terms in the equation by putting evolution + matter = nothing very much. But this is qualitatively little different from atheists setting God = 0. Deists and interventionists hold a common philosophy; they differ only in degree, not in quality.

As I once said on this blog: It is ironic that those who are so vocal about believing in "interventions" support a philosophy that has a close relation with deism: "N interventions" very easily turns into "Zero interventions" when faith falls away and N slides toward zero. If we have to use these mathematical metaphors, a better one to my mind than these simple linear equations is to think of God as a “zero point” operator, i.e. G(0) = natural history.



Business as usual on the polaris front

Wednesday, June 16, 2010

Problems in Young Earth Creationism: Star Light Travel Time

Young Earth Creationists have an astronomical problem on their hands; to say the least!

Young Earth Creationism insists that the Earth is no more than 10,000 years old; in fact the version of Creationism represented by the web site Answers in Genesis suggests an even shorter time span of around 6000 years. This view, needless to say, immediately creates an astronomical issue: Given that the stars and galaxies have distances well in excess of 10,000 light years, Answers in Genesis need to explain how the light from distant stars could have reached the Earth in less than 10,000 years.

Answers in Genesis, to their credit, do at least attempt to tackle such problems. In contrast I think you will find that there are some hyper-fundamentalist Christian groups out there who take it for granted that their own interpretations of the Bible are the very words of God and for them it’s a case of “just believe” and hang any difficulties with the profanities of science. Thus, any move to probe their interpretations is regarded as an affront, tantamount to blasphemy. It is impossible to engage with these cognitively quarantined Christians. At least AiG are not in this category; they are prepared to dabble in some science. Let’s be thankful for small mercies.

The contemporary Young Earth Creationist movement owes much to the publication of the book “The Genesis Flood” by John Whitcomb and Henry Morris in 1961. It is in this volume that we find an early attempt to address the problem of star light. On page 369 the plain interpretation that light from distant stars and galaxies must have taken far longer than 10,000 years to reach us is referred to as a “contention”:

But this contention of course begs the question. It constitutes an implicit denial that the universe could have been created as functioning entity. If creation has occurred at all then it is reasonable that it would be a complete creation. It must have had an “appearance of age” at the moment of creation. The photons of light energy were created at the same instant as the stars from which they were apparently derived, so that an observer on the Earth would have been able to see the most distant stars within his vision at that instant of creation.

In other words, according to Whitcomb and Morris, photons of light were created in mid flight, giving the false impression that they originated from those distant cosmic objects. Perhaps sensing that this concept compromises the integrity of God’s created work, Whitcomb and Morris hedge by mooting another idea that I have seen bandied about in Young Earth Creationist circles; namely, a variable speed of light or “VSL”. They promote a paper that questions special relativity and conclude that because such ideas can be submitted in scientific circles, this demonstrates:

… that astronomy has nothing really definite as yet to say about the age of the universe.

The approach here is the tried and tested science of negation, a method very common to the YEC movement; do all you can to muddy the waters of science by attempting to show that it is all speculative, inconsistent and highfalutin theory and then leave the scene, returning uncritically to your dogmatic interpretations of Biblical texts. If all else fails you can always retreat into the view that the Cosmos was conjured up as seen, “just like that”, 6000 years ago, a view that is all but irrefutable; for any observations to the contrary can be put down to mere appearances.

Of course it is very easy to take the view that science, which necessarily embodies the use of working assumptions and/or postulations whose veracity makes the world providentially knowable, cannot be trusted to tell us anything. This negative approach is unlikely to be applied by YECs to their Biblical interpretations, interpretations which also necessarily rest on the working assumptions needed to make the Bible comprehensible. However, when YECs turn from negative criticism to positive theory construction we can apply similar standards of criticism to their work. Therefore it was with great interest that I found an attempt on AiG to provide a scientific rationalization of the star light problem. (See here and here).

Firstly, it seems that AiG have moved away from the idea of a variable speed of light. On this page on their web site they warn Christians about using “far out claims”, one of those “far out claims” being the very idea that the speed of light has drastically changed:

We can speculate about a large-scale change of light speed in the past, but evidence is lacking. In addition, any alteration of light speed would affect several other constants of nature, but evidence of these changes is also lacking.

Young Earth Creationists always have the option of opting out of science by declaring that God conjured up the Cosmos, as seen, “just like that”, 6000 years ago. Such a concept is not very amenable to scientific probing and it is therefore no surprise that YECs who hold such views are not taken seriously. But it seems that the YECs at AiG want to show they are willing to put their name to at least some semblance of scientific inquiry. So rather than appear to be just another insular religious ghetto of fideist ultras they are prepared to get out on their bikes and look for some scientific kudos. Therefore on the links I have provided, AiG put forward for consideration the ideas of Russ Humphreys (See picture above). Make no mistake about it, this is dangerous ground for AiG; it means that they are coming out from under the cover of ad hoc creative conjuring into the line of fire of scientific inquiry. No wonder Christian fideists proactively rail against this sort of activity as an unspiritual courting of worldly science. AiG have a lot to lose.

In a later post I’ll have a look at Humphreys’ ideas.

Sunday, June 13, 2010

I’m Checking this One Out

I have been following with great interest the posts on Robert Sheldon’s blog, a blog found on the neo-conservative web site Townhall.com. Sheldon’s ideas about OOL (the origin of life) interest me and I have added comments to some of his posts. Some of those comments can be found here, here and here.

Sheldon’s right wing sympathies also interest me as they call attention to the strange complementary inversion of philosophies that exists between the far right and far left: Those on the far left are likely to believe that the decentralised algorithms controlling matter, contrived with no apparent foresight, have created the organized complexities of life; but they are less inclined to allow decentralized processes to rule the socio-political realm, where instead centralised “big government” planning is preferred. In contrast those of the far right often incline toward the view that very non-local processes, processes with foresight and intelligence, have planned and created life; but they do not want to see a centralized planner in the realm of the socio-political and instead believe that the localized unplanned decisions of the market place lead to an ordered and free society.

Sheldon’s take is particularly interesting because it is satisfyingly anomalous, idiosyncratic and exotic. Moreover it shatters the mold: His position on the creation of life looks to be intermediate: If we could quantify intelligence with a quantity I shall call “E”, then I interpret Sheldon’s theory to be the equivalent of postulating that “evolution” can make jumps of intermediate values of E. This entails a process of “evolution” that would look a little bit like human history, where the limited quantum of human intelligence beats a path through time. Sheldon might just possibly be onto something.

Sunday, June 06, 2010

Dembski, McIntosh and “Evilution”.

There is a rumour that the Flat Earth society have sided with William Dembski.

This post on Uncommon Descent publishes the abstract of a paper by Andy McIntosh. The paper, according to William Dembski, is about the “thermodynamic barriers” to Darwinian evolution. Here are my comments on various parts of the abstract:

McIntosh: The theory of evolution postulates that random mutations and natural selection can increase genetic information over successive generations.

My Comment: No. Assuming the concept of “information” here is similar to that used by Dembski (that is, –log[p]), then evolution, as currently conceived, is not a process that creates information. As I have tried to show in this blog, evolution converts information from one form into another: It does not pretend to transform an absolute improbability into a realistic probability. Being an anti-evolutionist, McIntosh is likely to be anxious to cast “evilution” into the mold of a pretender to the throne of information creator.
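For concreteness, here is a minimal Python illustration of the –log[p] measure mentioned above; the base-2 logarithm, the 100-character length and the 20-letter alphabet are my own illustrative choices, not anything taken from Dembski or McIntosh:

import math

# Self-information ("surprisal") of an outcome of probability p, in bits: -log2(p)
def information_bits(p):
    return -math.log2(p)

# Example: one specific 100-character string drawn uniformly from a 20-letter alphabet
p = 20.0 ** -100
print(information_bits(p))   # about 432 bits: the smaller p is, the larger the information value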

McIntosh: By [evolution] it is proposed that a particular system can become organised at the expense of an increase in entropy elsewhere. However, whilst this argument works for structures such as snowflakes that are formed by natural forces, it does not work for genetic information because the information system is composed of machinery which requires precise and non-spontaneous raised free energy levels – and crystals like snowflakes have zero free energy as the phase transition occurs.

My Comment: No. A raised free energy is not in and of itself relevant as a road block to evolution. Osmosis is a process driven by an increase in entropy and yet it is capable of locally raising the free energy of a column of liquid in as much as the column may become higher and thus contain more potential energy. Much more pertinent than these raised free energies is the fact that the configurations of genetic information are, when placed in the right biological context, “meaningful”. As we know, purely natural forces ensure that a growing organism annexes increasing amounts of material into a state of raised free energy. In this case the information is propagated from preexisting biopolymers into the newly annexed material. Thus, the crucial issue is NOT raised free energies as such but the origin of the information that succeeds in propagating from one form of reification to another.

McIntosh: …. biological structures contain coded instructions which, as is shown in this paper, are not defined by the matter and energy of the molecules carrying this information. Thus, the specified complexity cannot be created by natural forces even in conditions far from equilibrium. The genetic information needed to code for complex structures like proteins actually requires information which organises the natural forces surrounding it and not the other way around – the information is crucially not defined by the material on which it sits.

My Comment: No. Once again I see hints of the common anti-evolutionist expectation that information must be reified on some material substrate; because they can’t find such a reification they wrongly conclude that the information doesn’t exist in our divinely selected cosmic set up. Anti-evolutionists refer to the processes of that cosmic set up as “natural forces”, a term they use pejoratively in the sense of “mere natural forces”. Trouble is, natural forces generate life every time life is propagated in the womb or egg. The issue is not whether natural forces can generate life, since clearly they can, but just where these natural forces get their information from to do so: As I have suggested in this blog, this information could conceivably be found in the platonic world of configuration space and the selection of the function T(L). (See my last but one blog.)

McIntosh: The fundamental laws of thermodynamics show that entropy reduction which can occur naturally in non-isolated systems is not a sufficient argument to explain the origin of either biological machinery or genetic information that is inextricably intertwined with it.

My Comment: Yes, I completely agree. But contrariwise, the law of overall increasing entropy is not a sufficient argument to eliminate evolution from the inquiry. This is because entropy is a measure of the number of microstates consistent with a macrostate. It is a quantity measured given the ordering effects of the physical laws constraining the system, and it is therefore not very sensitive to measures of absolute disorder; a system can move to its most disordered state in terms of entropy and yet, because of physical constraints, still be a very ordered system in absolute terms. It is ironic that it is precisely the specter of an intelligent designer lurking behind the cosmic scene that prevents the elimination of evolution from this inquiry: The cosmic set up could be a cleverly selected design option of an intelligence capable of conceiving the abstract function T(L).
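A toy numerical sketch of that point in Python; the numbers and the "constraint" are purely illustrative inventions of mine, not anything drawn from McIntosh's paper:

import math

# Entropy as the log of the number of microstates consistent with a macrostate
# (in units of Boltzmann's constant).
def entropy(num_microstates):
    return math.log(num_microstates)

N = 100                                # 100 two-state "cells"
unconstrained_max = entropy(2 ** N)    # every one of the 2^100 configurations allowed
constrained_max = entropy(2 ** 10)     # a hypothetical law confines activity to 10 cells

print(unconstrained_max, constrained_max)
# Even at its maximum entropy the constrained system occupies only a tiny,
# highly ordered corner of the full configuration space.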

McIntosh: This paper highlights the distinctive and non-material nature of information and its relationship with matter, energy and natural forces. It is proposed in conclusion that it is the non-material information (transcendent to the matter and energy) that is actually itself constraining the local thermodynamics to be in ordered disequilibrium and with specified raised free energy levels necessary for the molecular and cellular machinery to operate.

My Comment: Yes and no. ‘Yes’ because evolutionary information may be located in the immaterial and abstract space pertaining to the function T(L). It is this function which may constrain thermodynamics to such a degree that increasing entropy (which ironically becomes the diffusional dynamic that hunts for life) is capable not only of locally raising free energy levels (as in osmosis) but also of raising the probability of the right configurations. The function T(L) transcends the decay of the second law. ‘No’ because, once again, McIntosh introduces the same lame materialistic aunt sally of “natural forces” as the “informationless” elemental pretender to the Creator’s crown. McIntosh and his fellow anti-evolutionists can therefore picture themselves as crusaders fighting the good fight against the straw man of “naturalism”.

Summing up Comments: From McIntosh's abstract alone, things are not looking good for his paper. I’m not convinced that people like Dembski and McIntosh are able to eliminate status-quo evolution from the inquiry for the right reasons. Instead they continue to try to short cut the issue by attempting to cast the debate into the dualistic mold of “natural forces” vs. the Creator; an approach that fails to do justice to the fact that those natural forces are themselves the Creator’s creation. I suspect that if I read McIntosh’s paper I would find nothing technically wrong with it: The devil is likely to be in a subliminal dualistic “naturalism vs. creation” spin that he would impose on the argument, a spin we can start to detect in the abstract. Let me make it clear that none of this necessarily means that evolution is actually a viable process given the particular physical regime of our cosmic theatre – the constraint embodied in T(L) may not be sufficiently great for the computational engine of entropic diffusion to return a realistic probability for the formation of life.

Many ID theorists are deeply immersed in the politics of the evolution vs. anti-evolution debate and from their embattled perspective "Intelligent Design" and "anti-evolutionism" are interchangeable terms. For them “evolution” and intelligent design are an irreconcilable dichotomy, and this in turn is a reflection of a polarized conceptual filter that sees the whole debate through a "naturalism vs. creator" paradigm. Therefore I have grave doubts about the ability of McIntosh and Dembski to handle the philosophical and theological issues correctly. But for some followers lacking in expertise, Dembski and McIntosh are the people Christians are supposed to get behind. In this thread on Network Norwich & Norfolk an anti-evolutionist correspondent started pushing McIntosh’s view on thermodynamics (see towards the end of the thread). The said correspondent clearly had little understanding of the subject and presumably naively thought McIntosh had enough authority to end the argument, which of course he doesn’t. This correspondent was, I believe, a young earth creationist; evidence that the anti-evolutionists are a broad and uneasy coalition of young earth and old earth Christians, perhaps united by the abuse they get from the scientific establishment. It is not surprising, then, that when one of their number succeeds in getting a peer reviewed paper published they hang out the flags.

This reverend gentleman is clearly not connecting with his passions; he's far too balanced

Polarisation passion feeds. Passion polarisation breeds. Polarisation is passion's cause, for crusade and holy wars.


STOP PRESS
Of interest is this post by Denyse O'Leary on Uncommon Descent, where she accuses Biologos, the Christian science think tank founded by Francis Collins, thus:

I suggest that the real action now is with groups like Biologos, aimed explicitly at persuading Christians that they can maintain an intellectually respectable faith without believing that the universe (?) or life forms show evidence of design.

As I said above, "For them evolution and intelligent design are an irreconcilable dichotomy...". Christians who believe in evolution now stand accused by O'Leary of not giving the Divine designer credit and of believing instead in "natural forces". Little wonder that I am so loath to side with these anti-evolutionists; in fact, how can one side with them when there is a real danger of being spiritually incriminated in this way?

Tuesday, June 01, 2010

Mathematical Mysteries


Someone emailed me asking if I could make sense of a couple of mathematically oriented matters.
The first was the strange pervasive quality of the “Golden Number”, as exemplified by this YouTube piece:
http://www.youtube.com/watch?v=PjrK96wasDk
The second was some of the amazing symmetries and regularities one can find in arithmetic operations. For example:
1 x 8 + 1 = 9
12 x 8 + 2 = 98
123 x 8 + 3 = 987
1234 x 8 + 4 = 9876
12345 x 8 + 5 = 98765
123456 x 8 + 6 = 987654
1234567 x 8 + 7 = 9876543
12345678 x 8 + 8 = 98765432
123456789 x 8 + 9 = 987654321
Here are my comments on the issues raised:

1. The Golden Number:
I pondered this question as far back as 1968 when my maths teacher gave me a colourful coffee table book on maths. It had a section on the Golden Ratio and the amazing number of seemingly disconnected contexts in which it arose. Why was this theme running through nature?
The Golden Number arises wherever one finds the following quadratic equation:
x^2 - x - 1 = 0 (read x^2 as “x squared”)
Solving this quadratic (via the standard formula) gives the positive root x = (1 + √5)/2 = G, where G is the “golden number” = 1.618… etc. Thus any phenomenon where this equation is entailed will return a golden number. This immediately explains why the value of G appears in some contexts, viz:
1. For example, a requirement of snail shells is that during growth of the shell the snail need not adapt to any fundamental changes in the distribution of the weight of the shell – hence big snails look exactly like small snails; that is, snail shells are “self-similar”. This requirement, it can be shown, entails a self-similar logarithmic spiral, of which the “golden” spiral governed by the above quadratic is the best-known special case.
2. Another example is the golden rectangle. Its proportions are such as to fulfill the requirement that cutting a square off one end of the rectangle leaves a smaller rectangle with the same proportions as the original: if the sides are x and 1, the leftover piece has sides 1 and x - 1, and proportionate similarity requires x/1 = 1/(x - 1). Once again the same quadratic arises, thus entailing G. (The proportions of A4 paper, incidentally, answer to a different requirement, namely that halving the sheet preserves its proportions; that requirement entails x^2 = 2, i.e. the ratio √2 rather than G.)
3. It is also clear why the Fibonacci series returns increasingly better approximations to G as the series progresses. In this series we require n1 + n2 = n3. That is, we generate the next number n3 in the series by adding the two previous numbers in the series. Now, we are interested in the ratio of n3 to n2; in other words the value n3/n2, because this value, as the series proceeds, tends toward G.
Now: n3/n2 = (n1 + n2) / n2, by the definition of the Fibonacci series.
But: (n1 + n2)/ n2 = 1 + n1/n2
Thus: n3/n2 = 1 + n1/n2
Or: n3/n2 = 1 + 1 / (n2/n1)
In this latter equation n3/n2 is a better approximation to G than n2/n1, and as the series advances both ratios converge on the same limiting value, namely G. Replacing both n3/n2 and n2/n1 by G, we can write the last equation as:
G = 1 + 1/G
Which transforms to the quadratic we are looking for:
G^2 – G – 1 = 0.
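A few lines of Python (just a quick numerical check, not part of the derivation) show the successive ratios homing in on G:

import math

# Successive Fibonacci ratios converge on the golden number G = (1 + sqrt(5))/2
G = (1 + math.sqrt(5)) / 2
n1, n2 = 1, 1
for _ in range(15):
    n3 = n1 + n2                                   # next Fibonacci number
    print(n3, "/", n2, "=", n3 / n2, " error:", abs(n3 / n2 - G))
    n1, n2 = n2, n3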
The ubiquity of G, or rather the quadratic x^2 – x – 1 = 0, seems to be on a par with other themes running through nature, like the Boltzmann distribution, the Gaussian bell curve, the power law, and even the value of pi (see my recent blog on the subject here http://quantumnonlinearity.blogspot.com/2010/05/back-of-envelop-mathematical-model.html). So if the ubiquity of G is mysterious then so are these other mathematical forms. We trace back the existence of these mathematical themes to their origins in very similar starting conditions, conditions which thus lead to isomorphic mathematical outcomes. But once we spot the isomorphic underlying conditions the existence of the theme seems less mysterious. For example, random walk underlies the bell curve; consequently wherever we find randomness we will not be surprised to find a Gaussian bell curve. Likewise, any phenomenon governed by the quadratic x^2 – x – 1 = 0 will give us a golden ratio. But having said that, I have to admit that in the case of G the underlying isomorphisms are often not very clear and thus the presence of G can be rather mysterious; for example, I don’t know why the genetic algorithm generates logarithmic spiral patterns in plant flowers – perhaps it is something to do with growth self-similarity as in the snail shell. Perhaps a biologist could tell us.
Actually, one might claim that randomness (which is implicated in the Gaussian bell curve) is itself a very mysterious phenomenon: Koestler put it well in his book “The Roots of Coincidence”. After remarking on the remarkable fact that such diverse things ranging from nineteenth century German soldiers being lethally kicked by their horses to “Dog bites man” reports in New York, all obey the same statistical curves, Koestler goes on to say:
But does it (probability theory) really lead to an understanding? How do those German Army horses adjust the frequency of their lethal kicks to the requirements of the Poisson equation? How do the dogs in New York know that their daily ration of biting is exhausted? How does the roulette ball know that in the long run zero must come up once in thirty-seven times, if the casino is to be kept going? The soothing explanation that the countless minute influences on horses, dogs or roulette balls must in the long run "cancel out", is in fact begging the question. It cannot answer the hoary paradox resulting from the fact that the outcome of the croupier's throw is not causally related to the outcome of previous throws: that if red came up twenty-eight times in a row (which, I believe, is the longest series ever recorded), the chances of it coming up yet once more are still fifty-fifty.
It was this passage that helped spur me into studying randomness so intensely in the 1970s.
In my opinion whichever way we look we find a profound mystery; namely, the profound mystery that contingent things – that is, things with seemingly no logical necessity or aseity – simply exist. The issue I have with the video you linked to is that, in common with much transatlantic “Intelligent Design” thinking, only some phenomena are singled out and identified as betraying signs of design; the insinuation is that some things are somehow “natural” and therefore do not need Divine design, creation and sustenance. But to me it’s either all “natural” or it’s all “supernatural” – I’ll take the risk and plump for the latter, as the sheer contingency of existence, no matter how mathematically patterned, will forever remain utterly logically unwarranted (what I refer to as the great “Logical Hiatus”), and thus in my view a revelation of God’s creative providence.
The problem we often have with mathematics is that sometimes it looks like a kind of magic: It reminds me of that “think of a number” game where somebody in due course appears to “conjure up” the number you first thought of – it may at first seem that some sort of underlying magic is at work but in fact it is the underlying logic unperceived by us whose consequences surprise us. Likewise, the ubiquity of G at first looks very mysterious, as if some divine magician has come along and tweaked the universe here and there, thus betraying his presence by way of this magic number. It’s all very reminiscent of Arthur C. Clarke’s “Tycho Magnetic Anomaly” in 2001: A Space Odyssey, where a mysterious and yet clearly artificial object is found on the Moon; something very “unnatural” is discovered in a vista of “naturalness”. This 2001 scenario is basically the transatlantic paradigm of intelligent design; that is, of a God who is manifest largely in various inexplicable logical discontinuities found in the cosmos. But in my opinion this is hardly the right model for God: The Judeo-Christian view of God is not one of an entity who is just responsible for the overtly anomalous: For as soon as we recognise the underlying and unifying mathematical logic at work in the cosmos we no longer see a patchwork creation punctuated by arbitrary “supernatural” anomalies. Instead these underlying logical themes open out into an even bigger mystery; the mystery of a much more totalizing kind of intelligence; an entity of an entirely different genus to the part-time intervening tinkerer who is on a par with alien beings.

2. Mathematics as a Human Construction?
I’m not so sure that it is right to think about mathematics as a purely human invention. I’m slanting toward the view that “invention” is a kind of “discovery” and vice versa. In platonic space every structure/configuration exists in the sense of being a possibility awaiting realisation – it is then down to us to drag that structure out of potentiality and to reify it in material form as symbols on paper, a computer program, or concepts in our head.
In fact consider the case of the story writer. As I have said before an average size book has about 30 to the power of a million possible arrangements of text characters. Think of the creative act of story invention as one of dragging out into the real world a single combination of text from those many possible combinations! Achieving such a feat presupposes that the combinatorial hardware exists to take up one of those many possible platonic states. Thus, the “invention”, “discovery” or “reification” of one of those combinations is conditional upon a pre-existing substrate and is thus very much bound up with the “hardware” providence has supplied in the first place in order that these platonic entities may be reified.
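To get a feel for the size of that number, here is a quick back-of-the-envelope check in Python; the figures of roughly a million characters per book and an alphabet of about 30 symbols are just the rough ones assumed above:

import math

# Number of decimal digits in 30**1000000, computed via logarithms
# (the integer itself is far too large to print sensibly).
digits = math.floor(1_000_000 * math.log10(30)) + 1
print(digits)   # roughly 1.48 million decimal digits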

3. Regularity and Symmetry in Mathematical Patterns.
Here we have some mathematical operations that have produced patterns of symmetry and regularity – in other words patterns of “high order”. Now let’s think about the opposite; namely, mathematical operations that generate disorder. To this end take a number like 2455, square it and then take out the middle digit. Take the first four digits of the square and square that. Once again select the middle digit. Repeat the process many times. One finds that the sequence of selected digits looks very much like a random sequence of numbers (probably only approximately so) with no discernible pattern. Hence, mathematical operations can generate both order and disorder.
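Here is a rough Python sketch of that procedure; the seed 2455 and the choice of which "middle" digit to take when the square has an even number of digits are just illustrative conventions:

# Crude disorder generator: square the current number, record a middle digit
# of the square, then carry the first four digits of the square forward.
def middle_digit_sequence(seed=2455, steps=30):
    digits_out = []
    n = seed
    for _ in range(steps):
        sq = str(n * n)
        digits_out.append(sq[len(sq) // 2])   # take a middle digit of the square
        n = int(sq[:4])                       # first four digits of the square
    return "".join(digits_out)

print(middle_digit_sequence())   # the resulting digit stream shows no obvious pattern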
I think of mathematical operations as physical systems, systems that manipulate tokens; basically they are algorithms that can be programmed on a computer. I’m not sure whether or not this algorithmic view of maths is valid for some of the more abstract meta-mathematical thoughts (like Gödel’s theorem), but it is clear that a large class of mathematics comes under the heading of algorithmics, and certainly the examples you give are algorithmic in as much as they could be generated on a computer with the right programming.
Anyway, in the context of algorithmic mathematics the ordered patterns you observe make sense to me. One way of making sense of these patterns is to start with the observed patterns themselves rather than the process that generates them. These patterns are highly organized. Highly organised patterns are readily compressible because they don’t display the variety and complexity of disordered patterns and thus they don’t contain a large amount of data. Hence they can be “compressed” by expressing them as simple rules of generation. Expressing such a pattern as the product of some kind of rule based mathematical procedure is a way of compressing that pattern. It is therefore not surprising that elementary arithmetic procedures generate highly organized patterns, because those procedures are in effect ways of compressing the ordered patterns they generate. Thus, the symmetry and order you observe is really not surprising. What I have more difficulty with is the fact that simple arithmetic procedures, as we have seen, can also “compress” some disordered sequences in the sense that they can generate disordered patterns in a relatively short time. This actually created a paradox for me and I proposed a solution in one of my blog entries – see this link: http://quantumnonlinearity.blogspot.com/2009/10/proposed-problem-solution.html
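As a small illustration of that compressibility, the emailed pattern above can be regenerated from a very short rule; a couple of lines of Python stand in for the whole list of sums:

# Regenerate the pattern 1 x 8 + 1 = 9, 12 x 8 + 2 = 98, ... , 123456789 x 8 + 9 = 987654321
for n in range(1, 10):
    left = int("".join(str(d) for d in range(1, n + 1)))   # 1, 12, 123, ...
    print(f"{left} x 8 + {n} = {left * 8 + n}")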