Wednesday, March 26, 2025

The Sea of Faith and Don Cupitt. Part II

 (This post is still undergoing correction and enhancement)

The post-enlightenment sea of faith

In Part I of this series I described how, in the first episode of his Sea of Faith series, the atheist theologian Don Cupitt was beguiled (like many others, such as the North American Intelligent Design community) by the "natural forces" vs "intelligent design" dichotomy. For Cupitt the inference was that natural law is just that, namely "natural", and therefore a process with its own internal (clockwork?) dynamic, thus dispensing with the need for a prime divine mover either as an initiator and/or as the sustainer of natural processes. Cupitt fell for the tempting and populist reaction which sees in high organization a self-sufficient and self-sustaining machine.

Cupitt's response contrasts with that of the classical science thinkers such as Descartes, Galileo, Newton, Faraday and Maxwell, whose faith underwrote their successful attempts to unlock the secrets of the breathtaking cosmic order, an order believed by Christians to be created and sustained by God everywhere and everywhen. As I wrote in Part I, Cupitt's view was as follows:

According to Cupitt, in Galileo's dynamic vision of the universe motion was "built-in" and therefore it was "no longer necessary to appeal to the action of a divine mover who keeps that universe energized".

For Cupitt, any thought that cosmic organization is fuel for the fire of theism amounts to repeating the errors of medieval superstition & magic. After all, isn't the lesson we learn from our own technology (steam engines, clockwork, computers etc.) that such things run by their own internal impetus, volition and logic without the need for human intervention, least of all any need for prayer or magical rituals to keep them going? It is a very natural, intuitive reaction, therefore, to read into any highly organized dynamic the presence of an internal self-motivating sufficiency.

However, Cupitt is undoubtedly right about one thing: the discovery of the mechanical universe, which (superficially at least) appears to be oblivious to prayers and magical rituals, is one reason (among others, no doubt) for the recession of the Sea of Faith. Consequently, the feeling is that science is an escape from the need to exert control by religious intervention and/or magic, and that science should therefore evoke no mystical response. There is an historical paradox here, however: it seems likely that the very organization of the heavenly motions and their correlation with the beat of the earthly seasons were among the motivations behind the construction of the Neolithic stone structures, and it seems unlikely that these were mere stone computers devoid of mystical significance.

***

In the first episode of the Sea of Faith Cupitt went even further with his undermining of the meaning of the scientific project. In the following quote he misrepresents the scientific quest and underrates any notion of scientific progress...

.....yesterday's orthodoxy we see now is today's heresy. Today's orthodoxy will be laughed at tomorrow. All theories have a limited life. There are no fixed positions anymore.  They'll all crumble and be replaced by others. The truth now is not in the fixed positions, it's in the quest. 

That statement fails to do the slightest justice to the scientific project as I've come to grasp it. As algorithmic models of the creation, the theories of classical science are still taught in universities, not as irrelevant historical relics of a bygone past but as excellent metaphors and approximations, object lessons in the startling way science makes miraculous progress. The classical theories still describe much about the cosmic order and were the foundations for later theories (cf. the Lagrangian formulation of classical mechanics and the later canonical quantum theory). They also typify the breathtaking way various mathematical strands of thought evolved and came together to encapsulate the cosmic dynamic (Faraday and Maxwell, complex numbers and quantum theory). They were steps along the progressive quest of the highly successful physical sciences project. Schrödinger and Heisenberg advanced their own metaphors for the understanding of the quantum world, metaphors of the same descriptive isomorphic power, but it would be wrong at this stage in our understanding to claim that one metaphor has crumbled to be replaced by the other; both stand.

There is, of course, still some way to go as the physical science project progressively readjusts our theoretical metaphors to fit the data points of our experience. But what we currently have nevertheless captures the God-ordained cosmic organization to a high level of virtuosity and artistry. Contrast this progressive, grand and heroic project with Cupitt's statement quoted above: his is a crass postmodern undermining of the true story of scientific progress toward better and better mathematical understandings of the world.

***

But to be fair to Cupitt, he well describes the psychological and cultural knock-on effect of the discovery of the mechanical universe. The new perspective left a sense of cosmic alienation and desolation, not least because, to cap it all, the temple of the earth had been displaced from its Ptolemaic center by Copernicanism.  Cupitt puts it eloquently as follows:

Science became a kind of abstract diagram of nature. But when the universe is seen in this way it no longer looks so friendly to man. It doesn’t give him guidelines in the old way. It’s stripped of its old religious and moral significance. Its god, if any, is a cosmic mathematician rather than a heavenly king and father. How would faith respond to the bleakness of this new vision of the universe?

....I think we know the answer to that last question; not very well it seems, at least in the West where the new science first took hold. However, for some philosophers at least, a loss of faith wasn't a necessary outcome. According to Cupitt, Descartes, whom he calls an uncompromising rationalist... (My emphasis)

.... proved God’s existence by abstract arguments and then used God to certify the validity of human reason and the existence of the mechanical universe. After that science took over. 

Cupitt goes on to contrast Descartes's faith with that of the very feeling and personable philosopher Blaise Pascal of whom Cupitt says....

Publicly Pascal was a gifted and sociable man with hundreds of friends and correspondents.....

But the highly intuitive Pascal was far from satisfied with Descartes's philosophy, which Pascal saw as promoting a deist God....

To Pascal, who was an intensely Christian personality, such lip service to religion was abhorrent. “I cannot forgive Descartes. In his whole philosophy he would like to do without God but he couldn’t help allowing him a flick of the fingers to set the world in motion. After that he had no more use for God”. That metaphysical God, the God of the philosophers, was not the God Pascal was privately seeking.

And it wasn't only Pascal...

The effect of the new discoveries had been to break down people’s traditional sense of their place in the universe. People felt like aliens literally displaced persons. They were surrounded by giddying new vistas of greatness and littleness. In Pascal's mind this sense of exile came together with his Christian understanding of sin, paradise lost, man’s need for salvation, the contradictions of human nature.

In private Pascal was full of angst; for him human existence was characterized by inconstancy, boredom and anxiety. Pascal marveled at the vast complexity and beauty of the universe at both its large and small scales. And yet the size and dumbness of the cosmos filled Pascal with dread & terror. Man was lost in this tiny and insignificant corner of the universe and his purpose was utterly unclear.


***

Let me now turn to what to me was the startling aspect of the first episode of the Sea of Faith. In Part I of this two-part series I wrote this...

"At the time it would have been easy for me to write-off Cupitt as just another pundit presenting an all too typically hackneyed misrepresentation of science and then forgotten all about him. But as it turned out his reaction to his own passe concepts was to weigh strangely in the scales of my own thinking. A few years after I had watched the series (I had also purchased the book) I was making heavy weather of some of the gnostic-like aspects of contemporary Christian evangelicalism.  To my surprise I found that Cupitt had given me insight into the condition behind these circumstances. It was ironic that Cupitt's reaction to the elegant intellectualisms of science had parallels in contemporary evangelical Christianity: Evangelicalism's own version of the reactionary existential angst triggered by the apparently soulless and profane mechanical world had taken the form of an escape into the high subjectivism of the inner life with its sublime epiphanies. Moreover, Cupitt's stark account of those Godless so-called "natural forces" was to surface again although in negated form among the North American Intelligent Design community (NAID). Many thanks to Don Cupitt for helping me make some sense of these situations, but perhaps not in the way he and the Sea of Faith movement would have applauded!"

In the first half of his first episode Cupitt surfed the usual "mechanical world" philosophical clichés, clichés which have led him (and many others) to a purely secularized view of science (which ironically itself ultimately has a tendency to undermine science). I nearly went to sleep, but about two thirds of the way through I was brought up with a jolt when he started talking about Pascal's night of personal revelation of the divine. This was stuff I hadn't heard before. Cupitt was describing the epiphany of Blaise Pascal which occurred late one Monday night in November 1654. This highly personal revelation not only calmed Pascal's spiritual angst but also gave him peace, joy, and an overwhelming sense of the presence of the divine. These are Pascal's words describing his experience.....

Fire....certainty, certainty, heartfelt joy, peace. God of Jesus Christ, God of Jesus Christ, My God and your God. Thy God shall be my God. The world forgotten and everything except God.....The world has not known thee but I have known thee. Joy, Joy, Joy, tears of joy.

According to Cupitt, Pascal had a copy of the words describing this experience sewn into his clothing. Although we can praise God for Pascal's overwhelming epiphany and respect it, Pascal himself wasn't going to take a reciprocally magnanimous view of the faith of fellow philosophers like Descartes, in spite of the fact that Descartes's philosophy was founded on and revolved around the divine. Re-quoting the passage from Cupitt above, in which Pascal effectively accuses Descartes of deism....

To Pascal, who was an intensely Christian personality, such lip service to religion was abhorrent. “I cannot forgive Descartes. In his whole philosophy he would like to do without God but he couldn’t help allowing him a flick of the fingers to set the world in motion. After that he had no more use for God”. That metaphysical God, the God of the philosophers, was not the God Pascal was privately seeking.

That divine flick of the fingers and the notion of a world with its own self-sufficient internal vitality and volition is the stuff of deist/atheist interpretations, interpretations which wrongly equate high organization with mechanical self-sufficiency. What is true, however, is that once those secret algorithmic encodings which so successfully describe and metaphorically model the cosmos have been revealed to humanity, we are provided with remarkable powers of information and control. That is, there is less need to inquire of God what the cosmos will do next, or to invoke magical rituals to keep it going, because we know so much about the pattern of its God-ordained dynamic. The Christian response to this gift of information & control should be one of the thanks & praise of beings utterly dependent on that God-ordained order.

For me nothing about the high organization which defines the physical world underwrites a deist or atheist worldview, although such are compelling conclusions for many. The deist/atheist intuitions, although understandable, become problematic in the face of patterns of randomness and the expanding parallelism of quantum mechanics; these features make it harder to swallow the elementary clockwork determinism of deism.

Pascal railed against those compelling deist intuitions and sought an escape. But he appeared not to find an intellectual escape; he only found his escape in the depths of that deeply heartfelt epiphany of Monday 23rd November 1654, an epiphany which, as I've already related, gave him feelings of peace and joy. Cupitt quotes Pascal as follows:

The god of Christians is not a God who is merely the author of mathematical truths in the order of the elements. He is a God who fills the soul and heart of those whom he possesses, who makes them inwardly conscious of their wretchedness and of his infinite mercy, who makes them incapable of any other end but him. It is the heart which perceives God, not the reason. The heart has its reasons of which the reason knows nothing.

In the parlance of today's touchy-feely Christian culture, it is likely that Pascal's late-night epiphany would have been identified by many Christians in the last century as the "Baptism of the Holy Spirit", or perhaps more recently in this century as a more general sublime "encounter" with the divine, constituting a heart experience of God rather than head knowledge of God. My response to this kind of thing has always been this: different strokes for different folks; i.e. God reveals himself differently to different people, and frankly when it comes to faith I'm more Descartes & Galileo than I am Pascal.

***

But what was Don Cupitt himself trying to tell us when he related Pascal's encounter with God? Cupitt was trying to put across the notion that the intellectual world of thought which has unlocked the secret mathematical order of the cosmos is a very different thing from the world of religious experience and religious thinking. It is true that since the enlightenment these two worlds have not only drifted apart but, according to Cupitt, have also become alienated from one another; so much so, in fact, that the world of the intellect can no longer convey religious meanings. Religious meanings were now the domain of our religious intuitions, rituals, mystical metaphors and, best of all, sublime quasi-gnostic know-how; these alone could sublimate humanity's sense of the divine and that yearning for a God of some sort. According to Cupitt, intimacy & certainty with respect to God was no longer to be found via reason, and certainly not via the physical cosmos; that profane world of mechanism, whose sacredness had been banished by the enlightenment, meant that the divine now only inhabited an idiosyncratic corner of the human mind.

What startled me about Cupitt's message is that it is not a million miles away from the message I was starting to hear from many contemporary "encounter"/"Holy Spirit" based Christians; for them knowing God was primarily about a profound heart experience of the divine, and this was sharply distinguished from what they disparagingly referred to as a "head knowledge" faith. One heard about the 18-inch gap between heart and head and how difficult it can be to cross that gap. As far as my faith was concerned I always knew which side of this contrived divide I was going to end up on! As with Descartes, my faith revolved round the head and not the intuitions and experiences of the heart. It became apparent to me that Cupitt's message was all too reminiscent of the gnosto-Christian culture that I had experienced up until then. This realization became an even stronger theme for me in the following decade with the advent of the 1994 Toronto Blessing. Fortunately, the polarization that this induced in church life has, I think, lessened of late, but it can still plague churches today (see here for example). But one thing was clear: the divide that Cupitt had identified, and which has developed over the last 400 years between the sacred and the profane, is of very general import; so much so, in fact, that it affects diverse subcultures in similar ways (the return to New Age and pagan mysticism may be a case in point).

But diversity, equality and inclusion weren't always on the agenda of all churches; some of them had the same horror as Pascal of the God of the learned philosophers. They were quite sure that full gospel Christianity should, as an obligation, include an intimate, mystical and almost gnostic revelation of God's power. In the second half of the 20th century this was all to take a very bad and alienating turn as it divided Christians along an intuition vs intellect fault line.

It was this head vs heart dichotomy which was subsequently to plague my relationship with church in the coming decades. The church (or rather parts of it), like Pascal, could no longer reconcile the cosmic picture and the intellectual world of mechanism with God without doing violence to science (*1), and hence it escaped into the epiphanies and theophanies of almost orgasmic experiences of the divine; those experiences became a shibboleth of a quasi-gnostic flavour of Christianity.

Thanks to Don Cupitt it became clearer to me how Christianity's retreat into the human heart was pressured by a spiritual angst which was amplified by the enlightenment. But the fault line between head and heart probably goes even deeper, and the rise of a society based on technology and science merely widened an already archaic fault line. See, for example, the Cathars and Diarmaid MacCulloch's book Silence: A Christian History (London, Allen Lane, 2013).

***

Is Don Cupitt a Christian? Some would say that that is impossible for someone who seems to be an atheist. I am, however, prepared to give Cupitt the benefit of the doubt for the following reasons. He undoubtedly knows Christian theology well. He has said that religion is potentially the repository of our highest ideals, and yet he is conscious of the human fallibility and sin which obstruct those ideals. My reading of him is that he understands the Christian doctrines which contrast the propensities of human selfishness with the vulnerable love we see in Christ and his self-sacrificing work to deal with sin. Via Pascal's perspective Cupitt tells us of the concealment of God's glory in the weakness of Christ and that God is most profoundly revealed in Christ's passion; that may express the image of God Cupitt follows and worships, in spite of the technical philosophical twist that Cupitt believes this image corresponds to no known real-world entity. If Cupitt has taken on board this image of God as his highest ideal to which he strives, and he points to Christ as the epitome of this ideal, he thereby eschews idolatry and follows the express image of the true God (see Hebrews 1 & 2). But in saying these things let me be clear: firstly, I can't be absolutely sure about Cupitt's private stance, and secondly, I don't follow him into doctrinal unrealism.


Footnotes

*1 e.g. Christian Young Earthism and Flat Earthism. Of course, in contrast there are some sophisticated & intellectual parts of church culture for whom science is well integrated into their faith - e.g. the Faraday Institute.

Thursday, March 06, 2025

The Aumann agreement theorem paradox.


Different perspectives imply a likelihood of different experience sets and conflicting probability estimates, thereby setting the scene for potential disagreement.


I was rather intrigued by James Knight's use of Aumann's agreement theorem in a blog post of his that can be found here:

The Philosophical Muser: Why Christians Disagree So Much

James' post was a response to a challenge put to him: if Christianity is true, why is there such a varied set of Christians who disagree and squabble about so much? Towards the end of his article we read this (my emphases)....


Given the state of humanity, I’d no more expect Christians to agree on everything than I would mathematicians to agree about politics, or opera singers to agree about economics. But, I do wish they would – and as I often argue – Christians SHOULD agree more, especially on objective things – and two Christians of any sex, ethnicity, denomination, should converge on more and more consensus if they were to sit by the fire, Aumann’s Agreement-style, and honestly, rigorously seek the truth together, like people who care about what is true."


The exact "science" of Mathematics is a domain of knowledge incommensurable with politics and economics and no easy like-for-like comparison can be made. Mathematics is an activity, in fact a form of model building, which depends on very strictly agreed symbolic conventions and algorithmic procedures being followed. If in mathematics every one keeps strictly to the same conventions and procedures disagreement can't arise. The progress of mathematics bares this out; I'm not aware that mathematicians frequently and fundamentally disagree (except perhaps about un-proven conjectures). This of course is not so of politico-economics; disagreements about best economic policy and its political implementation abound. This why economics is a breeding ground for politicians and their political passions & power seeking; after all the only way to implement a particular contentious economic policy is to get political power (Hopefully by democratic means). But let's not think any better of those sanguine mathematicians over and against those battling politico-economists - the latter are dealing with very complex and epistemically tricky material which as we will see provides one reason among others why politico-economics breaks the assumptions of Aumann's agreement theorem and promotes the sharp divisions of power politics.....and that's before we consider those ever present very human psycho-sociological factors which one expects of complex adaptive systems like human beings. 

In fact I would rewrite the first sentence of the above quote as follows....

Given the natural state of human affairs, I’d no more expect Christians to agree on everything than I would politicians to agree about politics, or economists to agree about economics. 

***


Aumann's agreement theorem assumes we have a set of interlocutors who start with a common information base ("common priors") but who then bring to the discussion table differing levels of knowledge in the form of conditional probabilities that all interlocutors believe to be a trustworthy contribution to the discussion (so-called "common knowledge" as opposed to "common information"). The interlocutors update their probabilities by the mutual cooperative sharing of their differing conditional probabilities (*1). They assume one another to be rational, honest agents, and that they can trust one another's probability estimates as they share them. According to the agreement theorem they will eventually converge on the same probability estimates. See the following link for more on the agreement theorem: Aumann Agreement - LessWrong
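To make the gist concrete, here is a minimal Python sketch of my own (it illustrates cooperative Bayesian pooling, not Aumann's formal common-knowledge argument; the Beta(1,1) prior and the sample numbers are toy assumptions): two honest agents share a common prior over a coin's bias, privately collect different samples and so initially disagree, but converge on a single estimate once they share everything they know.

```python
# Minimal sketch (toy example, not Aumann's formal proof): two honest Bayesian
# agents with a common prior over a coin's bias. Different private samples
# produce initial disagreement; pooling all the evidence produces a single
# common posterior estimate.

from fractions import Fraction

# Common prior: Beta(1, 1), i.e. initial ignorance about the coin's bias.
PRIOR_HEADS, PRIOR_TAILS = 1, 1

def posterior_mean(heads, tails):
    """Posterior mean of the bias under the Beta(1,1) prior (Laplace's rule)."""
    return Fraction(PRIOR_HEADS + heads,
                    PRIOR_HEADS + PRIOR_TAILS + heads + tails)

# Two different private experience sets: the source of the initial disagreement.
agent_a = {"heads": 8, "tails": 2}
agent_b = {"heads": 3, "tails": 7}

print("Before sharing:")
print("  A's estimate of the bias:", float(posterior_mean(**agent_a)))
print("  B's estimate of the bias:", float(posterior_mean(**agent_b)))

# Honest cooperative sharing: each agent now conditions on all the data.
pooled = {"heads": agent_a["heads"] + agent_b["heads"],
          "tails": agent_a["tails"] + agent_b["tails"]}

print("After sharing everything they know:")
print("  Common estimate of the bias:", float(posterior_mean(**pooled)))
```

The convergence here is driven by honest and complete sharing of evidence; the epistemic problems discussed below arise precisely because, in real life, the evidence base is too large, too skewed and too partial for that sharing ever to be complete.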

Before I go any further let's get one thing straight. Most common sense people (which includes myself and people who believe there is such a thing as a single truth out there which stands over and above the slippery slopes of cultural relativism and critical theory) have an intuitive grasp of Aumann's theorem; that is, they understand that in an epistemically transparent world where evidence acquisition is not an issue and interlocutors are rational and honest, then agreement  about truth will inevitably emerge. So the agreement theorem proves what most common sense people already believe (Of course critical theorists and cultural relativists are likely to make heavy weather of this common sense truth). Aumann's theorem is a nice confirmation of what all reasonable people already know intuitively. But the article on Aumann's theorem that I have linked to above ends with this warning: 

The fact that disagreements on questions of simple fact are so common amongst humans, and that people seem to think this is normal, is an observation that should strike fear into the heart of every aspiring rationalist.

So, given the agreement theorem, which is undoubtedly mathematically correct, why is disagreement between humans so widespread? In this connection I made the following comment on James' blog entry. As a rule my comments never get past the Philosophical Muser's approval process and are therefore cancelled (the Philosophical Muser's concept of "free speech" is qualified). So, rather than let my comments disappear into oblivion, I thought this matter to be so important that it needs airing. What follows in the next section is based on the comment I added to James' post...

***

I think I agree with the general drift of your argument here, but not with one or two of the details; especially, may I say, you are missing the crucial point of the agreement theorem and vastly underestimating the epistemic issues impacting attempts to reach agreement.

That cozy fireside talk seldom, if ever, arises. For a start whilst our interlocutors are locked in by the fireside they are not updating their experience or accumulating any further experience. They have to try and get agreement on the basis of the information they already have (in the form of priors and conditional probabilities). If this pool of information contains contradictions and they insist on sticking to their scripts they won’t necessarily reach full agreement even if they are rational.

Therefore our interlocutors are going to have to get off their backsides, get out on their bikes and find a set of consistent priors and conditionals. But that brings us to the main problem: this information can only come from statistics which result from a wide and long-term experience of the cosmos. Moreover, any mature engagement with that cosmos requires thousands if not millions of bits of information. Single interlocutors therefore can't survey the whole lot; ergo, their experience is liable to be skewed and/or very partial. So, unfortunately our interlocutors, on top of all their other very human survivalist social traits, have to face the epistemic problem of systematic and random sampling errors.

The agreement theorem simply sets a lower limit on agreement time. That is:

Agreement time >= Aumann agreement time

As I found out with my own AI Thinknet project, AI systems also suffer from similar epistemic problems relating to sampling bias and partiality. After all, I think the YEC organization AiG have implemented their own YEC AI interlocutor, presumably by training it with a bias toward YEC texts.

As I’ve said before, because of these fundamental epistemic limitations, tribal identification & group-think (where one outsources epistemic help to the experience of a large group of minds) is an adaptive trait, and this factor shouldn't be underestimated in terms of its potential epistemic utility. So what James refers to as “incentives, needs and agendas” have the potential to be adaptive whether we like it or not.

So, even without factoring in those many awkward human social foibles (which potentially have adaptive value), epistemic challenges alone are very likely to lead to agreement failure. My guess is that disagreement due to epistemic issues is the biggest factor in disagreement. The only antidote I see for this is epistemic humility. But the trouble with this is that when faced with utterly convinced group-think, such as we find in AiG & Trumpite brands of Christianity, epistemic humility & tentativeness is read as weakness. Hence, a certain amount of vehemence is demanded in the heat of argument.


***

Olbers' paradox is a famous paradox in astrophysics. The paradox shows that under plausible assumptions the night sky should not be black but a continuum of bright starlight; the fact that this isn't the case pointed to the need to revisit the underlying assumptions. It was a profound piece of theoretical thinking which led the way to our understanding of an expanding universe. I contend that likewise Aumann's theorem prompts us to think a bit deeper as to why it's not a real-world model; in particular it urges us to think about both our epistemological limitations and the complexities of socio-psychology which strongly influence the acquisition of knowledge. With respect to the latter we are prompted to investigate the adaptive value of group-think & group belonging along with its potential downsides and tradeoffs. Because Aumann and his successors are making us think harder about human affairs, then, like Olbers' paradox, its pedagogical value shouldn't be underestimated.

The upside of group-think is that it widens the number of experiencing agents contributing to the conversation, and this increases the amount of incoming evidence. It's true, however, that the instincts behind group-think have a big potential downside, as group-think can lock in error, such as we see among cultists and fundamentalists who exploit the adaptive instincts; in this context the survival of the group identity takes precedence over further evidential updates. Aumann's theorem prompts us to study the cost/benefit balance entailed by joining an epistemic group with a strong sense of cohesion and collective identity. In this sense Aumann's paradox is as profound as Olbers' paradox.

***

I would want to rewrite the second half of the quote at the beginning of this post, which I took from the Philosophical Muser, along these lines....

Christians are expected to make heavy weather of agreement, even about objective things – and two Christians of any sex, ethnicity or denomination will not necessarily converge to a consensus if they were to sit by the fire, Aumann's-Agreement-style, attempting to get convergence; disagreement is likely even if they honestly, rigorously and rationally seek the truth together, like people who care about what is true.

The agreement theorem tells us that in principle agreement is possible if we get our priors and evidences right, but therein lies the epistemic challenge of gathering huge amounts of data, some of which may present accessibility problems. This epistemic challenge necessitates that the quest for knowledge becomes a social symposium, and this cues in all the foibles of the sociological dynamic. That these human and epistemic factors can make agreement problematical should always be at the back of our minds; our difficulty in conforming to Aumann's theorem SHOULD therefore be the basis of an attitude of epistemic humility rather than the thought that Aumann's theorem underwrites an attitude of epistemic arrogance; in my book the latter classifies as an abuse of the theorem. Agreeing to disagree until more information comes to light should not make us shudder.

But let me repeat and finish with this warning..... 

The only antidote I see for inevitable disagreement is epistemic humility. But the trouble with this is that when faced with convinced group-think such as we find in AiG & Trumpite brands of Christianity, humility is read as weakness. Hence, a certain amount of vehemence is demanded in the heat of argument.

Disagreement, sharp disagreement in fact, seems to be the natural state of human affairs.


Footnotes

*1 A conditional probability has the form "the probability of A given evidence B is P", formally expressed as P(A|B). Here B is the evidence relevant to the truth of A.

Wednesday, January 29, 2025

Bill Dembski's Information Conservation Thesis Falls Over


NAID's Stephen Meyer interviewed by Unwoke Right Wing Republican Dan Crenshaw. 
Evidence that NAID has become part of the  politicized culture war


I see that William "Bill" Dembski has done a post on Evolution News on the subject of the "Conservation of Information". The article is for the most part an interesting history of that phrase and goes to show that "information" has a number of meanings dependent on the discipline where it is being used, with Bill Dembski having his own proprietary concerns tied up with his support of the North American Intelligent Design (NAID) community.  See here for the article:

Conservation of Information: History of an Idea | Evolution News

Bill's particular information interest seems to lie with the so-called "No Free Lunch Theorems". These theorems are about the mathematical limits on computer algorithms purposed to search for (and/or generate) configurations with properties of particular interest. Bill's focus on the "No Free Lunch Theorems" is bound up with the NAID community's challenge to standard evolution, a process which they see as a threat to their self-inflicted XOR creation dichotomy; viz: either "God Intelligence did it" XOR "Blind unguided natural forces did it".

But Bill gets full marks for spotting the relevance of these theorems to evolutionary theory: evolution does have at least some features isomorphic with computer searches; in particular these theorems do throw some light on evolution's "search, reject and select" mechanism which locks in organic configurations. So, the least I can say is that Bill's interest in the "No Free Lunch Theorems" is based on what looks to be a potentially fruitful avenue of study. However, although it is true that the "No Free Lunch Theorems" reveal interesting mathematical limits on computer searches, as we will see Bill has gone too far in trying to co-opt these theorems for his concept of information conservation; in fact, to the contrary, I would say that these theorems prove that Bill is wrong about the conservation of information.


***

We can get a gut feeling for the No free lunch theorems with the impressionistic & informal mathematical analysis in this post. 

(Note: I arrived at similar conclusions in these two essays...

GeneratingComplexity2c.pdf - Google Drive

CreatingInformation.pdf - Google Drive 

These essays are more formal and cover the subject in more detail)

***

We imagine that we have a set of computer programs executing in parallel with the intention of finding out if at some point in their computations they generate examples of a particular class of configuration. These configurations are to be found somewhere in an absolutely huge domain of possible configurations that I shall call D and which numbers D members, where D is extremely large. It is a well-known fact that most of the members of D will likely be highly disordered.

A computer "search" starts with its initial algorithmic information  usually coded in the form of a character string or configuration S of length S. This configurational string contains the information informing the computing machinery how to generate a sequence of configurations C1C2,.....,Cn,.... etc. The software creates this sequence by modifying the current configuration Cn in order to create the next configuration Cn+1. A crucial operational characteristic of algorithms is that they are capable of making if-then-else type decisions which means that the modifications leading to Cn+1 will be dependent on configurational features found in Cn. It is this decisional feature of executed algorithms which gives them their search, reject and select characternot unlike evolution. This means that their trajectory through configuration space is often very difficult to predict without actually executing the algorithm. This is because the conditional decision-making of algorithms means that we can't predict what direction an algorithm will take at any one point in the computation until the conditions it is responding to have actually been generated by the algorithm. The concept of computational irreducibility is relevant here. 

In his article Bill is careful to describe the components of search algorithms, components which give them their search, reject & select character. But for my purposes we can simplify things by ignoring these components and only give cognizance to the fact that an algorithm takes its computer along what is possibly a non-linear trajectory in configuration space. We can also drop Bill's talk of the algorithm aiming for a specified target and then stopping, since in general an algorithm can go on indefinitely moving through configuration space, endlessly generating configurations as does conventional evolution. All we need be concerned about here is the potential for algorithms to generate a class of configs of interest in a "time" T, where T is measured in algorithmic steps.

***

If we have an algorithm with a string length of S then the maximum number of possible algorithms that can be constructed given this string length is A^S, where A is the number of characters in the character set used to write the algorithm.

We now imagine that we have these possible A^S algorithms all executing in parallel for T steps. It then follows that the maximum number of configurations C which can potentially be generated by these possible algorithms of length S will be no greater than the limit set by the following relationship....

C <= A^S × T

 Relation 1.0

...where C is the number of configurations that can be created in time T if the set of algorithms are run in parallel and assuming that a) T is measured in algorithmic steps and that b) the computing machinery is only capable of one step at a time and generates one configuration per step per algorithm.  

If the class of configurations we are interested in exists somewhere in a huge domain D consisting of D configurations, and where for practically realistic execution times T:

                                     D >>> C

Relation 2.0

...then the relatively meager number of configurations our algorithm set can generate in realistic times like T is a drop in the ocean compared to the size of the set of configurational possibilities that comprise D. If relationship 2.0 holds then it is clear that, given realistic times T, our "searches" will be unable to access the vast majority of configurations in D.
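To get a feel for just how lopsided relations 1.0 and 2.0 typically are, here is a back-of-envelope Python sketch; the particular values of A, S, T and the configuration length are illustrative assumptions of mine, not anything taken from Bill's article:

```python
# Back-of-envelope comparison of relation 1.0 with the size of the domain D.
# All numbers below (A, S, T and the configuration length L) are illustrative
# assumptions only.

from math import log2

A = 128        # assumed size of the character set used to write algorithms
S = 1000       # assumed algorithm string length in characters
T = 10**18     # assumed number of algorithmic steps (a very generous budget)
L = 10**6      # assumed configuration length in bits, so D = 2**L members

log2_C_max = S * log2(A) + log2(T)   # log2 of the bound C <= A**S * T
log2_D = L                           # log2 of the domain size D

print(f"log2(C_max) ~ {log2_C_max:,.0f} bits")   # about 7,060
print(f"log2(D)     = {log2_D:,} bits")          # 1,000,000

# The reachable fraction of D is at most 2**(log2_C_max - log2_D), i.e.
# roughly 2**(-993,000): relation 2.0, D >>> C, holds with room to spare.
print(f"log2(C_max / D) ~ {log2_C_max - log2_D:,.0f}")
```

Even with absurdly generous assumptions the set of reachable configurations is a vanishingly small corner of D, which is the whole point of relation 2.0.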

With the above relationships in mind no free lunch starts to make some sense: If we are looking for algorithms which generate members of a particular class of configuration of interest (e.g. organic-like configurations) then for the algorithmic search to have a chance of succeeding in a reasonable time we require one of the following two conditions to be true...

1. Assuming that such exists, an algorithm of reasonable length S has to be found which is able to generate the targeted class of configurations within a reasonable time T. However, if relationship 2.0 holds then it is clear that this option will not work for the vast majority of configurations in D.

2. The alternative is that we invalidate relationship 2.0 by either a) allowing the algorithms of length S to be large enough so that A^S ~ D, or b) allowing the execution time T of these algorithms to be sufficiently large so that T ~ D, or c) allowing that T and A^S when combined invalidate relationship 2.0.

***

So, with the foregoing in mind, we can see that if an algorithm is to generate a stipulated class of solution in domain D in a reasonable time T then either a) it has to be logically possible to code the algorithmic solution in a starting string S of reasonable length S, or b) we have to code the required information into a long string S of length S such that A^S ~ D.

In case a) both S and T are of a practically reasonable magnitude, from which it follows, given relationship 1.0, that little of the domain D can be generated by such algorithms, and therefore the majority of configurations that could possibly be designated as of interest in D (especially if they are complex disordered configurations) cannot be found by these case-a algorithms. In case b) the starting string S, in terms of the number of possible algorithms that can be constructed, is commensurate with the size of D and therefore could possibly generate configurations of stipulated interest in a reasonable time.

Therefore it follows that if we are restricted to relatively short algorithmic strings of length S then these algorithms will only have a chance of reaching the overwhelming majority of configurations in D after very long execution times. If our configurations of designated interest are in this long-execution-time region of D, these configurations will demand large values of T to generate. Long-execution-time algorithms, absent any helpful starting strings which provide "short cut" information, are, I think, what Bill calls "blind search" algorithms. That emphasis on the word "blind" is a loaded misnomer which appeals to the NAID community for reasons which I hope will become clear.

***


For Bill, this is what no free lunch means...

Because no-free-lunch theorems assert that average performance of certain classes of search algorithms remain constant at the level of blind search, these theorems have very much a conservation of information feel in the sense that conservation is strictly maintained and not merely that conservation is the best that can be achieved, with loss of conservation also a possibility

It's true that, unless primed with the right initial information, by far and away the majority of algorithms will reach most targets of arbitrarily designated interest only after very long execution times involving laborious searching.....ergo, upfront information that lengthens S is needed to shorten the search; in fact this is always true by definition if we want to generate configurations of interest that are also random configurations.

So, the following is my interpretation of what Bill means by the conservation of information; namely, that to get the stipulated class of garbage out in reasonable time you have to put the right garbage in from the outset. The "garbage in" is a starting string S of sufficient length to tip the algorithm off as to where to look. The alternative is to go for searches with very long execution times T. So, paraphrasing Bill, we might say that his conservation of information can be expressed by this caricatured equation:

G_in = G_out

Relation 3.0

....where G_in represents some kind of informational measure of the "garbage" going in and G_out is the informational measure of the garbage coming out of the computation. But the following is the crucial point which, as we will see, invalidates Bill's conservation of information: although relationship 3.0 gives Bill his conservation of information feel, it is an approximation which only applies to reasonable execution times.....it neglects the fact that the execution of an algorithm does create information, if only slowly. That Bill has overlooked the fact that what he calls "blind searches" nevertheless slowly generate information becomes apparent from the following analysis.

***

If we take the log of relation 1.0 we get:


Log(C) <= S Log(A) + Log(T)

relation 4.0

The value C is the number of configurations that the A^S algorithms will generate in time T, and this will be less than or equal to the right-hand side of the above relation. The probability of one of these C configurations being chosen at random will be 1/C. Converting this probability to a Shannon information value, I, gives:

I = - Log (1/C) = Log (C)

relation 5.0

Therefore substituting I into 4.0 gives:

I <= S Log(A) + Log(T)

relation 6.0

Incorporating Log(A) into a generalized measure of string length S gives....

I <= S + Log(T)

relation 7.0
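To put rough numbers on relation 7.0 (with everything measured in bits, and with an S-term chosen by me purely for illustration), the following sketch compares the up-front contribution of the starting string with the Log(T) contribution of execution time:

```python
# Illustrative magnitudes for relation 7.0, I <= S + Log(T), in bits.
# The S-term below is an assumption chosen only for comparison.

from math import log2

S_bits = 7000   # assumed information carried by the starting string S

for T in (10**3, 10**9, 10**18, 10**30):
    print(f"T = 1e{len(str(T)) - 1:>2}:  S-term = {S_bits} bits,  "
          f"Log(T)-term = {log2(T):6.1f} bits")
```

Even an absurdly long run adds only on the order of a hundred bits via the Log(T) term; the term is non-zero, so strict conservation fails, but it is small enough to give relation 3.0 its conservation "feel".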

From this relationship we can see that parallel algorithms do have the potential to generate Shannon information with time T; the information is not just incorporated from the outset in a string of length S. However, we notice that because the information generated by execution time is a log function of T, that information is generated very slowly. This is what Bill has overlooked: what he derisively refers to as a "blind search" (sic) actually has the potential to generate information, if slowly. Bill's view is expressed further in the following quote from his article (with my emphases and with my insertions in red).....

With the no-free-lunch theorems, something is clearly being conserved [No, wrong] in that performance of different search algorithms, when averaged across the range of feedback information, is constant and equivalent to performance of blind search.[Log(T) is the "blind search" component] The question then arises how no free lunch relates to the consistent claim in the earlier conservation-of-information literature about output information not exceeding input information. In fact, the connection is straightforward. The only reason to average performance of algorithms across feedback information is if we don’t have any domain-specific information to help us find the target in question.[The "domain-specific" information is implicit in the string S of length S in relation 7.0]

Consequently, no free lunch tells us that without such domain-specific information, we have no special input information to improve the search, and thus no way to achieve output information that exceeds the capacities of blind search. When it comes to search, blind search is always the lowest common denominator — any search under consideration must always be able to do at least as good as blind search because we can always execute a blind search.[Oh no we can't Bill, at least not practically quickly enough under the current technology; we still await the technological sophistication to implement the expanding parallelism needed for "blind search" to be effective, the holy grail of computing. "Blind search" is a much more sophisticated idea than our Bill and his NAID mates are making out!] With no free lunch, it is blind search as input and blind search as output. The averaging of feedback information treated as input acts as a leveler, ensuring parity between information input and output. No free lunch preserves strict conservation [Tough, not true!] precisely because it sets the bar so low at blind search.

By distilling its findings into a single fundamental relation of probability theory, this work provides a definitive, fully developed, general formulation of the Law of Conservation of Information, showing that information that facilitates search cannot magically materialize out of nothing but instead must be derived from pre-existing sources.[False; information derives not just from S, but can also creep in from an algorithm's  execution time T ]

Blind search, blind search, blind search, blind, blind, blind...... the repeated mantra of NAID culture, which with its subliminal gnosto-dualism repeatedly refers to the resources of God's creation as a place of "blind natural forces". Sometimes you will also hear them talk about "unguided natural forces". But in one sense I would maintain the cosmos is far from "natural", and this is evidenced by the sense of wonder its highly contingent form engenders among theists and atheists alike, all of whom can advance no logically obliging reason for its highly organised configuration (except perhaps Richard Carrier, whose arrogance on this score would do Zaphod Beeblebrox proud).

Bill's last sentence above is clearly false, as false can be; he's overlooked the slowly growing information term in relation 7.0. Information is not conserved during a search, because the so-called "blind search" (sic) term is slowly, almost undetectably, creating information. There is therefore no "strict conservation of information" (sic). That the so-called "blind search" (sic) is being understated by Bill and the NAID culture he represents becomes very apparent as soon as we realize that relation 7.0 has been derived on the assumption that we are using parallel processing; that is, a processing paradigm where the number of processors doing the computation is constant. But if we start thinking about the exponentials of a process which utilizes expanding parallelism, the second term on the right-hand side of 7.0 has the potential to become linear in T and therefore highly significant. This is why so much effort and cash is being put into quantum computing; quantum computers clearly create information at a much more rapid rate, and it is the monumental resources being invested in this line of cutting-edge research which give the lie to Bill's contention that information is conserved during computation and that somehow "blind search" rates as a primitive last resort.
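The difference that expanding parallelism makes can be seen with a toy model of my own (it is not a description of any particular quantum algorithm): if the number of active processors doubles at every step, the configurations generated up to step T grow like 2^T rather than like T, so the time term of relation 7.0 grows like T bits rather than Log(T) bits.

```python
# Toy comparison of the time term of relation 7.0 under fixed parallelism
# versus expanding parallelism in which the number of processors doubles
# at every step. (An illustrative model only.)

from math import log2

def time_term_fixed(T):
    """Configurations generated ~ T  =>  time term ~ Log(T) bits."""
    return log2(T)

def time_term_expanding(T):
    """Processors double each step, so configurations ~ 2**T - 1  =>  ~ T bits."""
    return log2(2**T - 1)

for T in (10, 20, 40, 80):
    print(f"T = {T:3}:  fixed ~ {time_term_fixed(T):5.1f} bits,  "
          f"expanding ~ {time_term_expanding(T):5.1f} bits")
```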


***

As far as the big evolution question is concerned I regard this matter with studied detachment. God, as the sovereign author of the cosmic story, could introduce information into the cosmic configuration generator using either or both terms in relation 7.0; in particular, if, unlike primitive humanity at our current technological juncture, God has at his fingertips the power of expanding parallelism to crack the so-called blind search problem, the second term on the right-hand side of 7.0 has the potential to become significant. Accordingly, I reject NAID's wrongly identified "blind natural forces" category when those forces are in fact highly sophisticated because they are in the hands of Omniscient Omnipotence. The trouble is that the NAID community have heavily invested in an anti-evolution culture and it looks like they've passed the point of no return, such is their huge social and tribal identification with anti-evolutionism. Ironically, even if bog-standard evolution is true (along with features like junk DNA) we are still faced with the Intelligent Design question. As for myself, I have no indispensable intellectual investment in either the evolutionist or anti-evolutionist positions.

***


As I have remarked so many times before, what motivates NAID (& YEC) culture's aversion to the idea that information can be created by so-called "blind natural forces" is this culture's a priori anti-evolution stance. Underlying this stance, I propose, is a subliminal gnosto-dualist mindset, and this mindset in its subliminal form afflicts Western societies across the board, from atheism to authoritarian & touchy-feely expressions of Christianity; in fact Western religious expression in general. But that's another story. (See for example my series on atheist Don Cupitt - a series yet to be completed.)

What's compounded my problem with NAID & YEC nowadays is their embrace of unwoke political culture, a culture which automatically puts them at odds with the academic establishment. I'll grant that that establishment and its supporters have often (or at least sometimes) subjected outsiders (like Bill, for example) to verbal abuse and cancellation (e.g. consider Richard Dawkins & the Four Horsemen, RationalWiki etc.). This has helped urge them to find friends among the North American far-right academia-hating tribes and embrace some of their political attitudes (see here). As I personally by and large support academia (but see here), it is therefore likely that I too would be lumped together by the NAID & YEC communities as a "woke" sympathizer, even though I reject any idea that the problems of society can be finally fixed by social engineering initiated centrally, least of all by Marxist social engineering. But then I'm also a strong objector to far-right libertarian social engineering which seeks a society regulated purely by a community's use of their purses (and which is then prey to the chaotic non-linearities of market economics and power grabbing by plutocratic crony capitalists). In today's panicked and polarized milieu the far right would see even a constitutional Royalist like myself, who is also in favour of a regulated market economy, as at best a diluted "socialist" and at worst a far-left extremist, ripe for the woke-sin-bin!



NOTE: An article on "Conservation of Information" has recently popped up on Panda's Thumb. See here: Conservation of arguments

Friday, January 24, 2025

Let's Carry on Carriering Part IV



In this post I continue analyzing a web post by self-recommending professional atheist Richard Carrier. 

The other parts of this series can be seen in the links below:

Quantum Non-Linearity: Let's Carry on Carriering Part I

Quantum Non-Linearity: Let's Carry on Carriering Part II

Quantum Non-Linearity: Let's Carry on Carriering Part III

Before going on with the rest of Richard's post, below I recap Richard's proposition 8 and comment on it again.

***


RICHARD: Proposition 8: If every logically possible thing that can happen to Nothing has an equal probability of occurring, then every logically possible number of universes that can appear has an equal probability of occurring.

This is logically entailed by the conjunction of Propositions 6 and 7. So again it cannot be denied without denying, again, Proposition 1.

MY COMMENT: In the above quote Richard is telling us that, given this entity he calls Nothing, we can infer that every logically possible universe that can arise from Nothing has an equal probability of occurring. As I have said in the previous parts, probability isn't a property of the object we are taking cognizance of (in this case the object is Nothing) but a function of the observer's level of knowledge about the object; the negated way of saying the same thing is that probability is a measure of the observer's ignorance. In the above, therefore, Richard is merely telling us that he has no idea what Nothing is capable of generating and that all logically feasible bets are therefore of equal probability; this equality correctly follows from the principle of equal a priori probabilities, a principle which applies to any observer who has no information leading him/her to expect one bet over another. In this instance one of those bets includes whether or not Nothing will generate the high contingencies and complexities of random patterns. Because Richard is admitting that he knows 0.0% of nowt about Nothing, these observer-based betting odds say nothing at all about what Nothing will actually generate.
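The point that equal a priori probabilities encode an observer's ignorance rather than a physical property of the object can be given a quick numerical illustration (a sketch of my own, with toy numbers): among distributions over N mutually exclusive bets, the uniform assignment is the one with maximum Shannon entropy, i.e. the assignment carrying the least information about which outcome will occur.

```python
# Equal a priori probabilities as a statement about the observer: the uniform
# distribution over N outcomes maximizes Shannon entropy, i.e. it encodes the
# least knowledge about which outcome will occur. (Toy numbers only.)

from math import log2

def entropy_bits(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(x * log2(x) for x in p if x > 0)

N = 4
assignments = {
    "uniform (total ignorance)": [1 / N] * N,
    "partial knowledge":         [0.7, 0.1, 0.1, 0.1],
    "complete knowledge":        [1.0, 0.0, 0.0, 0.0],
}

for name, p in assignments.items():
    print(f"{name:26}: entropy = {entropy_bits(p):.2f} bits "
          f"(max possible = {log2(N):.2f})")
```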

I will now continue analyzing Richard's post from where I left off in the last part (Part III).  From this point on his work is a teetering tower of endeavor with its foundations resting upon the implicit assumption that probability is a physical property which he also assumes must logically entail a totally random process; that is, a process which generates random patterns. 

***


RICHARD: And this is true regardless of the measure problem. There are lots of different ways you can slice up the “outcome” of a totally random process that’s unlimited in how much can happen—how much “stuff,” and in how many configurations, that can arise. But insofar as the “stuff” that pops out is connected to other stuff, it necessarily causally interacts with it, and that logically entails a single causally interacting “system,” which we can call a “universe” in a relevant sense. But when there is Nothing, nothing exists to make it even likely, much less ensure, that only one such “universe” will randomly materialize.

Of course, even within a single causally interacting “system,” (= a "universe") and thus within a single “universe,” it is not necessarily the case that every part of it will have the same contents and properties. Eternal inflation, for example, entails an initial chaotic universe will continue splitting off different bubble universes forever, and everyone will have different laws, contents, and properties, insofar as it’s possible to. And this is actually what we usually mean by “universe” now: one of those regions of the whole metaverse that shares a common fundamental physics (the same dimensionality of spacetime, the same fundamental constants, and the same causal history). Other regions may differ, e.g. if we fly far enough in space, maybe a trillion lightyears, we might start to enter a region of the universe where the laws and constants and shape and contents start to change.

MY COMMENT: Probabilities are defined in terms of ratios over sets of possibilities. The measure problem concerns the difficulty observers have in defining probabilities when trying to form ratios from ill-defined sets of possibilities, particularly potentially infinite sets of possibilities. If one is faced with sets of possibilities for which it is not easy to define clear-cut size comparisons, then calculating probabilities (which are based on ratios of possibilities) becomes problematic. In this instance Richard is discussing the question of what class of possibilities constitutes what we would like to call a "universe" and how we measure the probability of a "universe" against the immense set of "all" possible universes. The comparison of these spectacularly vague and huge sets, and the accompanying calculation of the relevant probabilities, are sensitive to the methods of comparison. (See here.)
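A standard toy illustration of that sensitivity (my example, not Carrier's) is the "probability that a positive integer is even": the limiting ratio depends on the order in which the integers are enumerated before the ratio is taken.

```python
# Toy illustration of the measure problem: the "probability that an integer
# is even" depends on how the integers are enumerated before taking the
# limiting ratio. Both enumerations below cover exactly the same set.

def fraction_even(enumerate_integers, n_terms=600_000):
    evens = sum(1 for k in enumerate_integers(n_terms) if k % 2 == 0)
    return evens / n_terms

def natural_order(n):
    """1, 2, 3, 4, ...  ->  evens have limiting density 1/2."""
    return range(1, n + 1)

def two_odds_then_an_even(n):
    """1, 3, 2, 5, 7, 4, 9, 11, 6, ...  ->  evens now have density 1/3."""
    out, odd, even = [], 1, 2
    while len(out) < n:
        out += [odd, odd + 2, even]
        odd += 4
        even += 2
    return out[:n]

print("natural order :", fraction_even(natural_order))          # ~0.50
print("reordered     :", fraction_even(two_odds_then_an_even))  # ~0.33
```

The ratio changes even though the underlying set of integers is identical; this is the kind of method-sensitivity that makes probabilities over ill-defined infinite sets of possible universes so slippery.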

At the start of the above quote Richard is telling us that his proposition 8 isn't affected by the measure problem; well, that may be true: For his purposes it is often enough to show that one set is clearly much, much larger than another, thus implying that the probability in question is all but zero and therefore its complement is all but unity. But for Richard to take us any further one first has to swallow his two seemingly unconscious assumptions: Viz:

1. That probabilities are an intrinsic property of an object when in fact they are an observer-relative, extrinsic property in so far as they are a function of an observer's knowledge about that object. 

2. That the existence of a probability necessarily implies something capable of generating random patterns (certainly not true!).

Regarding assumption 1: Going over the point I have repeatedly made: The one-to-one, element-by-element comparison between two sets needed to create ratios of possibilities and underwrite the calculation of the probability of a universe is only intelligible if we first assume, a priori, the existence of highly sophisticated third-person observers for whom probability ratios (which are a measure of observer information level) are meaningful and interesting. Without the assumed existence of sufficiently cognitively sophisticated observers, probability is an unintelligible notion. Probability is not a property of something "out there", whether of universes or anything else; it is a measure of an observer's information about the object in question.
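
A toy example of my own may help fix this idea: two observers contemplating exactly the same object can legitimately assign it different probabilities because they hold different information about it.

```python
import random

# Probability as a measure of an observer's information rather than a
# property of the object itself. A coin is tossed once; the outcome is
# now a fixed fact "out there".
random.seed(0)
outcome = random.choice(["heads", "tails"])

# Observer A has no information about the outcome and so, by the
# principle of equal a priori probabilities, assigns 0.5 to heads.
prob_heads_for_A = 0.5

# Observer B has peeked at the coin: same object, different information,
# and therefore a different (degenerate) probability assignment.
prob_heads_for_B = 1.0 if outcome == "heads" else 0.0

print(outcome, prob_heads_for_A, prob_heads_for_B)
```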

Regarding assumption 2: That this seems to be some kind of habit of mind is more than hinted at when Richard speaks in the above quote of “the stuff” that pops out, which presumably is the “outcome” of a totally random process. However, to be fair to Richard, it is true that the term "probability" is often used as a metonym for randomness, because the algorithmic intractability of random patterns makes them difficult to know and therefore random patterns very often entail a probability. One of Richard's pratfalls, I think, is that he has conflated the use of the term "probability" as a metonym with the object it is frequently associated with (i.e. random patterns). As we will see, given a probability he wrongly infers that he has in his hands a random generator of universes; a non sequitur if ever there was one. 

***


RICHARD: However, we needn’t account for this in what follows. If it is the case—in other words, if universes in the broad sense (causally interacting systems) can themselves contain even more universes in the narrow sense (regions of a shared fundamental physics), then what follows, follows with even more certainty. Because then there are even more “universes” to make the point with. You will notice eventually how this simply makes the math even stronger, and gets us to the same conclusion with even greater force. Because all adding this does to the math, is increase how many universes a Nothing will inevitably randomly produce.

MY COMMENT:  If for the sake of argument we allow Richard's two assumptions above to slip past us then it's true that the measure problem doesn't affect his conclusion: although we may be unable to come up with rigorously correct ratios of possibilities, it is often clear that the sets of possibilities Richard is comparing are vastly different in size, and so the probabilities concerned are as near as can be to either 100% or 0%.

But the conclusions Richard draws from this exercise of probability calculation are based, once again, on the falsehood I've emboldened at the end of the above quote: Viz: Richard thinks he's proved to himself that the logical truism he calls Nothing will inevitably randomly produce...“stuff” that pops out.

Richard's argument is that if we do at least know we are dealing with huge numbers of possible universes, this only adds more grist to his mill by feeding his gluttony for immense numbers of possible outcomes. But unfortunately for Richard there is no wind or water to drive his mill: as we can see from his last sentence above, he's assuming that observer-defined probabilities necessarily entail a random pattern generator which he hopes will drive his system of universe creation. Well, whatever complex logical necessities Nothing contains, one thing is clear: the generation of random patterns is not known to logically follow from the Unknown and Mysterious logical necessity Richard calls "Nothing" and about which Richard can tell us very little. And again: the generation of random patterns doesn't follow as a logical necessity from observer-defined probabilities, whether those probabilities are calculated correctly or not.

Richard's misconceptions about probability and randomness continue to run through his thinking; he needs to revisit these bad habits of mind. 

***


RICHARD: The converse is also true. If it is somehow the case that there can’t be disconnected systems, that somehow it is logically impossible for Nothing to produce multiple “universes” in the broad sense, then it must necessarily be the case that it will produce, to the same probability, multiple universes in the narrow sense. Because there is only one possible way left that it could be logically impossible for both (a) Nothing to produce more than one causal system and (b) that system be entirely governed by only one physics, is if this universe we find ourselves in is the only logically possible universe. And if that’s the case, then we don’t need any explanation for it. All fine-tuning arguments sink immediately. The probability of any universe existing but this one (given that any universe exists at all) is then zero. And the probability of fine tuning without God is then exactly and fully 100%.

MY COMMENT:  A largely valid point here: Richard is admitting that Nothing is such a big Unknown that it is conceivable that, by some logic we don't yet understand, Nothing entails that only one causally connected universe can exist and that this is the universe we observe (perhaps as only a small part of a much broader causally connected universe). But I doubt he'll bite this bullet: his concept of Nothing is his subliminal stand-in for "The Unknown God" in so far as this mysterious Nothing somehow implies the highly organized universe we see around us. 

***


RICHARD: I doubt any theist will bite that bullet. I’m pretty sure all will insist that other universes are logically possible. 

MY COMMENT:   Theist or not, I think we can be agnostic about whether or not other universes are logically possible. After all, we know so little about this mysterious entity which Richard keeps calling "Nothing"; we don't even know whether the logic of Nothing rules out cosmic configurations that otherwise seem logically possible to us. 

***

RICHARD: And if other universes are logically possible, it must necessarily be the case that it is logically possible either for different regions of a universe to exhibit different physics or different universes as closed causal systems to exist (with, ergo, different physics). Therefore, by disjunctive logic, if the second disjunct is ruled impossible (“different universes as closed causal systems can exist”), the first disjunct becomes a logically necessary truth (“different regions of a universe can have different physics”). Even if one were to say “there are infinitely many outcomes logically equivalent to a single universe with a single uniform physics” and “therefore” there are as many such outcomes as any version of multiverse and so “it’s fifty fifty” or “the measurement problem gets you” or whatever, Cantor strikes: as all the infinite such possible universes are already contained in possible multiverses and yet there are infinitely many more multiverses possible which cannot be included in the previous infinite set, the cardinality relation of possible multiverses to possible singleverses is still infinitely more; ergo, the probability of getting “a singleverse” rather than “a multiverse” is infinity to one against.

MY COMMENT:    Yes, I agree that the number of possible multiverses, if compared against the number of possible singleverses, will be infinitely greater. But if this relationship is to be transformed into a probability as per the last sentence (viz. the probability of a singleverse against a multiverse), Richard once again must assume the pre-existence of a sufficiently sophisticated observer to make the calculation of his probability meaningful. But Richard's logic here, although valid, is irrelevant; these observer-relative probabilities imply nothing about what Richard's "Nothing" will in actual fact generate. 
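
For reference, the set-theoretic fact Richard is leaning on can be sketched as follows (my notation and construal, in which a possible multiverse is identified with a collection of possible single-physics universes):

```latex
% Cantor's theorem: for any set, its power set is strictly larger.
% If U is the set of possible single-physics universes and a multiverse
% is construed as a subset of U, then
\[
  |\mathcal{P}(U)| > |U| ,
\]
% so, on this construal, the possible multiverses outnumber the possible
% "singleverses" in the sense of strictly greater cardinality.
```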

***

RICHARD: Therefore, when there are no rules governing how many “universes” can randomly arise from Nothing, there must necessarily be either a random number of universes in the broad sense (causally separated systems) or a random number of universes in the narrow sense (regions of different physics within a single causal system), or both. Including, of course, the possibility that that number, either way, will be zero. Which is what it would mean for Nothing to produce nothing, to remain eternally nothing. Ex nihilo nihil, in other words, is simply describing one possible outcome of a true Nothing: the outcome of there being zero things arising.

But as we just confirmed, there is no rule or law that entails the number of things that will arise uncaused from Nothing is zero. In fact, zero is just one possibility out of countless other possibilities: countless other numbers of things, and thus universes, that can arise. And Proposition 6 entails each possible outcome has the same probability as each other possible outcome. Which means no outcome (such as “zero”) is more likely than any other (such as “one” or “ten billion” or “ten to the power of twenty trillion”). Hence, Proposition 9....

MY COMMENT:   And again, the bulk of the deliberations above are irrelevant. Richard's attempt to make numerical comparisons between classes of possible universes, and thus arrive at one or other end of the probability spectrum, is futile without building in his two hidden prior assumptions. To repeat: 1. the a priori existence of a sufficiently sophisticated cognitive perspective to make the probability calculations meaningful; 2. in this particular connection, the a priori existence of the super contingency of random pattern generators to give a meaningful hook to the observer's probability calculations.

That Richard's "Nothing" is a huge Unknown to him is evidenced by the fact that above we find him considering the case where, for all he knows, Nothing has no known rules to limit the classes from which probabilities can be calculated. He then, yet again, wrongly thinks that from these probabilities he can logically infer a random pattern generator.  Moreover, random pattern generation is a rule in itself which contradicts any notion that Nothing has no rules. 

***

RICHARD: Proposition 9: If when there is Nothing every possible number of universes has an equal probability of occurring, the probability of Nothing remaining nothing equals the ratio of one to n, where n is the largest logically possible number of universes that can occur.

MY COMMENT:  Given that our Richard is admittedly working completely in the dark as to what the logic of Nothing entails, it is true that in such an advanced state of ignorance every possible universe has an equal probability of being generated by Nothing; and this includes the possibility of literally nothing being generated by Nothing. Well, there is only one way to generate absolutely nothing, so Richard is right in telling us that the probability of nothing is 1/n, where n is the largest logically possible number of universes that can occur. But yet again: we can't move from this state of hyper ignorance, expressed as a probability, to the conclusion that a random generator of universes is at work. The quantified ignorance expressed by a probability evaluation tells us nothing about what Nothing will actually generate, least of all whether it will generate the hyper-complexities of random patterns.
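
For the record, the arithmetic being conceded here can be written out as follows (my reconstruction, counting the possible outcomes as 0 to n universes, which for large n gives essentially Richard's 1/n):

```latex
% Principle of indifference over the possible universe-counts k = 0, ..., n:
\[
  P(k) = \frac{1}{n+1} \approx \frac{1}{n} \quad \text{for each } k,
  \qquad
  P(\text{Nothing remains nothing}) = P(k=0) = \frac{1}{n+1}
  \;\longrightarrow\; 0 \ \text{as } n \to \infty .
\]
% This quantifies the observer's ignorance; it does not, by itself,
% license the inference that a random generator of universes exists.
```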

***

RICHARD: But Proposition 6 entails n is transfinite. There is no maximum possible universes that can arise. This creates difficulties for continuing mathematically here, because no one has fully worked out a mathematics of transfinite probability. We can bypass that problem, however, the same way Archimedes originally did, by adapting the Method of Exhaustion. We’ll get there in a moment.

MY COMMENT:  No dispute that n is transfinite. But you can bet there are going to be huge difficulties in defining intelligible probabilities here, because measure problems make the definition of coherent ratios of possible universes highly problematic. But let's wait and see what Richard's method of exhaustion entails. Something to look forward to in Part V.  

***


RICHARD: Proposition 10: If Nothing produces a random number of universes, nothing exists to prevent the contents of each of those universes from being equally random.

In other words, if it is logically possible for any universe, upon coming into existence, to have a different set of attributes than another, then each possible collection of attributes is as likely as every other. This follows by logical necessity from the absence of anything that would make it otherwise. And Nothing lacks everything, including anything that would make it otherwise. To deny this Proposition therefore requires producing a logical proof that some logical necessity makes it otherwise. Good luck.

MY COMMENT: Richard has not established that Nothing generates universes at random. All we've seen is that, from carefully measured human ignorance expressed as probabilities, he has assumed that this mysterious object he calls Nothing at least has the possibility of generating the high contingencies & complexities of randomness. In fact in the above he does venture to assert something about Nothing; namely, that Nothing lacks everything, including anything that would make it otherwise. And yet he's somehow inferred that if Nothing produces a random number of universes, nothing exists to prevent the contents of each of those universes from being equally random. That is, he's allowing Nothing the possibility of generating the highly sophisticated complexes of random patterns. He has inferred that a lack of logical restriction logically entails the possibility of random patterns being generated. So, Richard, where's the logical proof that there is some logical necessity which allows Nothing the possibility of generating these high contingencies? Good luck with that one, Richard!


...to be continued