
Tuesday, August 05, 2025

Classic Dualism


Courtesy of the Faraday Institute

I'm part of a Facebook group called Evangelicals for Evolutionary Creation. This is not to say that I've committed myself to standard evolutionary thinking, but I feel that the group's members are thinkers worth keeping an eye on. However, somebody posted the following comment on the group's Facebook feed....

So I’m getting toward the end of Origins by the Haarsmas. A question arises, if abiogenesis is true, how does this not prove that life can happen without God? This kind of concerns me and it seems to be an open question in evolutionary creationism.

I believe that "Haarsmas" is a reference to Deborah Haarsma, the current president of BioLogos, the Christian evolutionary creation organisation. I didn't comment on this statement as the Evolutionary Creation people are more than capable of critiquing such a breathtakingly naive perspective, a perspective with widespread appeal among both Christians and atheists. On this view it's a binary choice: "Either God did it or evolution did it".

I've no doubt said something like the following many times before: since the Enlightenment, Western science has merely shown us that the cosmos is sufficiently organized for us to form succinct mathematical statements describing its dynamics. As many Christians fully understand, those descriptions in and of themselves only tell us about the "how?" of the cosmos and not the "why?" - but the "why?" is only a meaningful question if one first accepts that sentience, intelligence and purpose are a priori features of existence.

If anything this strange mathematical descriptive elegance only compounds the enigma of the cosmos and tells us little about absolute origins; that is the ultimate gap, a gap that descriptive science is logically incapable of filling and which, if pressed, simply leaves us with an elegant-descriptions-all-the-way-down regress. In fact, since we have no logically obliging reason for the continued existence of the contingencies of our cosmic reality, that ultimate gap is everywhere and everywhen.

And yet the dualistic view expressed by the above quote is the common default: that is, "either God did it or cosmic processes did it". The underlying assumption of this perspective is that somehow the enigma of cosmic organization has a logical self-sufficiency which at best only leaves room for the God of deism or at worst no God at all. Such a perspective might have its origins in the early enlightenment/industrial era, when it started to become much clearer that mechanisms (such as the steam engine's governor, and automata) could be developed which meant that machines looked after their own running. The popularist conclusion was that the cosmos must be that kind of mechanism. Such mechanisms appeared not to need any prayerful ritualistic support or mystical input of any kind to continue. On this perspective sacredness seems to have been purged from what was now thought of as a self-sustaining profane cosmos.

But the realization that such mechanisms were startlingly sophisticated, sophisticated enough to beg the question of their design, seems to have been lost on many people. One such person in our modern era is the (atheist?) theologian Don Cupitt of the Sea of Faith movement. Also, blowhard atheist Richard Carrier is of this ilk. Carrier is so convinced by the sophistry of his flawed view of probability and randomness that he believes probability to be logically sufficient to fill in the God-gap. And yet Carrier did succeed in identifying that our cosmic context lacks some logically self-sufficient kernel, even though his erroneous concept of probability doesn't provide that kernel.


***

It is surely ironic that the selfsame virtuoso cosmic organization which for some fills in the God-gap actually intensifies the nagging enigma of the absolute origins question; the contingent particularity of that organization is amazing. In fact, as I have shown, evolution itself (if it has occurred) is effectively creationism on steroids. And yet it is the underlying dualism of God vs evolution that much of the North American Intelligent Design movement (NAID) trades on. They will deny it of course, but whenever they open their mouths it is easy to see that they are exploiting the popularist God-of-the-gaps "intelligence vs blind natural forces" dichotomy. To attack standard evolution on the scientific basis that the evidence is insufficient is one thing, but to attack it on the basis of a half-cocked dualist philosophy is quite another. I put it to the NAID community that although they affect to claim theirs is a scientific dispute, their ulterior reasoning is in fact driven by the popular appeal of their philosophical dualism, whatever they might claim. That appeal is understandable, I suppose, because the above quote from a Facebook page is in fact the tip of a huge market iceberg of popularist thinking which NAID's dichotomized explanations address and by which they make their money, trade and continue in mutual backslapping. For more on NAID see here, here and here.



NOTE: Luskin's God-of-the-Gaps paradigm

As I've made clear before, I don't think much of NAID theorist Casey Luskin's competence as an apologist for Intelligent Design. This post on Evolution News, which describes Luskin's views, cements his reputation as a God-of-the-Gaps apologist. As I've said above, I have no intellectual commitment to standard evolutionary theory, but what is clear, evolution or no evolution, is that one cannot get away from the question of intelligent design. That Luskin is so anti-evolution, a priori, is evidence that he still thinks subliminally in dualist and atheist categories in so far as he believes it to be a choice between "blind natural forces vs intelligent design"..... where he interprets evolution atheistically in terms of "blind natural forces". Ergo, Luskin is a God-of-the-Gaps apologist whatever he claims.

Saturday, August 31, 2024

Examining Mr. R. Carrier's use of Bayesianism. Part IV


A gross theological caricature


(See here for Part I, Part II and Part III)

In part IV of this series, I'm continuing to comment on the following post by a Mr. Richard Carrier:

Why the Fine Tuning Argument Proves God Does Not Exist • Richard Carrier Blogs 

As Richard stares out at our strange cosmos and considers the question of theism and whether or not a cosmos like ours would have been produced by the kind of God conceived by most theists, this is what he thinks:

It cannot be predicted that this [Universe] is what a God would produce, or that it is what he would want to produce. Whereas it is exactly 100% predicted to be what we’d see if there was no God

I would certainly question Richard's second sentence here: what kind of universe/cosmos would I have predicted if there was no God? As we saw in the previous parts, I certainly wouldn't have predicted our own remarkable universe in all its organized complexity, its surprising organized contingencies and above all an organization which gives it a very strong propensity to generate life....... especially that propensity to generate complex organic objects! After all, only in recent history have humans started to master systems capable of generating other systems. Why wouldn't I have predicted all this in the absence of God? .... because the evidence of our experience is that organization of all types, both simple and complex, is associated with the activity of human (and animal) intelligence. Therefore, when I see a cosmos so organized that we can distill out of it those highly succinct mathematical laws of physics, laws which are crucial for the generation & maintenance of life, my intuitions turn to thoughts of an a priori intelligence being active.

Moreover, the fact is that the laws we distill from cosmic organization can never have the property of Aseity (that is, of self-explanation). This is because these laws are mathematically descriptive devices destined always to leave us with a hard core of irreducible, incompressible and enigmatic contingent information; those laws are therefore logically incapable of delivering the logical necessity of Aseity. Some atheists at least do understand this. Take for example atheists Galen Strawson and Sean Carroll: both appear to understand that all probing human inquiry into the form and pattern of the cosmos must eventually bottom out with unexplainable brute fact: Aseity is beyond the reach of conventional descriptive science. This is a mathematical truism. See the following links for more details...

Quantum Non-Linearity: Galen Strawson on "Why is there something?" (quantumnonlinearity.blogspot.com)

Quantum Non-Linearity: Something comes from Something: Nothing comes from Nothing. Big Deal (quantumnonlinearity.blogspot.com)

There have been some who have tried to get round all this by suggesting that somehow quantum mechanics can be used to redefine nothing in such a way that it tells us how it is possible to get something from nothing. But this line of thought rests on mere empty linguistic tricks: one can use the same tricks to claim that it simply amounts to a redefinition of something! (See footnote *2)


***

And yet I'm inclined to agree with Richard's first sentence in the quote above: I don't think I could have predicted that the kind of God I think I know would have created the specifics of our universe, not only because of its strange impersonal and dispassionate vastness but also because of the much closer to home, well-aired and time-honored conundrums around suffering and evil. Yes, I might have predicted a highly organized universe, but organization covers a multitude of possibilities, and it seems a multitude of sins. So, I do have some sympathy with honest atheists on this point. (But types like Richard don't want sympathy & measured opinions; they want abject submission to their thinking; his attitudes match those of the hardened fundamentalists of Biblical literalism.)

Moreover, based on our experience of intelligent activity in this world (which by & large is human and animal) we have to admit not only that intelligent activity has an immense space of creative options open to it, making anticipation of specific activity in the absence of evidence all but impossible, but also that intelligent activity has a fair measure of inscrutability. For example, the ancient stone circles we see dotted around Europe entail a high level of organization both in their configuration and in the logistics of their construction, and yet as to their purpose we have to resort to hypothesis and speculation. Furthermore, coming from a vacuum of evidence I could not have predicted from first principles that early cultures (probably as a consequence of that time-honoured search for cosmic meaning & purpose) would build stone circles. Because of the huge variety open to intelligent behavior I can't move from an evidential vacuum to stone circles. But the reverse is possible: given the evidence of stone circles I can link that evidence to known aspects of the human psyche, a psyche I share. This means we have at least some inkling of the motives driving the human organization of inanimate objects and therefore have a chance of interpreting the meaning of this activity; in this case, the stone circles probably represent a culture's attempt to engage with the numinous and to give shape, meaning, and purpose to the universe; I personally think I understand that mystical endeavor.

Likewise, as we look out onto the cosmos itself, we observe high levels of organization in a pattern we couldn't predict even if we knew beforehand that a creating deity was behind it. But conversely, if we are sufficiently primed theists, we at least stand a chance of getting a purchase on cosmic purposes via theological hypothesis and speculation. But if we reject God's attempt at self-revelation and we reject the necessity of the epistemic bootstrap of faith (See Hebrews 11:3&6), we will remain as much in the dark about Divine purposes as we are about those enigmatic stone circles. For it is possible in my view to come up with at least a hypothesized framework as to the meaning of the cosmos. 


***

But now I ask myself this: what would I have predicted if there were no God of any sort? My first intuitive response to that question would be absolute empty nothingness; but this is patently not the case: our conscious perceptions tell us that the universe exists, and therefore we do have an evidential handle on this question. In fact, as I said in Part III of this series, if the evidence were that the universe is completely random (that is, a Big-R superverse), I would interpret that as evidence of the absence of the God I think I know. As Sherlock Holmes observed in the story of The Cardboard Box, where he was commenting on a particularly tragic crime...

“What is the meaning of it, Watson?” said Holmes, solemnly, as he laid down the paper. “What object is served by this circle of misery and violence and fear? It must tend to some end, or else our universe is ruled by chance, which is unthinkable. But to what end? There is the great standing perennial problem to which human reason is as far from an answer as ever.”

(See the introduction to my book on Disorder and Randomness where I first used this quote)

But whilst I'd agree that our intuitions suggest that Big-R points to atheism, the reverse isn't true: Viz: Given atheism I wouldn't have been able to predict a Big-R universe: The consequences of the absence of God are just as inscrutable as God himself. In any case a prediction of Big-R isn't a straightforward deduction from the absence of God. Let me explain...

Firstly, in a Big-R universe I wouldn't exist to perceive anything, and neither would anyone else. Being an idealist who regards conscious cognition and perception as an important underwriter of reality, I would therefore question the coherence and intelligibility of Big-R notions.

Secondly, randomness represents the very opposite of a logical truism. A logical truism, once understood, has zero surprisal value and therefore no information, whereas randomness has maximum surprisal value and maximum information. If you are looking for the logical necessity of explanatory completeness or aseity you won't find it in randomness. The existence of randomness entails maximum contingency and maximum mystery; it is first and foremost the very opposite of "necessity". It therefore explains nothing in the sense of explanatory completeness; rather it just leaves us with a conundrum as to who or what is managing to generate the most complex pattern of all, a pattern that requires a maximum of computational effort.
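To put some numbers on the surprisal point, here is a minimal Python sketch using the textbook Shannon definitions (standard material, not anything drawn from Richard's post): a logical truism carries zero bits, while each maximally random binary digit carries a full bit.

import math

def surprisal_bits(p: float) -> float:
    # Shannon surprisal: -log2(p) bits for an outcome of probability p.
    return -math.log2(p)

print(surprisal_bits(1.0))         # a logical truism: 0 bits (prints -0.0)
print(surprisal_bits(0.5))         # one fair coin flip: 1.0 bit
print(1000 * surprisal_bits(0.5))  # 1000 random coin flips: 1000 bits, maximal

# Randomness is maximally informative in Shannon's sense precisely
# because it is maximally contingent: nothing about it is necessary.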


***

In the following quote we find Richard continuing to dig even deeper into the hole he is already in....

RICHARD: Thus, Fine Tuning is not a “peculiar” thing for us to observe. It is not distinctive of God-made universes; it is, rather, distinctive of godless universes. It is literally the only thing we could ever observe—unless God existed and made the universe. Because only then could the universe possibly have been made conducive to life without the Fine Tuning of our peculiar fundamental constants. Hence God-made worlds will tend to not be Fine Tuned.

MY COMMENT: As we saw in Part III, so-called "fine tuning" is just a small facet of a much bigger story of a remarkable order, an order which has facilitated the human project of distilling out of its pattern some remarkably elegant mathematical forms which, from my standpoint, have a very divine feel about them. They look to be the very epitome of an incredibly intelligent design. And let me repeat: further "explanation" of these forms can never deliver aseity but could only ever further enhance the succinctness of their form; and increasing mathematical succinctness cannot be pushed to the point where nothing is left to compress; an incompressible kernel of contingency will always remain, using mathematics as we know it.
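The point about compression can be made concrete with a toy experiment. Below is a minimal Python sketch of my own (it uses the standard zlib compressor; the figures in the comments are typical, not exact): a "lawful" periodic sequence compresses down to almost nothing, whereas a random sequence is already incompressible.

import os
import zlib

lawful = b"AB" * 50_000             # highly ordered: generated by a tiny rule
random_bytes = os.urandom(100_000)  # maximal contingency: no generating rule

for name, data in (("lawful", lawful), ("random", random_bytes)):
    packed = zlib.compress(data, 9)
    print(name, len(data), "bytes ->", len(packed), "bytes")

# Typically the lawful sequence shrinks to a few hundred bytes, while the
# random sequence barely shrinks at all (it may even grow slightly): its
# information content is an incompressible kernel.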


***

Richard Carrier has a very low view of our cosmos. In spite of its exceptional and highly stable order, an order strongly conducive to the emergence and maintenance of life, Richard still courts the Big-R hypothesis, the random bizarro universe that can be used to explain away anything. Take a look at the following...

RICHARD: This is a crucial realization. Fine Tuning of our observed fundamental constants is only necessary when a God is not doing the designing; it is only necessary when observers only evolve through billions of years of gradual cellular scaffolding, and life at all arises only by chance chemical mixing, and only after billions of years of the meandering random mixing of chemicals across a vast universe billions of light-years in size filled with random lifeless junk, which is almost everywhere lethal to life, and only hospitable to it in tiny specks of the chance arrangement of randomly mixed conditions. Only those conditions require Fine Tuning. Quite simply put: only Godless universes have to be Finely Tuned.

Which means when you observe a universe like ours (old, huge, deadly, and producing life only in the most awkward of ways and rarest of places), you can expect it to have been Finely Tuned by chance accident, not intelligent design. Intelligent design would more likely make a universe as large and old as needed to contain the life it was made for, and would create life directly (not employ billions of years of cellular scaffolding), and imbue the world with only those laws of physics needed to maintain it to its purpose (no weird fundamental constants, no weird fundamental particles). It would not produce a universe almost entirely hostile to life. There would be no lethal radiation-filled vacuum. No dead worlds or lifeless moons. Stars would not be uninhabitable monstrosities. Black holes would never exist.

MY COMMENT: And again: chance fine-tuning is a very bad argument for atheism; it neglects that the values of the "fine-tuned" variables only make sense in the context of the highly organizing effect of a set of remarkable laws, and that the variables together with those laws constitute preconditions which considerably enhance the chance of life. As I've said above, because of the huge space of possibilities open to intelligence, and on top of that intelligence's inscrutability, it is difficult to anticipate in advance what intelligence will do. But the reverse is an easier path: given the works of intelligence we, as intelligences ourselves, can work backwards with a chance of interpreting the purpose of those works. To my mind all those dead worlds are the evidence of a search, reject and select computation, a declarative procedure that may well use teleological constraints.
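As an illustration of what I mean by a search, reject and select procedure working under a teleological constraint, here is a minimal Python toy of my own devising (the target string and the selection rule are purely illustrative assumptions, not a model of cosmic evolution):

import random

TARGET = "stone circle"                  # the teleological constraint
CHARS = "abcdefghijklmnopqrstuvwxyz "

def fitness(candidate: str) -> int:
    # The "select" criterion: count positions matching the target.
    return sum(a == b for a, b in zip(candidate, TARGET))

best = "".join(random.choice(CHARS) for _ in TARGET)
while best != TARGET:
    i = random.randrange(len(TARGET))    # generate a trial variation...
    trial = best[:i] + random.choice(CHARS) + best[i + 1:]
    if fitness(trial) >= fitness(best):  # ...then reject or select it
        best = trial
print(best)

The dead ends this loop discards vastly outnumber the variations it keeps; on the picture I'm sketching, the "dead worlds" are the discarded trials of just such a computation.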

The emphasis on fine-tuning in Richard's quote above completely misses the point; namely, that what is actually being fine-tuned is a remarkable cosmic computation machine of immense dimensions. And yet according to Richard's theology God simply doesn't do things like this; instead, God does things without logic and without sequence; it is ironic that Biblical literalists often think in a very similar way. But contrary to this kind of thinking is the evidence of our experience of the way intelligence works: Viz: it works using an experimental search, reject and select activity; the cosmos appears to be a tableau of intelligent activity, a tableau of creative activity.

And while I'm here, a note to self: here's a speculation for me to think about. The fine-tuning constants could have many non-zero digits after the decimal point. Therefore, if ordinary parallel processing rather than expanding parallelism is the search method being used to develop the cosmos, the fine-tuning constants could be a sneaky way of feeding information, a priori, into cosmic evolution, thereby speeding up the search.

***

Epilogue

In Part III I introduced the idea that the cosmos can be thought of as a fantastically large computation, a computation which is expressible in a very abstracted form as an equation relating the information content of the created configuration to a function of two variables: 1) The starting information and 2) the minimum possible number of computational steps. This equation looks something like this: 

I = S + Log T

Equation 1

Where I is the information content of the configuration created, S is the minimum length of the algorithm needed to generate the configuration, and T is the minimum number of execution steps. See here where I give more details on this relation (see also here). For a parallel computation the time taken will be proportional to T, but if, as I feel is entirely plausible for our universe, expanding parallelism is somehow being employed, the computation is achieved much faster.
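To get a feel for Equation 1, here is a small Python sketch with toy figures of my own (a base-2 logarithm is assumed, so the quantities are in bits). The point to notice is how feeble the Log T term is for a conventional parallel computation:

import math

S = 1000                       # bits of starting information
for T in (10**3, 10**9, 10**18):
    I = S + math.log2(T)       # Equation 1 with a base-2 log
    print(f"T = {T:.1e} steps -> I = {I:.0f} bits")

# Multiplying the number of steps T a million-fold adds only about 20 bits
# to I: without expanding parallelism, information accrues at a painfully
# slow logarithmic rate.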

As we saw in Part III, according to the theology of Richard Carrier God, if he existed, would just do stuff abracadabra-style; that is, Richard takes it for granted that T ~ 0 and that creation has no sequential duration; in his theology God just does his stuff by downloading reified brute fact via his mighty magic commands. As we saw, this is also the theology of the Biblical literalists (see footnote *1 below for the theology of the North American ID community).

***

As I have said so often, there is a sense in which the elegant & succinct mathematical forms distilled from the high organisation of the cosmos "explain" absolutely nothing in the deepest sense of the word. Explanatory mathematical objects as we know them are less acts of explanation than compressed descriptions; as such they can never break the explanatory completeness barrier and deliver aseity.

Our world is just one of the possible worlds that can be reified from the platonic realm. This fact is going to be hard to take for those who hanker after the secular notion that somehow the so-called material world can be so closed-ended that it delivers an aseity of its own. Rather, it is just one of many possibilities that can be dragged out of the platonic world, reified, and, because of its organization, described with succinct "distilled" mathematical forms. It is in fact a work of art rather than a work of necessity; there is good art and bad art, but all is art, and art is but realized possibility. Our science gives us the pattern of the creation but not its fundamental origins; as many people have put it, the objects of science give us the "how" but not the "why?". But "why?" is only intelligible as a question in the context of an assumed a priori sentience; in the context of this assumed conscious cognition the concepts of intention, goal and purpose have meaning. So, is our ravenous curiosity going to be satiated with answers that merely tell us about the "how"? For some people at least that does seem to be the case.

As we try to make sense of the cosmos we use a combination of induction, abduction and deduction: the generalizations of induction sometimes help prompt the production of theories, but perhaps more often a theory is abducted with a giant intuitive leap of inspiration. Crucially, however, a theory arrived at by inspiration must then be tested via the predictions of deduction. This testing methodology has grown up around the relatively simple conceptual objects which control the physical regime, but it is a methodology that is far less effective when dealing with the inscrutabilities & complexities of the personal, the psychological, the sociological and above all the liminal world of the numinous. These phenomena are far too complex, erratic and full of exceptions to easily admit formal methods. The numinous in particular is the domain of anecdotal evidence, the domain of personal revelation.

***

The evidence of our senses is that our cosmos is highly organized, and that this unique organization facilitates those descriptive conceptual devices and tokens we call the laws of physics, which ride on top of, and can be intellectually distilled from, this order. That this order is being created and maintained everywhere and everywhen by an a priori intelligence is not an implausible proposition for many of us, even if for some it seems too large an epistemic step to make. But I'll concede that it is not a proposition that can be formally tested like the relatively simple physical regime can be tested; testing such a complex entity is more akin to testing the partially veiled and complex world of sociology and human thought. So, although individuals may feel they have tested their faith anecdotally, the anecdotes they tell won't convince everyone, least of all the evangelical atheists. But we do have this: theism has the potential to at least make sense of the cosmos in terms of purpose and meaning, whereas vanilla science, which only tells us the "how", cannot do this. Moreover, as an idealist I would contend that the reality of the particulate cosmos is unintelligible unless one first posits an a priori, up-and-running conscious cognition. Particulate matter only makes sense as the mathematical constructions of a conscious, thinking & perceiving sentience. For me Hebrews 11:3&6 is a necessary first principle of epistemology.

But of course, I can't expect an evangelical atheist like Richard to agree with any of this as it is very much dependent on personal anecdote rather than formal observational protocols. All I can advise is that people like Richard will just have to get out on their bikes and find some anecdotes of their own. As far as I'm concerned, all bets are still on!

***

Depending on how I feel I might complete this series by looking at Richard's tongue-in-cheek theology which he expresses in the picture that heads this post. Viz: God needs blood to fix the universe, but only his blood has enough magical power to do it, so he gave himself a body and then killed it. I wonder where Richard got his grist to come up with that one? I just wonder. The guilty parties probably know who they are.



Footnotes

*1 On North American Intelligent Design (NAID): Although I'm fundamentally an Intelligent Creation person I must once again disown any intellectual sympathy with this community, especially so as they fall into the welcoming embrace of the far-right, merging Christianity with politics. 

I personally don't have any intellectual commitment to the engine driving evolutionary change as currently conceived, and yet I would heavily criticize the line taken by the NAID community: they have entrenched themselves in a tribal culture which is married to a set of misleading conceptual clichés: Viz: anti-evolutionism, "blind natural forces", anti-junk-DNA, "chance vs necessity" and subliminal deism (see here for more). The NAID community make a sharp distinction between so-called "blind natural forces" and intelligent activity. The consequence is that they have adopted an epistemic filter which makes hard going of identifying the basics of the physical regime as a work of hyper-intelligence; thus, in a sense, chiming with Richard Carrier's view that the physical regime is a product of mindless blind Kaos; how utterly ironic!

If we assume that the cosmos is created and maintained everywhere and everywhen by the Divine will, then immediately the NAID category of "blind natural forces" becomes problematical. This is because in the context of intelligent creationism those forces can hardly be classified as blind or natural; in fact, the cosmos as the reification of artistic possibility rather than of necessity is highly unnatural. Although the NAID community are, by and large, like myself, old-cosmos creationists, they have nevertheless subliminally taken on board the category of God as a super-duper conjurer creating stuff instantaneously as fully formed configurations, stuff that just springs into existence like a rabbit out of a hat. If this statement of their views is caricatured and unfair they had better tell me why it is.

The particularly North American notion of God as a magician appears to be associated with the view that somehow the T term in Equation 1 classifies as a "natural force" and therefore that we must have T ~ 0. For them, admitting T >> 0 is an intolerable bogey that is shockingly close to admitting some kind of evolution; to them it is the evil thin end of the "natural forces" wedge of secularization. But in my opinion, for the Everywhere and Everywhen God, T is just as much a divine creation as is S.


*2 Footnote: Falling into the linguistic trap of "nothing":

Richard tells us this: 

Why Nothing Remains a Problem: The Andrew Loke Fiasco • Richard Carrier Blogs

 What I showed is that once you actually allow for there to be nothing—nothing whatsoever—then a quasi-infinite multiverse is the inevitable, in fact unstoppable outcome. Because removing all barriers to what there can be or what can happen entails allowing all potential outcomes an equal chance at being realized (given only a single constraint: that logically contradictory states have a zero probability of coming to pass). There is nothing there to prevent that, nothing around to keep “nothing” a stable absence of everything. “Nothing” is, by its own defining properties, unstable.


That's not how probability works. Probability isn't a dynamic capable of generating something from nothing: it is a measure of observer information. Moreover, the physics of probability is about describing random patterns, not about the "instability of nothing". Probability and randomness are in no way an argument for the impossibility of "nothing"; trying to use them to generate aseity is well beyond their scope of usage.

I've seen similar misinterpretations of the Uncertainty Relationship. As Richard is doing here, the principles of probability and randomness are glorified by raising them to the level of a kind of transcendent god-like dynamic or propensity capable of at least creating randomness from nothing. Those who do this don't see randomness as being merely the mathematical description of a class of pattern we meet in the universe rather than a transcendent creative dynamic.

Another point: the principle of equal a priori probabilities concerns human information levels. That in itself isn't a sufficient condition that automatically translates into reified patterns of randomness.

Monday, August 05, 2024

Examining Mr. R. Carrier's use of Bayesianism. Part III


My apologies for having to display this theology!

This is Part III of my series where I'm looking at the following post by a Mr. Richard Carrier...

Why the Fine Tuning Argument Proves God Does Not Exist • Richard Carrier Blogs

See here for the other parts: Part I & Part II. In the last part of this series, we left Richard wanting to take cognizance of all the evidence relevant to the question of the origin of the creation's "fine-tuning" constants.....

***

RICHARD: The real problem here is that this leaves out pertinent evidence. Because we are here testing two competing hypotheses to explain observations: either (A) chance accident produced that alignment of constants or (B) someone or something intelligently selected them.

MY COMMENT: As we saw in Part II, this statement of the problem isn't coherent; the big question about so-called fine-tuning isn't just confined to a few constants, but concerns a highly improbable physical regime (calculated unconditionally) governed by unique organizing principles, of which the so-called fine-tuning constants are just one aspect. Richard should be asking whether these principles are a chance accident, and if so, that takes us into the question of whether there is an infinite sea of randomness out there of which our highly organized universe is but a very, very tiny corner of chance occurrence. But as we saw in the last part, there is no evidence for our observable cosmos being an unimaginably tiny part of a random superverse, what you might call the "Big-R" hypothesis.

Anyway, here's Richard's conclusion to his question and it is clear that his theological assumptions as to how "gods" are supposed to work drive this conclusion (my emphases)....

***

RICHARD: So when we bring all the pertinent evidence back in, the evidence indicates support not for Theory B (intelligent design), but for Theory A (chance accident). Fine Tuning is therefore evidence against intelligent design. It could never be evidence for it, because gods don’t need fundamental constants at all, much less all the weird ones we have. No intelligent agent needs quarks or electrons or electromagnetism or even gravity — things can just behave as commanded or designed: where things need to fall, they just fall; where stars need to shine, they just shine; where things need to stick together, they just stick together. One might respond that, still, it is possible an intelligent engineer would choose all these weird and unnecessary ways to create and sustain life. But that is fully accounted for here. What matters is not whether it’s possible. What matters is how probable it is.

Because: If (a) we exist and (b) God did not design the universe, then (c) we should expect to observe several things, and lo and behold, those are exactly the things we observe; yet we do not expect to observe those things if God did design the universe. By definition that which is expected on x is probable on x; that which is unexpected on x is improbable on x. So if the evidence is probable if God does not exist and improbable if God exists, then that evidence argues against God, not for God.

Hence what matters is not what’s possible. What matters is its relative probability. In the case of Theory A, the probability of all these observations (the vast age, the vast size, the vast quantity of lifeless content, the vast lethality of the universe; and the bizarrely long, meandering, particular way life arose and developed into observers asking these questions) is essentially 100%. And you can’t get “more” than 100%. It’s as likely as likely can ever be. These observations are therefore maximally probable on Theory A. By contrast, none of these observations are at all expected on any plausible theory of intelligent design. Indeed, they are on Theory B predicted not to be observed.


MY COMMENT: In the above I have no issue with the core idea of conditional probabilities; namely, that the probability of an outcome can be considerably enhanced if the conditions x or evidence x imply that it is a favored outcome. The issue lies with Richard's rather subjective assessment of what constitutes favourable evidence and/or conditions for his atheism. Take a look at the following......

In my short monograph on Forster's and Marston's (F&M) application of Bayes theorem to the question of God's existence I interpreted their use of probabilities in frequentist terms (itself a debatable maneuver) using this Venn diagram: 

Here the overwhelming number of cases favouring a habitable cosmos, represented by "H", are found among the cases where there is an intelligent creator, represented by the area "G". If one accepts this diagram (debatable!) it is then a trivial Bayesian calculation to show that the conditions/evidence "H" imply that the probability of God is almost unity.
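For those who want to see the arithmetic, here is that calculation as a minimal Python sketch; the numbers are purely illustrative stand-ins for the areas in the diagram, not F&M's actual figures:

# Bayes' theorem with illustrative stand-in numbers (not F&M's figures).
# Suppose nearly all habitable cases H fall inside the creator region G.
P_G = 0.5               # agnostic 50/50 prior on a creator
P_H_given_G = 0.2       # habitability is common given a creator
P_H_given_notG = 1e-6   # habitability is a rare fluke without one

P_H = P_H_given_G * P_G + P_H_given_notG * (1 - P_G)
P_G_given_H = P_H_given_G * P_G / P_H
print(f"P(G|H) = {P_G_given_H:.6f}")   # ~0.999995: almost unity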

Now let's do the same for Richard's take on the situation. Interpreted in frequentist terms, he's saying this:

Richard's view is that our cosmos, with its huge volume of space-time sterile to life, can barely be claimed to be habitable, and moreover to him the cosmos seems all very random; hence for Richard our cosmos lies somewhere in the region above labelled "R". According to Richard, then, our cosmos is hardly the sort of affair that an intelligent and wise designer would create, and therefore in his assessment region R has a very small overlap with region G. So, given this assumption of his, it trivially follows that the conditions/evidence labeled by "R" imply atheism with a probability of all but 100%.

Clearly then, F&M and Richard draw opposite conclusions from the evidence of the cosmos. And it's not as if F&M, although Christians, are out-and-out antievolutionists after the manner of the right-wing North American Intelligent Design (NAID) community. But let me say this: as we saw in Part II, Richard seems to have underestimated the miracle of organization that is our observable cosmos and overemphasized the role of randomness. I see the habitability of the cosmos, evolution or no evolution, as a very big deal indeed and not just a fluke of unadulterated randomness; I see Richard's vision of a bizarro "Big-R" superverse as not only hopelessly meaningless but also lacking evidence. So, although I have to admit to feeling rather insecure about applying Bayes to God, whether as atheist or theist, I'm probably more on the side of F&M than on that of Mr. Richard Carrier.

As we will see below, Richard's argument is based on his a priori theological conceptions and on what he thinks (wrongly, as it turns out) about the way engineers who create stuff work (my emphases):


***


RICHARD: Intelligent engineers aiming to create life don’t make the laboratory for it vastly larger and older and more deadly than is required for the project. Indeed, unless those engineers intend to convince that life that they don’t exist, they don’t set up its habitat to look exactly like a habitat no one set up. This is the least likely way they would make a universe. But set that point aside. The conclusion already sufficiently follows from the first point: there is no reason to expect God to have made the universe this way. It cannot be predicted that this is what a God would produce, or that it is what he would want to produce. Whereas it is exactly 100% predicted to be what we’d see if there was no God. So no matter what you try to propose, you can never get that probability to be 100% if there was a God. You can propose all sorts of excuses, all sorts of “maybes,” but you will never be able to prove those proposals to be 100% certain to be true. There will always be some significant probability that those “excuses” simply aren’t true, that God simply doesn’t have your imagined motives or limitations. And indeed, when there is no evidence for or against any one such motive or limitation, its probability simply is 50%. It’s as likely as not.


MY COMMENT: Well, OK, I can accept that Richard should draw parallels between divine creation and what he thinks human engineers do in the act of creation. After all, human engineering is something we have experience of; where else do we get our evidence from? We can only use our experience, and any intuitions based on that experience, to probe the question of a divine intelligent designer and creator.

But when Richard says above that the kind of universe we see "is exactly 100% predicted to be what we’d see if there was no God", is that actually true?

As I've already said in Part II, the cosmos, whether current theories of evolution are correct or not, is a remarkable piece of work that is far, far from the "Big-R" affair that Richard gives every impression he thinks it is; it is in fact a highly organized system of surprising contingencies, organized contingencies of very low statistical weight and therefore of very low unconditional probability. So, unless we are rather taken with Richard's bizarro Big-R superverse concept, the observable cosmos is a most singular and arresting piece of construction. But just how was it created if it is not part of a Big-R superverse? Let's see....

***

Some years ago I was reading a book by a rather foolish fundamentalist Biblical literalist and I read these lines: 

.. the Bible teaches that the stars were created in an instant of time at the verbal command of God (Psalm 33:9). It is an awesome thought that God needed only to speak a word and billions upon billions of stars instantly appeared." (p15)

"... God supernaturally and instantaneously created the stars on the fourth day of creation" (p24)

"When we read of God's supernatural and instantaneous method of creation we must stand in awe of Him." (p34)

"When we consider God speaking the vast Universe of stars into existence, we can do nothing but stand in awe of Him"

This Biblical literalist is quite sure he knows the vital property distinguishing "natural" processes from "supernatural" action - it is of course that creators create their creations instantaneously by means of the pronouncement of suitable magic words. In commenting on Proverbs 8:27-30, where we read about God invoking wisdom as the craftsman of creation, he concludes: "God did not use evolution because a craftsman carries out instantaneous and deliberate actions whereas evolution involves a long random process" (p31). However, there are two glaring errors here: 1) craftsmen don't create instantaneously; 2) to call evolution "random" is a gross misrepresentation. This literalist is captive to a false dichotomy: Viz: he contrasts what he believes to be the very random processes of evolution with what he feels are the instantaneous and deliberate acts of the craftsman. The irony is that Richard's views, in terms of the concepts he employs, aren't a lot different: as we've seen, he is impressed by the notion of a Big-R universe. Moreover, take a look at the following, which I've already quoted from Richard in a previous section of this post....

No intelligent agent needs quarks or electrons or electromagnetism or even gravity — things can just behave as commanded or designed: where things need to fall, they just fall; where stars need to shine, they just shine; where things need to stick together, they just stick together.

That is, in the mindset of both our Biblical literalist and Richard Carrier, divine creation should entail no underlying logic, no process and no history; things just happen, just-like-that, abracadabra-style. Basically, the caricature of divine creation conceived by both Richard and our Biblical literalist is that the act of creation is brute magic. Richard and our literalist just can't conceive that God might use the resources of time and space (humanly speaking, huge amounts of them) as a demonstration of the process and computational cost needed to create life. For them God is a magician who merely commands stuff into existence, and Richard's theological notions, in terms of the concepts employed, don't look to be a great advance on those of the Biblical literalist.

We cannot help but notice that our Biblical literalist is as laughably wrong as anyone can be about the actions of a craftsman; those actions are certainly not instantaneous; if they were, we might justifiably accuse the craftsmen of being magicians in league with the Devil. In fact, in some ways the work of the craftsman resembles the inconceivably more sophisticated work in the womb; that is, a stage-by-stage process moving incrementally closer to an end product as time progresses. These stages proceed against a background of inherent dependencies; e.g. a craftsman can't make a silver candlestick until some silver has been smelted, and an embryo can't develop without a union of the appropriate genetic components, not to mention the underlying organic chemistry fundamental to all living things. Of course, it is easy to claim that an omniscient omnipotence could create in one grand-slam instantaneous act a fully mature human, but the sequential dependencies I talk of here are conceptually fundamental. A silver candlestick depends on the existence of silver, but silver is not obliged to exist in the form of a silver candlestick. Likewise, humans depend on a prerequisite organic chemistry which itself depends on more fundamental conditions such as the construction of atoms. There is a forced logical sequence here that we cannot escape from whether we believe in instantaneous creation or not. If God instantaneously created a mature object, that would not detract from the fact that the object itself may have inherent sequences of logical dependencies.

Some concept of sequence, then, is built into things no matter how they are arrived at. But the sequencing we see in embryo growth and artifact construction is much stronger than this "dependency" sequencing. Both processes pass through a series of stages separated by increments. Each stage is usually a little closer to the final product, although this is not necessarily true in the case of the craftsman's art, where a search for solutions sometimes means backtracking may occur. But the fundamental aspect of both is the incremental separation between stages. The end product is the result of an accumulation of these incremental changes. The common theme is that of a quasi-continuity of change; you pass from one state to another through a series of intermediate states, thereby forming an incremental sequence of change. I would not, however, want to use the generic term "gradualism" here because some processes, like, say, an explosion, are both incremental and yet very rapid. The key notion is one of at least an approximate continuity of change, in as much as successive stages are only separated by relatively small displacements.

But we must take our faulting of both Richard and our Biblical literalist yet another stage further. As we know, the process of designing is also a "search" process, an experimental trial-and-error endeavor that in some cases has definite goals in mind and in other cases involves chance discoveries that are perceived to have utility and only then are selected to become part of the technological tool kit. There are also the complex cognitive thought processes occurring in the mind of the designer, which, although not visible, are all part of the experimentalism as ideas are mulled over in the mind and either rejected or selected for reification in material technology.

All these factors combine to give us an exponentially branching network which constitutes a potentially huge search space, making the space-time of the observable cosmos look like a very tiny place indeed. But the search space is considerably reduced if the creator is primed with an informational head start; that is, if the creator has useful a priori knowledge. The equation which expresses the information content of the configurations created as a function of the starting information and the minimum possible number of computational search steps looks something like this:

I = S + Log T

Equation 1

Where I is the information content of a configuration arrived at, S is the minimum length of the algorithm/knowledge needed to generate the configuration, and T is the minimum number of linear execution steps. See here where I give more details on this relation (see also here). This relation tells us that a creative agent/process can take a lot less time if that agent has a large amount of primal starter information S. But assuming a parallel processing paradigm, when S is lacking, the information in I is generated only very slowly, growing with the Log of the number of execution steps T.
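Rearranging Equation 1 makes the trade-off explicit: T = 2^(I - S), so every extra bit of a priori knowledge halves the minimum number of steps. A toy Python illustration with figures of my own (base-2 log assumed):

# From I = S + log2(T) it follows that T = 2^(I - S).
I = 500                         # bits in the target configuration
for S in (100, 400, 499):       # increasing amounts of a priori knowledge
    T = 2 ** (I - S)
    print(f"S = {S} bits -> T = 2^{I - S} = {T:.2e} steps")

# An agent with a near-complete quota of S (here 499 bits) needs just two
# steps; an agent starting with little knowledge needs a number of steps
# beyond all astronomical proportion.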

It is a strong theological intuition that a proper concept of God entails an omniscient being and therefore One who has a full quota of S and hence has little need of the generation steps T. I guess that it is this intuition which influences our fundamentalist and Richard, both of whom are quite sure that when it comes to creation, brute omni-power means that God can just do stuff all but instantaneously and doesn't need any process with a history behind it; that is, T ~ 0. But of course that's not true of human designers, for whom the cognitive process of design and creation entails thought, sequence, experiment, and the trial-and-error search for good information, all of which is, above all, a process with a history. In that sense both Richard and our literalist fundie have got it so wrong about designers; designers search, test, reject & select, backtrack, correct, and develop; they don't just do stuff instantaneously but rather leave behind a history of research & development; history, and plenty of it, is implicit in all human artifacts.

***

As we know, our own universe displays a history; it is an object which has developed and didn't spring into existence "just like that". It is this history which Biblical literalists are committed to denying, with great scientific difficulty. Of course Richard, like myself, believes in cosmic history, but he's trying to push past us the theological notion that theists should all be like Biblical literalists and postulate T ~ 0, where God's stuff just falls into place at command; he is asking me to accept that creation should have little or no algorithmic logic and history behind it, and he is also asking me to accept that a cosmos with logic and a long history is evidence that God doesn't exist. Moreover, as we've seen, he gives the impression that he's positing a "Big-R" superverse. But that, as we've also seen, has little or no evidence going for it and can justifiably be called a bizarro universe, as it can be used to explain anything. And while on the subject of bizarro explanations: to me, the concept of the abracadabra God, a concept shared by Biblical literalists, also qualifies as a bizarro God because just about anything goes; see for example my "Beyond Our Ken" series. In fact, it may be that much of Richard's theological conceptualizing stems from his experience with the North American Intelligent Design community and fundamentalist organisations like Answers in Genesis.

For myself, the Big-R superverse is as unlikely as the abracadabra God. Neither notion has evidence in its favour; Big-R predicts instabilities in the organisation of the cosmos, instabilities we don't observe, and abracadabra predicts a universe without a logical history, a universe Beyond Our Ken.

It is clear our universe has a history of development, and this is particularly evident with life and geology, although may I say that I'm not committed to any particular engine/mechanism of evolution. However, I would tentatively submit the idea that the size of the creation is a divine revelation to humankind of the computational costs of a universe such as ours, a universe which is so obviously specialized for developing and supporting life. Another speculative notion which I would like to submit is that our universe may well use expanding parallelism and teleological constraints in order to generate life; this would get rid of the "slow" Log T term in Equation 1 above, an equation which pertains to vanilla parallel processing. However, all that is very speculative, and the last thing I want to do is to be like Richard and our foolish literalist fundie, who have made their minds up and think everyone else should follow suit, or else be called nasty names by them and their followers.

Lastly let me comment on this quote from Richard: 

Intelligent engineers aiming to create life don’t make the laboratory for it vastly larger and older and more deadly than is required for the project...etc etc,

That may well be true. But the minimum space-time dimensions of a cognitive "laboratory" depend entirely on:

a) the initial knowledge of the engineers, that is, the value of S, and

b) the configurational complexity of the task in hand, which dictates the minimum value of T given S.

So, if the level of providence the Good Lord has provisioned our universe with is measured in terms of its initial algorithmic complexity (S) and the time and space set aside for cosmic development (T), then the incredible sophistication and complexity of life very likely dictates the large space-time dimensions of our cosmos. From where I stand the cosmos looks very much like an ingenious piece of computational engineering built around Equation 1 above. It certainly isn't a tiny piece of an immense Big-R cosmos, or something created last Thursday with a built-in bogus maturity (the omphalos hypothesis).

In part IV I will continue to examine Mr. Richard Carrier's theological assumptions. 

Sunday, December 17, 2023

Does God Exist?: Hendricks vs Myers

 




I was interested to do a first parse of the above debate on God's existence between theist Perry Hendricks and evangelical atheist PZ Myers. If time permits, I might do a more detailed commentary on this video, but here are some initial comments.

Much of Perry Hendricks' argument was based on Bayesian-type reasoning which treats the existence of cosmic design, organization, biological structures and human moral instincts as evidence for God. These arguments have a generic form which employs Bayes' theorem to derive a high probability of God's existence. I considered an example of this class of argument here: Bayes and God. He also used the cosmological argument: Viz: because the natural world is shot through with contingency and cannot be the seat of Aseity or the realm of explanatory completeness, Aseity must exist beyond the material world and must be the ultimate cause of the hard core of cosmic contingency. Hendricks is a bright guy and is a credit to the faith.

PZ Myers dismissed all that without further ado as just philosophy and therefore not worth further consideration. PZ made it quite clear he is looking for a God he can test like he can test a mechanical system such as a chemical reaction: i.e. press button A and you get output B. He's looking for a God of quick tricks, and the example he gave is this: can God tell me what I've got in my pocket? If God can't rise to that simple test, then it is unlikely there is a God, although to be fair PZ admitted that no one can answer the question "Is there a God?" either way with absolute certainty. I'd agree there is no human certainty, and I have some sympathy with atheists who feel that a world like ours can't be the result of a personal, loving and infinitely wise Creator; just think of Ken Ham, Alex Jones, Marjorie Taylor Greene, Donald Trump & QAnon promoter Trey Smith and you've got some evidence for atheism. But as for providing some tricks for PZ, you never know: after all, God is a God of grace! What PZ didn't seem to twig is that underneath it all his reasoning was Bayesian! How ironic! The further irony is that those Christians who say they know God exists because they have God in their hearts are also using Bayes without knowing it!

Sunday, August 13, 2023

North American Intelligent Design's response to my 27 June & 2 July posts. Part 2


The default thinking of the North American ID community leaves us feeling that it's a choice between Intelligent Design and Evolution. But the fact that science's accounts are effectively succinct descriptions which exploit cosmic organization means that the question is not a choice of opposites, as the NAID community and some atheists make out. In one sense there is no such thing as "natural forces".

 

In this post I'm going to comment on the following post on "Evolution News" by Eric Hedin:

Physics, Information Loss, and Intelligent Design

This is a continuation of my last post, which followed up the posts here & here. It addresses in particular Eric Hedin's claim to what he believes to be a generalized version of the second law of thermodynamics based on quantum decoherence.


ERIC HEDIN In an earlier article, I showed that information ratchets do not exist in nature. The most that any mechanistic system can do is to reproduce the information already available within the system. Printing presses reproduce the typeset information placed in the mechanism by human operators. ChatGPT simply accesses and rearranges information originated by humans and uploaded on the Internet. No new information is produced in either case.

In a recent article, I introduced the physical concept of the generalized second law of thermodynamics, as a governing principle consistent with the Law of Conservation of Information, which William Dembski formulated with the claim that natural causes cannot increase complex specified information in a closed system over time. Here, I’ll seek to provide an explanation of the physics behind the generalized second law — a rationale for why natural processes destroy information.

MY COMMENT: As we saw in my last post, Hedin did not succeed in showing that information ratchets do not exist in the created order; part of his problem was that he didn't tell us what he meant by "information". However, in the above it looks as though he's thinking of the so-called principle of the "conservation of information". Accordingly, he's of the opinion that unless the mysteries of intelligent agency are invoked, God's creation can't create information (although Hedin believes information can be destroyed). Heuristically speaking this rule of thumb often works, but it is not always true; that is, as a catch-all fundamental principle the conservation of information is false, as we shall see.

The mechanistic systems we are familiar with do in fact create information in the compelling common-sense meaning of the term. A variety of natural systems create complex chaotic patterns whose elements are new to the cosmos and which, by any common-sense meaning of the term, constitute new information for human beings. Systems that generate random sequences are, by the Shannon definition of the term, creating information all the time (see part 1).
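To make the Shannon point concrete, here is a minimal Python sketch of my own (it is not from Hedin's article, and the frequency-count estimator is only a first stab since it is blind to internal structure): an unpredictable source delivers bits of information per symbol, whereas a source that merely repeats itself delivers none.

```python
import random
from collections import Counter
from math import log2

def entropy_per_symbol(seq):
    """Crude Shannon entropy estimate (bits/symbol) from symbol frequencies."""
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in Counter(seq).values())

random.seed(1)
random_bits = [random.randint(0, 1) for _ in range(10_000)]  # a random-sequence source
constant_bits = [0] * 10_000                                 # a source that only repeats itself

print(entropy_per_symbol(random_bits))    # ~1.0 bit/symbol: information created all the time
print(entropy_per_symbol(constant_bits))  # 0.0 bits/symbol: nothing new ever arrives
```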

It is misleading to claim that computer systems merely rearrange information: they can in a very compelling sense do far more than that. Starting with the relatively simple pattern of an algorithm, and given enough time and space, very complex patterns can be generated. As far as the cosmos is concerned these patterns may well be entirely new forms: that is, new information. Trivially it could be claimed that for deterministic algorithms the information eventually generated is implicit in the starting conditions: but because any given pattern has at least one simple algorithm which will generate it given enough time and space, on that basis it would trivially follow that no finite pattern (including finite stretches of disordered patterns) would classify as new information! What we have, then, is a situation where, whilst it is true that something isn't coming from nothing, nevertheless a lot of something is coming out of relatively little. Ergo, mechanical and so-called "natural forces" create information. 
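As an illustration of a lot of something coming out of relatively little, here is a tiny sketch (my own choice of example, using Wolfram's Rule 30 cellular automaton): a few lines of deterministic code, plus time and space, unfold into a famously complex pattern from the blandest of starting conditions.

```python
def rule30(width=64, steps=32):
    """Evolve Wolfram's Rule 30 from a single live cell and print the pattern."""
    row = [0] * width
    row[width // 2] = 1  # bland, simple initial condition: one live cell
    for _ in range(steps):
        print("".join("#" if cell else "." for cell in row))
        # each new cell depends only on its three neighbours (wrap-around edges)
        row = [row[(i - 1) % width] ^ (row[i] | row[(i + 1) % width])
               for i in range(width)]

rule30()
```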

My conclusion is that it is a misrepresentation to claim that nature and computers are simply "printing" information they already have, at best tweaking its arrangement a little. But if we allow nature and computers to tweak information, then we can ask ourselves what would be the result of billions of tweaks? The result would in fact be completely new configurations reified from the platonic realm: that is, new information, especially so if we are starting from bland and simple initial conditions and relatively small algorithms.

In deterministic generating systems, natural and computational, the information generated can be regarded as implicit from the beginning in the sense that it exists in unreified form in the platonic realm. Hence this implicit information needs to be reified before it can be claimed to be part of the real world, and in that sense the information is being created. Natural systems and computer systems are a means of doing this. But nothing comes from nothing, and these information-creating sources have their origins and continued sustenance in God and/or human intelligence. 

As I said in this post and proved here, in parallel computation information is created according to:

Ic = Smin + log (Tmin)

Where Ic is the configurational information content and Smin is the minimum length of the algorithm needed to generate the configuration in a minimum number of execution steps Tmin. As I have said before, it is the slow generation of information with time (the "log" term above) which has given rise to the opinion that information cannot be generated by natural and mechanical means. But the above relation only applies to parallel processing; if the generating system employs a system of expanding parallelism (e.g. a quantum computer), such exponential systems make short work of the log term, which then becomes linear in time. 
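A toy calculation of my own may help here (the value of Smin and the timings are purely illustrative): under fixed parallelism the executed step count Tmin grows linearly with wall-clock time t, so the log term crawls; but if the processor count doubles every tick then Tmin grows like 2^t, and the log term becomes linear in t.

```python
from math import log2

S_min = 100  # assumed minimum algorithm length in bits (purely illustrative)

def info_fixed_parallelism(t):
    # with a fixed number of processors the executed step count T_min
    # grows linearly with wall-clock time t, so Ic grows like log(t)
    return S_min + log2(t)

def info_expanding_parallelism(t):
    # if the processor count doubles every tick then T_min ~ 2^t,
    # so log2(T_min) ~ t and the log term becomes linear in t
    return S_min + t

for t in (10, 100, 1000):
    print(t, round(info_fixed_parallelism(t), 1), info_expanding_parallelism(t))
```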

I will be examining Hedin's so-called generalised second law of thermodynamics below. This proves to be nothing but hand-waving. 


ERIC HEDIN: What about the less physical concept of information? How can we physically explain the relentless loss of information by natural processes? Information seems to be a nonphysical concept, but in our universe, it is stored in specific arrangements of physical states of matter. An intelligent mind can recognize specific arrangements of matter (such as molecules of ink that form letters on a page) that convey a meaningful message. In a different context, biochemists can recognize particular sequences of nucleotide bases in a genome that code for a functional protein.

MY COMMENT: Firstly, as we saw in part 1, we need not talk vaguely about arrangements of matter which an intelligent mind can recognize as meaningful; Hedin is hand-waving here. The configurations of matter we are thinking of have a very practical meaning, viz: they are ordered systems capable of self-maintenance and self-multiplication; in that sense the functionality we are thinking of is a clear concept. However, one thing we can agree on is this: such configurations are at once both complex and highly organised, which means that as a class these self-perpetuating organic structures have a very low statistical weight and therefore a very low unconditional probability. If our cosmos is only a parallel processor, such configurations would never come about in the lifetime of the cosmos if naked chance ruled. Raising the probability of such configurations to a practical level would require the contingency of the right kind of physical regime to boost the conditional probability of life. Where parallel processing is the norm the conservation of information does approximately hold. 
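To give a feel for what "very low statistical weight" means in practice, here's a back-of-envelope Python sketch (all the numbers are my own illustrative assumptions, not Hedin's): even at an absurdly generous sampling rate sustained for the age of the universe, a target class occupying one part in 2^500 of configuration space is never hit by naked chance.

```python
from math import log10

bits = 500                  # assumed size of the configurational "target" (illustrative)
trials_per_second = 1e50    # a wildly generous sampling rate for a parallel-processing cosmos
seconds_available = 1e18    # roughly the age of the universe in seconds

log10_p = -bits * log10(2)  # log10 of the unconditional probability
log10_expected_hits = log10(trials_per_second) + log10(seconds_available) + log10_p

print(f"log10(P) = {log10_p:.1f}")                          # about -150.5
print(f"log10(expected hits) = {log10_expected_hits:.1f}")  # about -82.5: it never happens
```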

My affirmation that the conservation of information is a good approximation in a parallel processing context doesn't necessarily come to the rescue of the NAIDs. The irony here is that the possibility of an omniscient, omnipotent Creator/Designer leaves them with an unknown. For we couldn't put it past an all-powerful, all-knowing Deity to have provisioned the cosmos with enough information to generate life with a high probability. As NAID William Dembski himself has shown, the conservation of information doesn't of itself rule out standard evolutionary gradualism. We would be very presumptuous to be absolutely certain that the cosmos hasn't been divinely provisioned with the prerequisite information to generate life. But because the NAID community affect to keep up the gloss of being a scientific society rather than a theological one, their crypto-theism makes them uneasy about admitting the implications of Christian theism, which posits at the outset, a priori, the existence of an all-powerful, all-knowing Deity: the implication is that some kind of evolution may well have been reified from the platonic world into our cosmos.


*** 

 In my first post of this two-part series, I wrote:

Consider this writer [Eric Hedin] on the NAID website "Evolution News"; he dreams of a principle (in fact he thinks it's been found!) which he refers to as the generalized version of the second law......

In Hedin's first post he doesn't explain this generalized version of the second law but in the post we are considering here he does attempt to explain it as follows:

ERIC HEDIN: According to the traditional second law, under the influence of natural processes, the surrounding environment brings about a transfer of heat from hot to cold, or a mixing of constituents, such as the mixing of molecules of perfume throughout the air of the room. Natural processes will also cause a mixing of information-bearing physical objects with the environment. In quantum computing research, this loss of information to the surrounding environment results in what is known as decoherence, meaning that “information has leaked into the environment in an uncontrollable fashion.”

Linking Information and Observer: All the information that can be known by an observer about a system of any kind is contained within the quantum mechanical wavefunction of the system. My apologies for bringing up quantum mechanics, but its relevance here is that it serves as the link between the information of a system (anything from a single atom to a complex biomolecule to a macroscopic object) and an observer. Unless the wave function of the system is completely isolated from any environmental influence, it will suffer decoherence (loss of information) with the passage of time. In one sense, the wavefunction spreads out into the environment, meaning that the observer will have greater and greater uncertainty as to the state of the system as time goes on. The physical interaction of atoms or photons uncompromisingly causes this effect, with its resulting loss of information.

MY COMMENT: Undisturbed quantum systems settle to an eigenstate. An eigenstate is a kind of equilibrium or quasi-static wavefunction with a precise energy; it's analogous to a volume containing a gas which eventually settles to a uniform & precise density and pressure. But when a quantum system in an eigenstate is disturbed (or "perturbed", as it is usually expressed) by interaction with a thermodynamic environment, the "coherent" eigenstate becomes mixed with other possible eigenstates: the wavefunction then loses its stasis, becomes a mix of wavefunctions, and is no longer stable; to use the standard terminology, the original static wavefunction "decoheres". This decoherence is a particular problem in quantum computing, which depends on the stasis of "qubit" eigenstates. 
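For readers who want a concrete caricature of decoherence, here's a minimal numerical sketch of my own (a pure-dephasing model with an assumed decay rate; it is not from Hedin's article): environmental coupling damps the off-diagonal coherences of a qubit's density matrix while leaving the diagonal populations alone, so a coherent superposition relaxes toward a classical-looking mixture.

```python
import numpy as np

# start in the equal superposition |+> = (|0> + |1>)/sqrt(2): fully coherent
rho = np.array([[0.5, 0.5],
                [0.5, 0.5]], dtype=complex)

gamma, dt = 0.5, 1.0  # assumed dephasing rate and time step (illustrative)
for step in range(1, 6):
    # environmental coupling damps only the coherences (off-diagonal terms);
    # the populations on the diagonal are untouched
    rho[0, 1] *= np.exp(-gamma * dt)
    rho[1, 0] *= np.exp(-gamma * dt)
    print(f"t={step}: |rho01| = {abs(rho[0, 1]):.3f}")  # decays toward a classical mixture
```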

There is an analogy here with a classical system of two ideal non-thermodynamic bodies orbiting one another in a perfect vacuum under the influence of classical gravity: for an ideal classical system with no thermodynamic randomness, such a system will continue in the same state forever. But in the real world any ideal classical system is likely to be coupled to the "imperfect" world of thermodynamics, and this linkage would mean the "ideal" system would then undergo perturbations and would random-walk into other states, if only slowly; from a human point of view the error bars of the unknown then start to widen. These error bars of human knowledge will continue to expand in the absence of information updates that pull us back into the know and keep the Gaussian error envelopes in the vicinity of the actual state of the system. 
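The widening error bars are easy to simulate. In this sketch of my own (the perturbation variance, update interval and measurement noise are all assumed for illustration) the observer's variance grows linearly under the random walk until a measurement, an information update, resets it:

```python
import random

random.seed(0)
true_state, variance = 0.0, 0.0
q = 1.0            # assumed per-step perturbation variance
update_every = 10  # assumed interval between measurements
meas_var = 0.25    # assumed measurement noise variance

for t in range(1, 31):
    true_state += random.gauss(0, q ** 0.5)  # the system random-walks
    variance += q                            # the observer's error bars widen linearly
    if t % update_every == 0:
        variance = meas_var                  # an information update pulls us back into the know
    print(t, round(variance, 2))
```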

The thing to note here is that we have assumed thermodynamics from the outset: likewise the behaviour of quantum systems when in contact with a thermodynamic reality tells us no more than we already know; namely, that the world is thermodynamic and that if you want to preserve the integrity of an array of binary eigenstates (= qubits), and stop the array becoming a mix of other eigenstates by keeping it isolated, you've got your work cut out: like all real systems, a qubit system loses its ideal state when in contact with the thermodynamic reality around it. 

But decoherence doesn't stop crystals forming, or stop the reproductive systems of life working, or stop the self-perpetuating systems of life from doing their job. The reason is that these systems have the physical analogue of information updates which keep them within the requisite orbits of self-sustenance: the physical analogue of those information updates are the potential wells or interactional hooks/forces which have a strong tendency to keep ordered systems in place. 

In the above quote Hedin has talked about how a quantum system's interaction with a thermodynamic environment will entail a slow degradation of an observer's knowledge of that system. This is a misrepresentation. In fact, if the observer knows the form of the interactional perturbation, he can then infer how the new mixed wavefunction develops. So, as a consequence of a known perturbation, the wavefunction passes from one known form, a single eigenstate, to another known form, a mix of a large number of eigenstates. 

All wave states, whether mixed or eigenstates, obey the uncertainty relation:

Δx · Δp ≥ ħ/2

The square of a momentum value returns an energy value; if this energy value has the precise value pertaining to a single eigenstate then it implies that the magnitude of the corresponding momentum also has a precise value; that is, the uncertainty in momentum magnitude is minimized. Therefore, from the uncertainty relation above it follows that under conditions of precise energy the uncertainty in position is maximized. So, when an isolated quantum system interacts with a thermodynamic system its momentum becomes more uncertain but its position becomes more localized: in short, increasing certainty in position is traded for greater momentum/energy uncertainty. The lesson is that, as a result of an interaction with a thermodynamic system, loss of information about the exact energy of the quantum system is compensated by a gain in information about position. In Hedin's quote above we see that, to suit his polemic, he has only talked about "loss of information" in energy/momentum and has not balanced it with the corresponding and complementary gain in information about position. 
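This trade-off is easy to verify numerically. In the following sketch of my own (grid sizes and packet widths are arbitrary choices; ħ is set to 1), squeezing a Gaussian wave packet in position broadens its Fourier transform in momentum, with the product of the two spreads pinned near the Heisenberg bound:

```python
import numpy as np

x = np.linspace(-50, 50, 4096)
dx = x[1] - x[0]
p = np.fft.fftshift(np.fft.fftfreq(x.size, d=dx)) * 2 * np.pi  # momentum grid (hbar = 1)
dp = p[1] - p[0]

for sigma_x in (0.5, 1.0, 4.0):
    psi = np.exp(-x**2 / (4 * sigma_x**2))        # Gaussian packet with position spread sigma_x
    psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)   # normalise

    phi = np.fft.fftshift(np.fft.fft(psi))        # momentum-space amplitude (phases irrelevant here)
    prob_p = np.abs(phi)**2
    prob_p /= np.sum(prob_p) * dp                 # normalise the momentum distribution
    sigma_p = np.sqrt(np.sum(p**2 * prob_p) * dp) # momentum spread (mean is zero by symmetry)

    print(f"sigma_x = {sigma_x}: sigma_x * sigma_p = {sigma_x * sigma_p:.3f} (bound: 0.5)")
```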

Pure quantum systems are reversible and don't provide Hedin with a fundamental generalized version of the second law: that is, they cannot provide us with that well-known thermodynamic arrow of time where total entropy always increases. It is only the givenness of thermodynamic laws which provides the arrow of time. Moreover, as I've already remarked, in his reference to the loss of information from quantum systems Hedin has neglected to tell us that this is accompanied by a corresponding increase in information. Hedin is simply telling us something we already know about; namely, the effect of thermodynamic systems on quantum systems, systems that are constrained by the complementary uncertainty relation which trades information about one variable for information about another. 

So, there is really nothing at all startling in what Hedin has to tell us. It changes nothing and certainly doesn't provide us with a more fundamental second law than the one we already have. A clue as to the real polemical motive behind Hedin's argument is here: "My apologies for bringing up quantum mechanics". In bringing up quantum mechanics Hedin has muddied the waters. In that sense it's reminiscent of fundamentalist Jason Lisle's ASC model of cosmology, by which Lisle side-steps the age of the universe; like Hedin's reference to quantum mechanics, it will confound the benighted followers in their respective communities, especially those who depend on the community's gurus for instruction; it's technical bafflegab which will wow the uninitiated NAID rank and file.


ERIC HEDIN: Some might argue that “luck” could result in an opposite outcome, with interactions causing an increase in information (in biochemistry, this would correlate with increased functional complexity). Why couldn’t this happen? Simply because there are always more ways to go wrong than to go right, when considering whether interactions will result in chaos or increased complex specified information.

MY COMMENT: Yes, I think we can go along with that; but let's remember again: the cosmos is no ordinary "natural" object. Its unwarranted and seemingly out-of-place contingency is the creation of an a priori transcendent intelligence of unimaginable power and Aseity; who can guess what those so-called "natural forces", created, managed & sustained by such an intelligence, might achieve in terms of their ability to bring forth organisation?


ERIC HEDIN: An increase in information requires not just one right choice (or lucky draw), but a long sequence of correct choices. Luck might happen once, but any gambler knows that if “lucky outcomes” keep happening against the odds, then the game is rigged. A “rigged game” in nature corresponds to a law of physics — in this case, a law causing information to increase over time by natural causes. Such a law cannot really exist, however, since we already have a law of nature that says the opposite. As I mentioned in a recent article, “Theistic Cosmology and Theistic Evolution — Understanding the Difference”: In our study of science, we have found that the laws of nature do not contradict one another. We don’t have laws of nature that only apply piecemeal.

MY COMMENT: By any common-sense standard, and by the Shannon definition of the term, both "natural" processes and human-designed computing machines increase information, as we have seen. I agree with the above paragraph up until and including A “rigged game” in nature corresponds to a law of physics — in this case, a law causing information to increase over time ........   But because of the poverty of his community's conception of nature, Hedin starts going astray as soon as he talks about so-called natural causes. As we have seen he hasn't discovered any "natural law" that contradicts evolution, least of all a generalized second law of thermodynamics based on decoherence. His efforts here amount to some hand-waving around the idea of an observer becoming less and less certain of a system that, as a result of its contact with its thermodynamic surroundings, is subject to a random walk. He then conflates an observer's information with the objective uncertainties of quantum mechanics, when in fact there is a distinction between the state of an observer's knowledge of a system and the objective state of the physical system itself, which according to quantum theory has its own uncertainties; but these uncertainties are not to be confused with observer uncertainties. 

Hedin's argument is no advance on the standard second law of thermodynamics, a law which only bars overall entropy decreases in isolated systems without putting a fundamental bar on local increases in the order of subsystems within the overall system. This doesn't mean to imply that evolution as conventionally understood has actually happened, but it's clear that Hedin and his colleagues continue to fail to find a fundamental law prohibiting evolution. The spongeam may yet exist, and Hedin and his community have failed to see the difficulty of the problem that faces them if they wish to bar evolution on the basis of fundamental physics.

Nature is not natural: its highly contingent laws, patterns and statistics have been, are being, and will continue to be reified out of the platonic realm by the Divine Will. Yes, it is just possible that God is patching organic configurations directly into nature ad-hoc style, but the nested cladistics of life look to me as if some sort of divinely provisioned process is at work that at least looks vaguely like conventional evolution; perhaps via the conditional information of the spongeam, or perhaps something more exotic in the realm of expanding parallelism and teleological laws - the latter would essentially entail that the cosmos is a declarative system of computation. But Hedin and colleagues have in no way proved their case about those divinely provisioned "natural forces" being blind & ineffectual. Hedin waves his hands around and then simply asserts without proof the old NAID canard:

Such a law cannot really exist, however, since we already have a law of nature that says the opposite.

Hedin must try harder. 


ERIC HEDIN: Imagination and Freedom: Only by the action of non-physical intelligence can the natural process of decoherence and information loss be overcome. Information is meaningless apart from a rational mind, meaning the creation of new information requires more than knowledge. Increased information requires imagination and the freedom to creatively design complex outcomes that convey meaning or exhibit function. (See, “Intelligence Is Unnatural, and Why That Matters.”)

The non-physical aspect of our intelligent minds can succeed in producing information because an intelligent mind can imagine a meaningful outcome and act to separate the components of a complex system from their natural mixed state into specific arrangements that actualize that outcome.  This takes work, meaning it requires energy, but not energy alone, since without the guidance of a non-physical mind, energy cannot succeed in increasing information in a closed system.

MY COMMENT: Philosophically speaking I disagree with the typical dualist dichotomy of the physical vs the non-physical. Firstly, God created one world (Colossians 1:15ff), and the only substantive dualism I would propose is God vs. everything else. But in any case, what gives the so-called physical world meaning is that its laws so perfectly organise experience and thinking that a rational solid world presents itself to perception: that is, the physical world is an unintelligible and incoherent idea without the a priori assumption of the presence of a conscious intelligence somewhere. So, yes, information is meaningless without positing a rational, perceiving, conscious mind. This idealistic philosophy sees the physical world as an aspect of mind, and a world created by the Mind of God at that. So, we should not be surprised if that world displays "unnatural" and miraculous properties; science is less a way of "explaining" those properties than it is a way of describing them. It is mathematically inevitable that those properties will be highly particular and contingent. 

Clearly the so-called physical world is manifestly & intrinsically miraculous: Decoherence doesn't stop crystal formation and it doesn't stop organisms self-perpetuating and multiplying. These systems work because nature is carefully crafted by God everywhere and everywhen. Whether God has patched living configurations directly into nature over billions of years or has provisioned it with enough information in its laws and fine-tuning to evolve life, or has used an exotic declarative system with expanding parallelism and teleological constraints is the question that has yet to be unequivocally answered; it certainly has not been properly addressed by the NAID community. In fact, if they persist with their "intelligence vs blind natural forces" dualism they will continue to think there is nothing to address. 

Hedin's notion of natural and mechanical systems as printer-like devices which simply passively pass on information is rubbish. Mechanical systems and natural systems (which include human beings) are constantly creating new forms; that is, new information. In spite of the constancy of reproductive templates, each human individual is a unique creation whereby new information has entered the cosmos. That the mechanisms of the cosmos constantly bring forth new information out of the platonic realm should be no surprise to anyone who believes in a sovereign Christian God. But this is something those who affect to present themselves as a purely scientific community with a degenerate view of "natural forces" are loath to admit. 


ERIC HEDIN: Intelligent design remains the only explanation consistent with the laws of physics for the increasing information content of living systems throughout life’s history on Earth.

MY COMMENT: Notice here that Hedin makes a distinction between Intelligent Design and the laws of physics. Yes, I agree that an a priori Intelligent Creator (complex enough to have Aseity) is the only way of making sense of the mathematically necessary contingent bias of the cosmos. In the light of this perception I'm not going to rush to conclude that the physical world is some kind of ineffectual demiurge creator only capable of creating chaos. In assuming their position the NAIDs have taken on board the mindset of our times, which frames the question of origins as a God vs. impersonal physical forces dichotomy, an exclusive-or between God and physics: in this polarised context atheists are anxious to prove the creative efficacy of those "natural forces", and conversely the NAIDs are committed to proving that those natural forces are "blind" agents.

The NAIDs have also committed themselves to the gnostic notion that human intelligence is a ghost in the machine which cannot be simulated by algorithmic means or be a particular application of the God-provisioned material package. I think the reason for this aversion to the idea that humans are an application of the material package is this: if human intelligence is an aspect of the clever use of matter (matter which embraces those enigmatic quantum properties), and we accept that human intelligence is able to create information, then the way would be clear for us to propose that other applications of matter, whether some kind of evolution or computation, can also create information.

But I know of nothing that the NAIDs have said which means one should commit oneself at all costs to their dualist and quasi-gnostic views. In so publicly opposing evolution on the basis of the gut feeling that it elevates natural forces to god-like demiurge status, they have effectively swallowed atheistic categories and dug themselves into a hole from which they seem incapable of escaping.