My apologies for having to display this theology!
This is Part III of my series where I'm looking at the following post by a Mr. Richard Carrier...
Why the Fine Tuning Argument Proves God Does Not Exist • Richard Carrier Blogs
See here for the other parts: Part I & Part II. In the last part of this series, we left Richard wanting to take cognizance of all the evidence relevant to the question of the origin of the creation's "fine-tuning" constants...
***
RICHARD: The real problem here is that this leaves out pertinent evidence. Because we are here testing two competing hypotheses to explain observations: either (A) chance accident produced that alignment of constants or (B) someone or something intelligently selected them.
MY COMMENT: As we saw in part II this statement of the problem isn't coherent; the big question about so-called fine-tuning isn't just confined to a few constants, but a highly improbable physical regime (calculated unconditionally) governed by unique organizing principles of which the so-called fine-tuning constants are just one aspect. Richard should be asking if these principles are a chance accident, and if so that takes us into the question of whether there is an infinite sea of randomness out there of which our highly organized universe is but a very, very tiny corner of chance occurrence. But as we saw in the last part there is no evidence for our observable cosmos being an unimaginably tiny part in a random superverse, what you might call the "Big R" hypothesis.
Anyway, here's Richard's conclusion to his question and it is clear that his theological assumptions as to how "gods" are supposed to work drive this conclusion (my emphases)....
***
RICHARD: So when we bring all the pertinent evidence back in, the evidence indicates support not for Theory B (intelligent design), but for Theory A (chance accident). Fine Tuning is therefore evidence against intelligent design. It could never be evidence for it, because gods don’t need fundamental constants at all, much less all the weird ones we have. No intelligent agent needs quarks or electrons or electromagnetism or even gravity—things can just behave as commanded or designed: where things need to fall, they just fall; where stars need to shine, they just shine; where things need to stick together, they just stick together. One might respond that, still, it is possible an intelligent engineer would choose all these weird and unnecessary ways to create and sustain life. But that is fully accounted for here. What matters is not whether it’s possible. What matters is how probable it is.
Because: If (a) we exist and (b) God did not design the universe, then (c) we should expect to observe several things, and lo and behold, those are exactly the things we observe; yet we do not expect to observe those things if God did design the universe. By definition that which is expected on x is probable on x; that which is unexpected on x is improbable on x. So if the evidence is probable if God does not exist and improbable if God exists, then that evidence argues against God, not for God.
Hence what matters is not what’s possible. What matters is its relative probability. In the case of Theory A, the probability of all these observations (the vast age, the vast size, the vast quantity of lifeless content, the vast lethality of the universe; and the bizarrely long, meandering, particular way life arose and developed into observers asking these questions) is essentially 100%. And you can’t get “more” than 100%. It’s as likely as likely can ever be. These observations are therefore maximally probable on Theory A. By contrast, none of these observations are at all expected on any plausible theory of intelligent design. Indeed, they are on Theory B predicted not to be observed.
MY COMMENT: In the above I have no issue with the core idea of conditional probabilities; namely that the probability of an outcome can be considerably enhanced if the conditions x or evidence x implies that it is a favored outcome. Where the issue lies is with Richard's rather subjective assessment of what constitutes favourable evidence and/or conditions for his atheism. Take a look at the following......
In my short monograph on Forster's and Marston's (F&M) application of Bayes theorem to the question of God's existence I interpreted their use of probabilities in frequentist terms (itself a debatable maneuver) using this Venn diagram:
Here the overwhelming number of cases favouring a habitable cosmos represented by "H" are found among the cases where there is an intelligent creator represented by the area "G". If one is to accept this diagram (debatable!) it is then a trivial Bayesian calculation to show that given conditions/evidences "H" then it implies that the probability of God is almost unity.
Now let's do the same for Richard's take on the situation. Interpreted in frequentist terms, he's saying this:
As we will see below Richard's argument is based on his a priori theological conceptions and what he thinks (wrongly as it turns out) the way engineers who create stuff should work (my emphases):
***
RICHARD: Intelligent engineers aiming to create life don’t make the laboratory for it vastly larger and older and more deadly than is required for the project. Indeed, unless those engineers intend to convince that life that they don’t exist, they don’t set up its habitat to look exactly like a habitat no one set up. This is the least likely way they would make a universe. But set that point aside. The conclusion already sufficiently follows from the first point: there is no reason to expect God to have made the universe this way. It cannot be predicted that this is what a God would produce, or that it is what he would want to produce.
MY COMMENT: Well, OK I can accept that Richard should draw parallels between divine creation and what he thinks human engineers do in the act of creation. After all, human engineering is something we have experience of; where else do we get our evidence from? We can only use our experience, and any intuitions based on that experience to probe the question of a divine intelligent designer and creator.
But when Richard says above that the kind of universe we see is exactly 100% predicted to be what we'd see if there was no God, is that actually true?
As I've already said in Part II the cosmos, whether current theories of evolution are correct or not, is a remarkable piece of work that is far, far from the "Big-R" that Richard gives every impression he thinks it is; it is in fact a highly organized system of surprising contingencies, organized contingencies of very low statistical weight and therefore of very low unconditional probability. So, unless we are rather taken with Richard's Bizarro Big-R superverse concept, the observable cosmos is a most singular and arresting piece of construction. But just how was it created if it is not part of a Big-R superverse? Let's see....
***
Some years ago I was reading a book by a rather foolish fundamentalist Biblical literalist and I read these lines:
"...the Bible teaches that the stars were created in an instant of time at the verbal command of God (Psalm 33:9). It is an awesome thought that God needed only to speak a word and billions upon billions of stars instantly appeared." (p15)

"...God supernaturally and instantaneously created the stars on the fourth day of creation" (p24)

"When we read of God's supernatural and instantaneous method of creation we must stand in awe of Him." (p34)

"When we consider God speaking the vast Universe of stars into existence, we can do nothing but stand in awe of Him"
This Biblical literalist is quite sure he knows the vital property distinguishing "natural" processes from "supernatural" action: it is, of course, that creators create their creations instantaneously by means of the pronouncement of suitable magic words. Commenting on Proverbs 8:27-30, where we read about God invoking wisdom as the craftsman of creation, he concludes "God did not use evolution because a craftsman carries out instantaneous and deliberate actions whereas evolution involves a long random process" (p31). However, there are two glaring errors here: 1) Craftsmen don't create instantaneously. 2) To call evolution "random" is a gross misrepresentation. This literalist is captive to a false dichotomy: he contrasts what he believes to be the very random processes of evolution with what he feels are the instantaneous and deliberate acts of the craftsman. The irony is that Richard's views, in terms of the concepts he employs, aren't a lot different: as we've seen, he is impressed by the notion of a Big-R universe. Moreover, take a look at the following, which I've already quoted from Richard in a previous section of this post...
No intelligent agent needs quarks or electrons or electromagnetism or even gravity—things can just behave as commanded or designed: where things need to fall, they just fall; where stars need to shine, they just shine; where things need to stick together, they just stick together.
That is, in the mindset of both our Biblical literalist and Richard Carrier, divine creation should entail no underlying logic, no process and no history; things just happen just-like-that, abracadabra style. Basically, the caricature of divine creation conceived by both Richard and our Biblical literalist is that the act of creation is brute magic. Richard and our literalist just can't conceive that God might use the resources of time and space (humanly speaking, huge amounts of them) as a demonstration of the process and computational cost needed to create life. For them God is a magician who merely commands stuff into existence, and Richard's theological notions, in terms of the concepts employed, don't look to be a great advancement on the Biblical literalist's.
We cannot help but notice that our Biblical literalist is as laughably wrong as anyone can be about the actions of a craftsman; those actions are certainly not instantaneous; if they were, we might justifiably accuse craftsmen of being magicians in league with the Devil. In fact, in some ways the work of the craftsman resembles the inconceivably more sophisticated work in the womb; that is, a stage-by-stage process moving incrementally closer to an end product as time progresses. These stages proceed against a background of inherent dependencies: e.g. a craftsman can't make a silver candlestick until some silver has been smelted, and an embryo can't develop without a union of the appropriate genetic components, not to mention the underlying organic chemistry fundamental to all living things. Of course, it is easy to claim that an omniscient omnipotence could create a fully mature human in one grand-slam instantaneous act, but the sequential dependencies I talk of here are conceptually fundamental. A silver candlestick depends on the existence of silver, but silver is not obliged to exist in the form of a silver candlestick. Likewise, humans depend on a prerequisite organic chemistry which itself depends on more fundamental conditions such as the construction of atoms. There is a forced logical sequence here that we cannot escape, whether we believe in instantaneous creation or not. Even if God instantaneously created a mature object, that would not detract from the fact that the object itself may have inherent sequences of logical dependencies.
Some concept of sequence, then, is built into things no matter how they are arrived at. But the sequencing we see in embryo growth and artifact construction is much stronger than this "dependency" sequencing. Both processes pass through a series of stages separated by increments. Each stage is usually a little closer to the final product, although this is not necessarily true in the case of the craftsman's art, where a search for solutions sometimes means backtracking occurs. But the fundamental aspect of both is the incremental separation between stages. The end product is the result of an accumulation of these incremental changes. The common theme is that of a quasi-continuity of change; you pass from one state to another through a series of intermediate states, thereby forming an incremental sequence of change. I would not, however, want to use the generic term "gradualism" here because a process like, say, an explosion is both incremental and yet very rapid. The key notion is one of at least an approximate continuity of change, in as much as successive stages are only separated by relatively small displacements.
But we must take our faulting of both Richard and our Biblical literalist yet another stage further. As we know, the process of designing is also a "search" process, an experimental trial-and-error endeavor that in some cases has definite goals in mind and in other cases involves chance discoveries that are perceived to have utility and only then are selected to become part of the technological tool kit. There is also the complex cognitive thought process occurring in the mind of the designer which, although not visible, is all part of the experimentalism as ideas are mulled over in the mind and either rejected or selected for reification in material technology.
All these factors combine to give us an exponentially branching network which constitutes a potentially huge search space making the space-time of the observable cosmos look like a very tiny place indeed. But the search space is considerably reduced if the creator is primed with an informational head start; that is, if the creator has useful a priori knowledge. The form of the equation which relates the information content of the configurations created as a function of starting information and the minimum possible number of computational search steps looks something like this:
I = S + Log T
Equation 1
Where I is the information content of a configuration arrived at, S is the minimum length of the algorithm/knowledge needed to generate the configuration, and T is the minimum number of linear execution steps. See here where I give more details on this relation. (See also here). This relation tells us that a creative agent/process can take a lot less time if that agent has a large amount of primal starter information S. But assuming a parallel processing paradigm, when S is lacking, the information content I is generated only very slowly, growing with the Log of the number of execution steps T.
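To make the arithmetic of Equation 1 concrete, here is a minimal sketch of how the implied number of search steps T depends on the starter information S for a fixed target information content I. I am assuming a base-2 logarithm; the function name and the bit figures are purely illustrative, not anything from Richard or the literalist:

```python
def required_steps(i_bits, s_bits):
    """Rearranging Equation 1, I = S + log2(T), gives the minimum
    number of execution steps as T = 2**(I - S)."""
    return 2 ** (i_bits - s_bits)

# A hypothetical 100-bit target configuration:
print(required_steps(100, 90))   # well-informed creator: 1024 steps
print(required_steps(100, 10))   # near-blind search: 2**90 steps
```

The asymmetry is the point: every bit of missing starter information doubles the required search time, which is why, on this picture, a creator lacking a priori knowledge needs an enormous amount of process and history.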
It is a strong theological intuition that a proper concept of God entails an omniscient being, and therefore One who has a full quota of S and hence has little need of the generation steps T. I guess it is this intuition which influences our fundamentalist and Richard, both of whom are quite sure that when it comes to creation, brute omni-power means that God can just do stuff all but instantaneously and doesn't need any process with a history behind it; that is, T ~ 0. But of course, that's not true of human designers, for whom the cognitive process of design and creation entails thought, sequence, experiment, and the trial-and-error search for good information, all of which is, above all, a process with a history. In that sense both Richard and our literalist fundie have got it so wrong about designers; designers search, test, reject & select, backtrack, correct, and develop; they don't just do stuff instantaneously but rather leave behind a history of research & development; history, and plenty of it, is implicit in all human artifacts.
***
As we know, our own universe displays a history; it is an object which has developed and didn't spring into existence "just like that". It is this history which biblical literalists are committed to denying, with great scientific difficulty. Of course, Richard, like myself, believes in cosmic history, but he's trying to push past us the theological notion that theists should all be like Biblical literalists and postulate T ~ 0, where stuff just falls into place at God's command; he is asking me to accept that creation should have little or no algorithmic logic and history behind it, and he is also asking me to accept that a cosmos with logic and a long history is evidence that God doesn't exist. Moreover, as we've seen, he gives the impression that he's positing a "Big R" superverse. But that, as we've also seen, has little or no evidence going for it and can justifiably be called a bizarro universe, as it can be used to explain anything. And while on the subject of bizarro explanations: to me, the concept of the abracadabra God, a concept shared by Biblical literalists, also qualifies as a bizarro God because just about anything goes; see for example my "Beyond Our Ken" series. In fact, it may be that much of Richard's theological conceptualizing stems from his experience with the North American Intelligent Design community and fundamentalist organisations like Answers in Genesis.
For myself, the Big-R superverse is as unlikely as the abracadabra God. Neither notion has evidence in its favour; Big-R predicts instabilities in the organisation of the cosmos, instabilities we don't observe, and abracadabra predicts a universe without a logical history, a universe Beyond our Ken.
It is clear our universe has a history of development, and this is particularly evident with life and geology, although may I say that I'm not committed to any particular engine/mechanism of evolution. However, I would tentatively submit the idea that the size of the creation is a divine revelation to humankind of the computational costs of a universe such as ours, a universe which is so obviously specialized for developing and supporting life. Another speculative notion I would like to submit is that our universe may well use expanding parallelism and teleological constraints in order to generate life; this would get rid of the "slow" Log T term in equation 1 above, an equation which pertains to vanilla parallel processing. However, all that is very speculative, and the last thing I want to do is to be like Richard and our foolish literalist fundie, who have made their minds up and think everyone else should follow suit, or else be called nasty names by them and their followers.
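The speculative point about expanding parallelism can be illustrated with a little arithmetic. Under vanilla (fixed-width) parallel processing, Equation 1 implies T = 2**(I − S) steps; but if, as a toy assumption of my own, the number of parallel search agents doubles each step, the cumulative number of states explored after t steps is 2**(t+1) − 1, so the residual search space is covered in roughly I − S steps instead. All numbers here are illustrative:

```python
def steps_fixed(i_bits, s_bits):
    """Vanilla parallel search per Equation 1: T = 2**(I - S) steps."""
    return 2 ** (i_bits - s_bits)

def steps_expanding(i_bits, s_bits):
    """Toy expanding parallelism: agent count doubles every step, so
    cumulative states explored after t steps is 2**(t+1) - 1.  Return
    the first t at which that covers the 2**(I - S) residual space."""
    target = 2 ** (i_bits - s_bits)
    t = 0
    while 2 ** (t + 1) - 1 < target:
        t += 1
    return t

print(steps_fixed(100, 10))      # 2**90 steps: astronomically slow
print(steps_expanding(100, 10))  # 90 steps: exponential speed-up
```

The exponential growth of the searching resource cancels the exponential size of the search space, which is the sense in which expanding parallelism removes the "slow" Log T bottleneck.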
Lastly let me comment on this quote from Richard:
Intelligent engineers aiming to create life don’t make the laboratory for it vastly larger and older and more deadly than is required for the project...etc etc,
That may well be true. But the minimum space-time dimensions of a cognitive "laboratory" depend entirely on:
a) the initial knowledge of the engineers, that is the value of S, and
b) the configurational complexity of the task in hand which dictates the minimum value of T given S.
So, if the level of providence with which the Good Lord has provisioned our universe is measured in terms of its initial algorithmic complexity (S) and the time and space set aside for cosmic development (T), then the incredible sophistication and complexity of life very likely dictates the large space-time dimensions of our cosmos. From where I stand the cosmos looks very much like an ingenious piece of computational engineering built around equation 1 above. It certainly isn't a tiny piece of an immense Big-R cosmos, or something created last Thursday with a built-in bogus maturity (the omphalos hypothesis).
In part IV I will continue to examine Mr. Richard Carrier's theological assumptions.