Saturday, January 24, 2015

Chaoskampf, Cancer and Conspiracy

The Cancerous Chaos Beast: the mythology of the mundane and banal

For the second time in as many weeks we find evangelical atheist Larry Moran "warming" his heart in a perverse nihilistic sort of way with news that randomness once again has a big influence in the affairs of this world; the first time it was random drift as one of the important engines driving evolution, this time it's about 50% of cancers being caused by nothing in particular but a bit of bad luck. All this is very much in line with Larry's vision of an acausal world largely driven by meaningless forces, empty of purpose. The funny part about it is that I'll have to admit he may well be right about this; from spontaneous corruption of genetic information in cells to stray cosmic rays and background radioactivity, such influences are unlikely to be traceable to any deeper causes. I also think he may be onto something when he says that this is a message that many people do not want to hear; the implication is that there is human resistance to accepting that meaningless, goalless forces can play a large and decisive role in life. Moran quotes David Gorski, a quote I reproduce below with my emphases in bold:

It’s understandable that humans crave explanation, particularly when it comes to causes of a group of diseases as frightening, deadly, and devastating as cancer. In fact, both PZ Myers and David Colquhoun have expressed puzzlement over why there is so much resistance to the concept that random chance plays a major role in cancer development, with Colquhoun going so far as to liken it to “the attitude of creationists to evolution.” Their puzzlement most likely derives from the fact that they are not clinicians and don’t have to deal with patients, particularly given that, presumably, they do have a pretty good idea why creationists object to attributing evolution to random chance acted on by natural selection and other forces.

Clinicians could easily have predicted that a finding consistent with the conclusion that, as a whole, probably significantly less than half of human cancers are due to environmental causes that can be altered in order to prevent them would not be a popular message. Human beings don’t want to hear that cancer is an unfortunately unavoidable consequence of being made of cells that replicate their DNA imperfectly over the course of our entire lives. There’s an inherent hostility to any results that conclude anything other than that we can prevent most, if not all, cancers if only we understood enough about cancer and tried hard enough. Worse, in the alternative medicine world there’s a concept that we can basically prevent or cure anything through various means (particularly cancer), most recently through the manipulation of epigenetics. Unfortunately, although risk can be reduced for many cancers in which environmental influences can increase the error rate in DNA replication significantly, the risk of cancer can never be completely eliminated. Fortunately, we have actually been making progress against cancer, with cancer death rates having fallen 22% since 1991, due to combined efforts involving smoking cessation (prevention), better detection, and better treatment. Better understanding the contribution of stochastic processes and stem cell biology to carcinogenesis could potentially help us do even better.


So why does this pique my interest? It's because it has tell-tale similarities with the human tendency toward paranoia and conspiracy theorism; conspiracy theorism is the imaginative multiplying of the machinations of sentient entities thought to be working behind the scenes, engaged in deception and/or acts against us. The motive for conspiracy theorism seems in part down to an unwillingness to accept that mundane and banal factors often have a big role in fortune; we may be tempted to feel that fortune is worthy of grander narratives to explain it, narratives whose star turns are evil Machiavellian agents. This can have the effect of dignifying and mythologizing human struggles against the chaotic and the random. Personification of human woes can be cathartic because it provides a sentient target toward which anger can be directed. Cancer, with all the difficulties in its successful treatment, is prime material for conspiracy theorism. And yet this is in spite of the fact that in the Western Christian tradition the concept of Satan has a close association with the idea of the Chaoskampf: the beast who emerges from the abyss of chaos, the seething cauldron of randomness. Not really very heart-warming stuff, I would have thought!

Wednesday, January 21, 2015

Misplaced Concreteness: The Theology of the Homunculus.


Is there an homunculus behind the scenes driving evolution?

Astronomer Otto Struve (1897 – 1963) is quoted as saying this:

An intrinsically improbable event may become highly probable if the number of events is very great….It is probable that a good many of the billions of planets in the Milky Way support intelligent life. To me this conclusion is of great philosophical interest. I believe that science has reached the point where it is necessary to take into account the action of intelligent beings, in addition to the classical laws of physics.

This statement is of IDlogical interest on several counts.

ONE: As Struve tells us, sufficient trial resources (in terms of planet numbers in this case) can turn the improbable into the probable. But Struve is thinking in mere billions; this in itself is far from sufficient. For life to be generated with a realistic probability we must select one or both of two mathematical conditions: 1) An a priori physical regime which constrains the physical possibilities to such an extent that trial resources quantified with a mere 3-digit logarithm are capable of reaching life, and/or 2) A physical regime which is capable of returning trials whose number can only be quantified with a very large logarithm. *1
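To put illustrative numbers on this (a back-of-envelope of my own; the single-trial probability below is an assumption plucked purely to make the arithmetic vivid): if one trial hits a living configuration with probability p, then N independent trials succeed somewhere with probability

    P = 1 - (1 - p)^N ≈ N·p    (valid when N·p << 1).

Take, say, p ~ 10^-150. Even padding Struve's billions of planets out to N ~ 10^20 trials leaves N·p ~ 10^-130; the improbable stays resolutely improbable. Only when log10(N) itself approaches 150 does N·p approach unity, which is why the two conditions above demand either a regime that boosts p (condition 1) or trial numbers with a very large logarithm (condition 2).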

TWO: Interesting to note that Struve sets up an intelligence vs. physical law dichotomy – at first sight this looks very much like the kind of thinking behind the explanatory filter epistemic of the de-facto Intelligent Design movement. But unlike the North American ID community I doubt very much that Struve was arguing from a subliminally theological position. More likely he was arguing from the point of view that “Law&Disorder” physics is a primary causal agent, whereas intelligent life is a secondary causal agent derived from physics; i.e. life is a product of the cosmic physical regime. I suspect that Struve is following the mainstream academic view that separating out physical action and intelligent action into different categories is not a fundamental category division but one of utility: deriving life from physical first principles is analytically difficult (if not impossible), making necessary an artificial discipline division where a higher-level phenomenon like life is dealt with more descriptively by biologists. In a similar way geologists who deal with complex geological processes don’t always work back to the first principles of physics, but cut the knot by talking about a separate category of “geological forces”; that’s not to say, of course, that geologists believe “geological forces” have any vitalistic basis. Likewise, most biologists are likely to believe that the kind of intelligence Struve is talking of would trace back to the generating power of basic physical processes. Needless to say atheists would favour this philosophy, a philosophy which doesn’t take intelligent agents as a given, with all the potential therein of smuggling in the divine. Of course, the de-facto IDists believe exactly the opposite; for them intelligence (subliminally, divine intelligence) must be taken as a given when reckoning up creative processes.


THREE: In the foregoing we’ve seen how intelligent action is put into a different category to “natural forces”, although for atheists this is done for utilitarian rather than fundamental reasons. It is surely ironic that this manoeuvre readily leads on to the Intelligent Design community’s explanatory filter epistemic. In this epistemic intelligent action is effectively placed on the same logical level as Law&Disorder explanations. In fact in most everyday contexts the explanatory filter of the IDists is robust; after all, something like this epistemic is used by archaeologists when they are trying to decide whether an object is an artifact or of natural origins. In short the explanatory filter is exactly the method one uses when one is faced with the possibility of action by either humans or aliens; but what about for God?

FOUR: That the de-facto ID community use the explanatory filter, a filter comfortably used by archaeologists and implicit in Struve’s statement above, says a lot about de-facto ID theology. This theology is a dualist theology where God has become an homunculus-of-the-gaps default agent of causation who acts almost within the cosmos and is invoked as an explanation when “in principle” physical explanations are difficult if not impossible to find. As I’ve complained many times before on this blog, this theology has the effect of setting up two mutually exclusive categories of causation, namely the physical and the divine. We know of course that what the de-facto ID community obliquely refers to as an “Intelligent Designer” is, behind closed doors, identified with God rather than aliens. God thereby becomes a distinct “cause” to be lined up in an identity parade of all the other possible agents of “causation” that work within the cosmos. This has the pernicious effect of placing God very much inside creation like some super-alien, violating at a stroke both His eminence and immanence.

Moreover, to talk of God being a “cause” also does an injustice to God. The notion of “causation” is itself very much a concept derived within the context of the contingent patterns of behaviour displayed by the physical universe; causation, in fact, can become difficult to define or even undefinable in contexts where we are dealing with timeless patterns and/or disorderly patterns. Admittedly, when talking about God it is almost impossible to do so without using metaphors based on our experience of this world, but some metaphors are not as good as others, and to rank God as an agent of causation in a very literal sense is particularly insidious; the fallacy of the Kalam argument is a sign of this.

FIVE: For myself I prefer the metaphor of the cosmos as a giant thought pattern or story created in the mind of God; it’s as if an author like Tolkien created and maintained his world of Middle Earth in his mind rather than reifying it in physical print. This metaphor satisfies to some extent the theological demand that God is both eminent and immanent in relation to the very contingent patterns of our world. We are effectively immersed in God rather than God being a homunculus who is immersed in creation as an ancillary agent of causation, occasionally turning up to do something special. The immersed human perspective on the physical workings of our world is a bit like the perspective of someone zooming in with a powerful microscope and looking at the behaviour of individual neurons of the human mind and then wondering where the intelligence is; one only finds that intelligence at the high system level, in the big picture.

Above all, this metaphor satisfies the theological requirement for the otherness of God: the patterns of our world are contingent, with no logical necessity, and therefore very much other than the presumed aseity of a God who hosts them. These contingent patterns have been dragged out of platonic space and reified in an act of creative divine thought. The sense in which these patterns have a kind of immaterial platonic existence prior to reification gives them an existential status that preserves the otherness of God; just as Tolkien's Middle Earth is other than Tolkien, so the cosmos is other than God.

It is tempting to think of the distinction between God and creation as bound up with a distinction of “substance”. But identity of “substance” is a derived concept based on our experience of the macroscopic physical world where material object integrity is only maintained by clear spatial separations and demarcations. This concept of substance breaks down, however, in the microscopic world of identical particles where it becomes clear that distinction of substance can only be maintained by clear distinction in patterns of behaviour, patterns determined by such properties as charge and mass.*2 That is, “substance” is bound up with the extrinsic properties bestowed by patterning and is not an intrinsic property.

Footnotes.
*1. What do you do if the universe only has a very limited number of particles, say 10^80, and therefore has very limited “trial resource” capability? Simple: you use quantum mechanics, a method whereby the possibilities open to a collection of particles are all explored at once. Individual particles are then effectively “smeared” over large volumes.
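As a quick counting aside on footnote *1 (a standard back-of-envelope, not anything peculiar to my own proposal): N two-state degrees of freedom span 2^N basis states, and

    log10(2^N) = N·log10(2) ≈ 0.30·N,

so a mere N = 1000 particles already spans a state space whose size is a 302-digit number, and N = 10^80 gives a logarithm of order 3 x 10^79. Quantum parallelism is what allows "trial resources" to be quantified with a very large logarithm even in a universe of limited particle count.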

*2. Identity of “substance” is a problematical concept in the context of sheer patterning. Consider for example a binary pattern where we have two separate digits both set to “1”. Our use of common language, a language used to dealing with distinct concrete objects, tempts us to talk of these binary digits as distinct entities, as if they had their own separate "substance"; but if this were true it would be possible to swap the digits and then claim that each separate digit has been moved. But as per quantum statistics no change has actually taken place and the "swap" doesn't count as a distinct combinatorial item. Ergo, talking about “swapping digit positions” is only a figure of speech and is otherwise meaningless.
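A toy count makes the footnote's point concrete. Below is a minimal Python sketch of my own (the four slots and two set digits are arbitrary choices for illustration): if the two "1" digits had separate identities, ordered placements would all count; since the pattern is pure patterning, only the unordered choice of occupied slots does.

    from itertools import combinations, permutations

    N = 4                 # number of binary slots in the pattern
    slots = range(N)

    # If each "1" digit were a distinct "substance", swapping them would give
    # a new arrangement: N*(N-1) ordered placements.
    ordered = list(permutations(slots, 2))

    # But the two "1"s are identical, so only the unordered choice of
    # occupied slots yields a genuinely distinct pattern.
    unordered = list(combinations(slots, 2))

    print(len(ordered))    # 12 placements if the digits had separate identities
    print(len(unordered))  # 6 distinct patterns; the "swap" adds nothing

The halving is precisely the "swap that doesn't count as a distinct combinatorial item" of quantum statistics.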

Relevant Links:
http://quantumnonlinearity.blogspot.co.uk/2014/11/western-dualism-in-north-america.html

Saturday, January 17, 2015

The Road to Somewhere


Random drift/walk, backwards or forwards. As long as you don't come off the rails you'll end up somewhere.

There has been a little flurry on the internet about the question of the zebra's stripes: what function do they serve and how did they arise? Evangelical atheist Larry Moran offers a possible alternative to the adaptive explanation:

There's a fifth possibility: maybe there's no reason at all and stripes are just an evolutionary accident.....The point is that the prominence of stripes on zebras may be due to a relatively minor mutation and may be nonadaptive. That's a view that should at least be considered even if you don't think it's correct. 

This answer no doubt fits in very well with Larry's philosophical position, as does his interest in "random drift" as one of the mechanisms of evolution; he might be right, after all he's the expert. But he seems to have little cognizance of the actual role of randomness in the greater scheme of things: for standard evolution to have the slightest chance of getting results, the "random diffusion" of evolution must be working within a very constrained space indeed. Yes, evolution may go "backwards" and "forwards" at random, but for that to happen the current theory requires "rails" or "constraints", in terms of the fitness space*, to be a given, as the toy simulation below illustrates. This is something that Larry just hasn't wrapped his head round. Am I being stupid in wondering why?
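Here is a minimal toy simulation (entirely my own construction, not Moran's model or anyone's published one) contrasting a random walk confined to a one-dimensional "rail" with the same walk turned loose in the full configuration space:

    import random

    # Toy model: random walks over binary "genomes" of length L,
    # searching for the all-ones target configuration.
    L, STEPS, TRIALS = 30, 20_000, 100
    TARGET = (1,) * L

    def walk(constrained: bool) -> bool:
        g = [0] * L
        frontier = 0                  # used only by the constrained walk
        for _ in range(STEPS):
            if constrained:
                # Drift back and forth along a single path ("rails"):
                # either extend the run of 1s by one digit or retract it.
                if frontier < L and (frontier == 0 or random.random() < 0.5):
                    g[frontier] = 1
                    frontier += 1
                else:
                    frontier -= 1
                    g[frontier] = 0
            else:
                # No rails: flip any digit, wandering a space of 2^L states.
                i = random.randrange(L)
                g[i] ^= 1
            if tuple(g) == TARGET:
                return True
        return False

    for mode in (False, True):
        hits = sum(walk(mode) for _ in range(TRIALS))
        print(f"constrained={mode}: {hits}/{TRIALS} walks reached the target")

The drift is equally random in both cases; what differs is the size of the reachable space: L+1 states on the rail versus 2^L off it. The rails, i.e. the constraints supplied by the fitness space, do the real work.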

Footnote
* If the fitness space is an "information" given, then it follows that the problem of finding life has effectively been solved before the computation of evolution has started; this view is not a horse I'm backing. See http://quantumnonlinearity.blogspot.co.uk/2014/09/declarative-cognition-vs-procedural.html


The General Theory of Evolution: But it only works if you've got rails

Saturday, January 10, 2015

More Ham Fisted Science Denial

An illustration in a book by "America's leading science denialist"

For the record I thought I would make note of the latest blog post by Ken Ham. In his post he engages in his usual science-denying gambit whereby he yet again attempts to prise apart observational and historical science - see "Creationists Don't Deny Science", 9th Jan. Evidence that this travesty is well and truly endemic among "Answers in Genesis" staff can be seen from the link that Ham gives to an AiG article. Viz:

answersingenesis.org/what-is-science/two-kinds-of-science/

I have dealt with this mockery of science in more than one post, and here is my latest post:


Let's face it; they are never going to learn. These fundamentalists have a great need to explain and justify to themselves their essentially anti-science, anti-academia ethos.

Interesting is Ham's quote of Karl Giberson who justifiably criticizes Ham:

....He [Giberson] states, “Science denialism is alive in the United States and 2014 was yet another blockbuster year for preposterous claims from America’s flakerrati. To celebrate the year, here are the top 10 anti-science salvos of 2014.” He then proceeds to list AiG as number one and even calls me “America’s leading science denialist.” But do creationists deny science? Of course not!

 "But do creationists deny science?" Of course they do! Also interesting is this further quote from Giberson:

[Ken Ham’s] greatest howler, however—and my top anti-science salvo of 2014—would have to be his wholesale dismissal of the entire scientific enterprise as an atheistic missionary effort.

....a sign of the fundamentalist marginalization that can sometimes lead them to embrace conspiracy theorism; they are at odds with such a large section of academia that this can encourage them to resort to conspiracy theories in order to give an account of their denialism. But it's not just science they deny; they also have to deny history. For example, AiG's dating of the Genesis Flood conflicts with the dating of the pyramids, and so AiG is in conflict with Egyptology as well.

History denial: No mention of this in the Bible, nor anywhere else for that matter!

Sunday, January 04, 2015

The Epistemic Underwriter

We depend on the cosmos being dependable, readable, rational, coherent, intelligible and above all having epistemic integrity.

Here's one of the latest comments I found on Jason Lisle's blog (Research Update August 2014). It's significant because I probably have common ground with Lisle on this topic and would largely agree with him.


Stefan Frello says:
I should have been a little more precise. In order for a theory to be scientific, it should follow Occam’s razor. You should be able to make predictions from the theory, and it should be falsifiable (because it is convenient, If not for other reasons).
  • Dr. Lisle says:
    The problem with this answer is that it disconnects science from reality. That is, your view of science is that it is a way of answering questions that is “convenient” but not necessarily true or having anything whatsoever to do with reality. In the Christian worldview, science is a tool we use to answer certain questions about the actual universe. And it works because God upholds the universe in a consistent way with patterns that we can discover with increasing probability (though not necessarily certainty) using our mind and senses that God designed. But on your worldview, there is no reason to think that the procedures of science, including Occam’s Razor, have anything whatsoever to do with reality. Thus, your worldview cannot justify science as a method of obtaining empirical knowledge. And that’s been my point all along. Only Christianity can do this.

What I would add to this is that we come to God in an a priori way: "In the beginning God the Father, and therefore the universe will make (some) sense because his creation will be coherent and intelligible". However, this approach can't be offered as "a proof of God", as I have heard some Christians suggest; we can only start with the basic ideas of cosmic integrity and rationality sourced in a personal God and then exploit this concept, a priori, as the cornerstone of our epistemology. That is, we can't infer God from an intelligible universe, but if we posit a God of integrity we can infer a comprehensible universe.**

If some atheists choose to exploit an epistemic that depends on an intelligible universe as a given "brute fact", then in spite of them not believing that the success of this epistemic is sourced in God, they too will reap the reward of noetic riches; although they won't be able to relate this success to any higher reason.... for them "it just is".

Good 'ole Jason Lisle - I agree with him for once! Where I would probably disagree with him is in a typically fundamentalist lack of generosity to atheists*: atheists who trust the integrity and rationality of the universe as a given and consequently make scientific advances to the benefit of us all are doing God's work and are glorifying him. Moreover, when I look at the general state of the religious world I can hardly blame them for being atheists. Fundamentalists of all religious brands give me the creeps and constitute some of the best arguments for atheism. But the sad fact is, atheism sometimes finds itself teetering on the edge of the chasm into nihilism, the abyss where the chaos monster lurks.

Relevant Links:
http://quantumnonlinearity.blogspot.co.uk/2013/08/epistemic-notes.html

Footnotes:
* Fundamentalists also lack generosity to fellow believers who don't follow their views (if indeed they even think of them as fellow believers). See for example the link below where we catch Jason Lisle thrusting blasphemies and heresies into the mouths of fellow Christians.
http://quantumnonlinearity.blogspot.co.uk/2012/11/once-again-false-dichotomy-zone-god-did.html
** But that said we have to acknowledge that often our world makes little moral sense - basically that's the problem of pain and evil. 

Wednesday, December 31, 2014

People in (Epistemic) Glass Houses…..



This post by “News” (= Denyse O’Leary) on Uncommon Descent raises questions about the nature of scientific epistemology. The redoubtable Ms. O’Leary starts with some quotes and then comments on them. Below I reproduce the article along with my own interleaved comments.


Breaking: Article in Nature defends integrity of physics against multiverse, string theory
December 18, 2014
Posted by News under Cosmology, Intelligent Design, News
“Scientific method: Defend the integrity of physics” by George Ellis and Joe Silk, Nature, open access:
This year, debates in physics circles took a worrying turn. Faced with difficulties in applying fundamental theories to the observed Universe, some researchers called for a change in how theoretical physics is done. They began to argue — explicitly — that if a theory is sufficiently elegant and explanatory, it need not be tested experimentally, breaking with centuries of philosophical tradition of defining scientific knowledge as empirical. We disagree. As the philosopher of science Karl Popper argued: a theory must be falsifiable to be scientific.

My Comment: This kind of epistemic manoeuvring is, to my mind, a) forced on us in sciences that investigate complex “high level” and/or less than accessible objects such as we find in sociology or world view synthesis, and b) not as worrying as the authors of this quote think, as it is necessarily widespread and may be little different for physics at the high end. It would be very nice if all theories could be predictively tested against experiential protocols, but the fact is testing at will is not always an option in the face of epistemic intractability; we may have to fall back on trying to assess just how well the theory makes post-facto sense of the data samples to hand. Moreover, "falsifiability" provides no sharply defined criterion for demarcating "good" or "proper" science, because no theory is absolutely falsifiable; we can always, with a bit of imagination, appeal to hidden adjustable variables in order to “explain” away anomalies, although this kind of special pleading, if used in quantity to prop up a failing theory, can start to look a little contrived; multiplication of variables, if reality is really that complex, considerably reduces the chance of us hitting the right combination of variables. It is no surprise then that everybody hopes that Occam’s heuristic is right, a heuristic that works on the presupposition that the world is rational and simple, with few variables and therefore with less opportunity to get things wrong!

Earlier this year, championing the multiverse and the many-worlds hypothesis, Carroll dismissed Popper’s falsifiability criterion as a “blunt instrument” (see go.nature.com/nuj39z). He offered two other requirements: a scientific theory should be “definite” and “empirical”. By definite, Carroll means that the theory says “something clear and unambiguous about how reality functions”. By empirical, he agrees with the customary definition that a theory should be judged a success or failure by its ability to explain the data.
He argues that inaccessible domains can have a “dramatic effect” in our cosmic back-yard, explaining why the cosmological constant is so small in the part we see. But in multiverse theory, that explanation could be given no matter what astronomers observe. All possible combinations of cosmological parameters would exist somewhere, and the theory has many variables that can be tweaked. Other theories, such as unimodular gravity, a modified version of Einstein’s general theory of relativity, can also explain why the cosmological constant is not huge.
Some people have devised forms of multiverse theory that are susceptible to tests: physicist Leonard Susskind’s version can be falsified if negative spatial curvature of the Universe is ever demonstrated. But such a finding would prove nothing about the many other versions. Fundamentally, the multiverse explanation relies on string theory, which is as yet unverified, and on speculative mechanisms for realizing different physics in different sister universes. It is not, in our opinion, robust, let alone testable.

My Comment: Carroll is implicitly admitting that some of these exotic physical theories are not easily testable at will, although they do in his opinion make sense of accepted “empirical” evidence. He therefore advocates relaxing the requirement that a theory should make testable predictions, but demands that a theory at least make unambiguous empirical post-dictions about our cosmos. As I have said many times in this blog, some ontologies are a lot less epistemically tractable than others (that would certainly apply to multiverse ideas, for example), and this entails these ontologies being less amenable to data sampling. I have no particular objection to Carroll’s wanting to relax the epistemic standard so long as he acknowledges the risks and the loss of empirical authority. Moreover, if Carroll has at last realised that some ontologies are less empirically responsive than others, he ought also to realise that he has stumbled upon a sliding scale that can be pushed even further. Viz: some theoretical objects, particularly in the social and historical sciences, don’t have an unambiguous connection with observational protocols, but make probabilistic post-dictions. So, on balance I’ve no complaints about Carroll’s epistemic procedure provided he doesn’t start pushing it as part of the authoritative status quo that gives him a pretext to kick dissenters into line.
But the redoubtable Denyse is less sympathetic:

No wonder some would like to abandon testability for elegance, and reality for fairy tales.
Unfortunately, the plea ends on a somewhat tinny note,
“The imprimatur of science should be awarded only to a theory that is testable. Only then can we defend science from attack.”
Guys, listen (yes, you George Ellis and you Joe Silk, it is you we are looking at): The problem really isn’t attacks from outside. Quit fooling yourselves.
The problem is entirely within. If physicists want to join the many and various advocates of self-expression who do not depend on rigorous examination of evidence to validate their assertions, that is a choice physicists make.
No one forces that choice on physicists. But they are free to make it.
It sounds as though some of your colleagues have been making just such choices, and defending their choices by asking for exemption from traditional standards. It’s your profession’s call to determine whether their wishes/demands can be accommodated simply to prop up whatever rickety theoretical structures they have built.
But if your profession does choose to accommodate, two things:
1. Physics becomes just another player in a culture war, with no more genuinely respectable claims for attention than the demands we hear daily from grievance warriors that their version of events be accepted without cavil as Truth. You could find yourselves currying favour with politicians, as an identity group, for your version of nature versus that of magical thinking. Is that really what you want?
2. If so, just remember, no one did that to you. You did it to yourselves.
See also: The bill arrives for cosmology’s free lunch


My Comment: …but she’s probably right in her drift: the kind of “high level” barely accessible ontologies Carroll is proposing lose something of their empirical authority and have more the flavour of a world view synthesis. They therefore should not be pushed as logically obliging “Truth” or “fact”; that’s the sort of thing fundamentalists do. Conversely, I’m sure Denyse realises that her ID community also have a science that is not good at making unambiguous predictions, and is better regarded as a post-facto quasi-archaeological sense-making proposal. So, all in all there are lessons in mutual understanding here for both Denyse O’Leary and Sean Carroll.

But when it comes to foisting on people world views that masquerade as “Truth”, I avoid communities and cultures that are predatory and use "moral" duress, group pressure and worse to persuade: Crowds of people with a highly uniform world view have always given me the creeps.


The voice of the crowd
is nothing but loud;
the nod and the wink
supports a group think.
It may be baloney.
Beware the crony.



Friday, December 19, 2014

Melencolia I Part 5: Creating Information

The dream goes on!

The latest paper in my Melencolia I series can be obtained here. I reproduce the introduction to this paper below:


Introduction
As we saw in the previous post of this series, there is a class of configurations, a class I called “complex”, with configuration sizes less than the logarithm of the time needed to generate them, and therefore they cannot be generated in practical times with “conserved” parallel deterministic processing. In this paper I develop a very similar looking relationship for non-deterministic “conserved” parallel processing. By “conserved” I mean computations that use fixed resources in terms of processor power (although I assume unlimited amounts of memory and time are available). This conclusion leads on to a brief consideration of “non-conserved” processing and a proposal that non-conserved processing is one of the conditions of intelligent activity.

I have increasing doubts that classical evolution, which plods along with a classical form of conserved parallel processing, has the efficacy to generate and select life. In contrast I have an increasing conviction that somehow evolution is a process that exploits what appears to be the potentially available expanding parallelism of quantum mechanics. And moreover, as I hope to explore in later posts, the requisite criteria by which configurations are selected from the rapidly generated configurations of expanding parallelism turn a mindless imperative process into an intentional declarative process.
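To unpack the "configuration size less than the logarithm of generation time" relation in rough outline (my own paraphrase of the scaling; the processor count P and the ~2^n count of size-n configurations are simplifying assumptions, not the paper's exact terms):

    configurations visited in time T:   N_visited <= P·T
    a target class of ~1 in 2^n:        P·T >~ 2^n
    hence:                              log2(T) >~ n - log2(P)

With P fixed ("conserved" processing) the time T needed is exponential in the configuration size n; equivalently, a complex configuration satisfies n < log2(T) up to the log2(P) correction, which is the flavour of the bound referred to in the introduction above.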
