
Friday, March 06, 2020

Breaking Through the Information Barrier in Natural History Part 4



In this latest post on the subject of Algorithmic Specified Complexity (ASC) I had intended to comment on Joe Felsenstein's post here, where he critiques the definition of ASC given by Intelligent Design gurus Nemati and Holloway, along with their claim that ASC is conserved. But before I look into Felsenstein's post (which I will now do in my next post) I need to look at another issue with ASC which has turned up, so I will briefly deal with that instead.

Features of ASC
As we saw in part 3 of my examination of Nemati and Holloway's (N&H) notion of ASC, ASC is only conserved under a subset of conditions. But as we also saw in part 3, it can be used to detect the mid-range complexity of organised structures such as life. This becomes clear if we pick up equation 7.0 from the last part. Viz:

ASC(L, C) = I(L) − K(L) + K(C)
1.0

This equation quantifies the ASC of a configuration L given the "context" of a "library" of data and algorithms represented by C, where I(L) is the Shannon information value of L; that is, the information calculated on the basis of the chance production of L. The functions K(L) and K(C) return the lengths of the smallest algorithms that will generate configurations L and C respectively. I have assumed that C has been reduced to the minimum data and algorithms needed to generate L. The configuration L could be a living configuration. As we saw in part 3, for life K(L) and K(C) will have similar values and approximately cancel in equation 1.0. Therefore, because I(L) must be high for living structures, the ASC associated with life is high. In contrast, as we also saw in part 3, for a randomly generated L the ASC is very probably at a low value close to zero. For cases like crystals and polymers, where we have highly ordered structures, the functions K(L) and K(C) will return low values, but I(L) will also be low because the probability of crystallisation is relatively high; hence a low value of ASC is returned by this kind of high order. (But there is an issue here, as we shall shortly see.)
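A back-of-envelope arithmetic sketch of these three cases may help. The magnitudes below are entirely made up for illustration (none of them come from N&H); they merely exercise equation 1.0:

```python
# Illustrative magnitudes only (in bits); these numbers are invented
# purely to exercise equation 1.0: ASC(L, C) = I(L) - K(L) + K(C).
def asc(i_l: int, k_l: int, k_c: int) -> int:
    return i_l - k_l + k_c

# Life-like case: huge I(L), and K(L) ~ K(C) approximately cancel.
life = asc(i_l=1_000_000, k_l=400_000, k_c=390_000)

# Random case: I(L) is high but so is K(L); a minimal context adds little.
rand = asc(i_l=1_000_000, k_l=1_000_000, k_c=100)

# Crystal case, with I(L) measured against the (high) chance of
# crystallisation: all three terms are small.
crystal = asc(i_l=50, k_l=30, k_c=20)

print(life, rand, crystal)  # → 990000 100 40
```

As the prose above claims, only the life-like case, where a large I(L) survives the near-cancellation of the K terms, returns a high ASC.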

ASC really doesn't tell us much about the exact nature of L; it could be some intermediate expression of ordered complexity where we keep information in the library C so that:

K(L) ~ K(C)
2.0

In other words ASC isn't a way of particularly distinguishing living configurations; it is just a way of registering an intermediate complexity which in many cases does result from direct intelligent action.

I must concede that my analysis above depends on what may be a violation of one of the big assumptions of the de facto ID community. Viz: I have assumed that C, the context where an intelligent source of input is assumed to be found, can be expressed in terms of data and algorithms; that is, I am assuming that C, and therefore the intelligent agent that is part of C, is computable. As IDists appear to be fond of the idea that intelligence must somehow remain sacrosanct and beyond analysis, this has drawn some of them toward the notion that intelligence displays incomputable properties and therefore would not return a definable K(C). See for example IDist Robert Marks, whom I quote here:

I am starting to believe creation of information requires a nonalgorithmic process, hence intelligent design.

To my mind this is counter-intuitive: why do we use computers to solve problems and return information about things we don't know? Wouldn't it be sensible to develop and use a concept of information which expresses this intuition? No, not for the IDists, because they work with a preconceived notion that information remains static, and therefore they look for a definition of information that fulfils their expectations of stasis.

Another Issue with ASC
In the foregoing I have glossed over an important point. The functions K(L) and K(C) have fairly clear definitions, but how do we define the Shannon information term I(L)? Is it calculated from conditional or absolute probabilities? If I(L) is measured absolutely then even the monotonous periodicity of a crystal returns a high value of ASC: this is because the absolute probability of high order is extremely low and therefore its Shannon information is high. But high order like this has a low K(L) and a low K(C). Therefore equation 1.0 returns a high value for crystals. I don't think this is the result that N&H would want.
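The problem can be seen numerically with a toy sketch of my own construction (not N&H's). Here I(L) is measured against the *absolute* uniform chance hypothesis over bit strings, and zlib's compressed length is used as a crude upper-bound stand-in for the Kolmogorov terms K(·); the monotonously periodic "crystal" string then comes out with a far higher ASC than a chance-like string, which is exactly the unwanted result:

```python
import random
import zlib

def i_absolute_bits(s: str) -> float:
    """I(L) under the absolute chance hypothesis: a uniform draw over
    binary strings of length n has probability 2^-n, so I(L) = n bits."""
    return float(len(s))

def k_approx_bits(s: str) -> int:
    """Crude stand-in for Kolmogorov complexity: zlib-compressed size
    in bits (an upper bound on the true K, not the real thing)."""
    return 8 * len(zlib.compress(s.encode()))

def asc(L: str, C: str) -> float:
    """Equation 1.0: ASC(L, C) = I(L) - K(L) + K(C)."""
    return i_absolute_bits(L) - k_approx_bits(L) + k_approx_bits(C)

random.seed(0)
crystal = "01" * 2000                                        # highly ordered
noise = "".join(random.choice("01") for _ in range(4000))    # chance-like

# The periodic string compresses to almost nothing, so nearly all of its
# huge absolute I(L) survives; the noisy string's K(L) eats its I(L).
print(asc(crystal, ""), asc(noise, ""))
```

With absolute probabilities, mere periodicity scores as highly "specified", so the measure fails to separate crystalline order from the organised complexity it was meant to flag.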

So, we conclude that I(L) in the expression for ASC must be measured conditionally if it is to give us the kind of ASC value N&H would look for in crystals. But what is crystal formation conditioned on? Presumably the laws of physics: if the laws of physics didn't exist to constrain physical behaviour, crystal structures as an ordered class of configurations would have a very tiny probability and therefore a high Shannon information, leading to a high ASC for crystals, as we have already seen. We don't want crystals to return a high ASC because we know that there is an apparent non-intelligent agent, namely inter-atomic forces, which can explain crystal formation without having to resort to intelligent agency.

At this point we must now use the Shannon information relationship I used in part 1, where we had:

I(q) + I(r) = I(p)
3.0
...where p is the unconditional probability of life, q is the conditional probability of life given a physical regime, and r is the probability of that physical regime.
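Equation 3.0 is just the additivity of Shannon self-information under the product rule p = q × r. A quick numerical check, using wholly hypothetical probabilities chosen only for illustration:

```python
import math

def I(prob: float) -> float:
    """Shannon self-information in bits: I(x) = -log2(x)."""
    return -math.log2(prob)

# Hypothetical numbers, purely for illustration of equation 3.0:
r = 1e-6      # probability of the physical regime
q = 1e-9      # conditional probability of life given that regime
p = q * r     # unconditional (absolute) probability of life

# I(q) + I(r) = I(p), since -log2(q) - log2(r) = -log2(q * r).
assert math.isclose(I(q) + I(r), I(p))
print(I(q), I(r), I(p))
```

The conditional term I(q) is therefore always smaller than the absolute term I(p) by exactly the information cost I(r) of the physical regime itself, which is what makes the choice between them consequential below.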

Which information value, I(q) or I(p), do we use to calculate the ASC of living configurations? If we are supposed to use the absolute probability of life and use I(p), then using the same practice for crystals it follows, as we have seen, that crystals have a high value of ASC because I(p) for crystals is huge. But if we use the conditional probability of crystal formation then that suggests we must also use the conditional probability of living configurations and therefore use I(q) for life. But if we are to use I(q) for life, we are faced with an unknown; that is, what is the conditional probability of life? Who knows the probability of life given our physical regime?

Well... although I say "unknown", opposite sides of the debate think they know the answer here. Ardent evolutionists are likely to be drawn toward supporting the idea that the laws of physics provide sufficient constraint for standard evolution to have a realistic probability of generating life (implying that the spongeam must exist). After all, as they might argue, empirical evidence suggests that standard evolution has actually taken place, as observed in the fossil record of natural history. On this account the conclusion would be that I(q) for life has a relatively low value and therefore life has a relatively low ASC value.

On the other hand IDists are likely to deny this because for them those secular and profane "natural forces" are not sacred enough to generate life: Their effective denial that a high conditional probability for life is implicit in known physics means that life could only be generated by "natural forces" via a chance event of very small probability; that means that I(q) has a very high value therefore implying a high value of ASC for life. For them only intelligent agency can generate life and as we have seen intelligent agency for people like Robert Marks is so mysterious that it is not reducible to algorithmically expressible processes.

So ASC turns out to be a pretty useless measure: Its value doesn't provide any sufficient reason to judge between these two points of view for the simple reason that its value is dependent on one's a priori world view.

***

I find myself between the devil and the deep blue sea here: the devil is the Trump-voting, gun-toting, gay-persecuting anti-vaxxers, and also on occasion the fundamentalist and conspiracy-theorist leaning right-wingers. (If they think this assessment unfair they had better do something about their PR.) The deep blue sea is the atheist-nihilist abyss.

As I have so often said, on the one hand I don't think current physics supports a spongeam and therefore standard evolution is not implicit in known physics (see once again here; physics provides a "construction set" of parts with their fixings but not a dynamic sufficient to generate life). But on the other hand I would at least question the IDualists' contention that intelligence has a special, inscrutable and vital "information creating" mystique, a mystique beyond human intelligence's own capacity for self-understanding; or, if I may express it the other way round, I would at least question those who are unable to attribute a special mystique to matter, matter which is the Divinely created miraculous stuff making up brains, where brains are the third-person perspective on conscious cognition's first-person perspective. Moreover, as we have seen and will continue to see, algorithms do create information by any sensible and intuitively agreeable definition of the term "information". Viz: we use computers to solve declarative problems and/or to generate unanticipated complexes of configurations; that is, to create information extracted from the platonic realm that is otherwise unknown to us. In contrast, IDualists persist in the search for the authoritarian straitjacket of a one-size-fits-all formal definition of information which supports their ill-defined and preconceived contention that information is conserved. What they haven't bargained for is that so far their success has been marginal, and in any case it is very likely that one can define an agreeable formal concept of information which expresses our everyday experience that information can be created.

Where I think both the atheist evolutionists and IDualists fall down, however, is that they are both missing the role that teleology plays in computation and human intelligence. Teleology is the key to the whole affair: teleology declares and specifies, whereas procedural computation is a mill for creating configurations which declaration and specification then rejects or selects in order to secure a searched-for end result. The outcome is information creation.
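The generate-and-test picture just described can be sketched in a few lines. This is my own toy illustration (the predicate and parameters are arbitrary inventions): a procedural "mill" blindly churns out candidate configurations, while a declarative specification, stated up front, rejects or selects them until the searched-for end result is secured:

```python
import random

def mill(rng: random.Random, length: int) -> str:
    """Procedural side: blindly generate a candidate bit-string configuration."""
    return "".join(rng.choice("01") for _ in range(length))

def specification(config: str) -> bool:
    """Declarative side: an arbitrary target condition declared in advance
    (here: at least three-quarters of the bits are 1)."""
    return config.count("1") >= 3 * len(config) // 4

def teleological_search(seed: int = 0, length: int = 16) -> str:
    """Run the mill until the specification selects a configuration."""
    rng = random.Random(seed)
    while True:
        candidate = mill(rng, length)
        if specification(candidate):   # the reject-or-select step
            return candidate

result = teleological_search()
print(result)  # a configuration satisfying the declared specification
```

The mill alone creates nothing of interest; it is the declared specification acting on the mill's output that singles out a configuration otherwise buried in the chance space, which is the sense in which the combination creates information.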
