
Thursday, February 01, 2018

The Universe is not necessarily epistemically friendly/tractable

"Why don't we like it when the universe  makes us smarter?" asks James Knight.
 Perhaps because it's not always going to make us smarter, especially if part
of that universe includes our much loved social group.


The notes at the end of this post (that is, below the asterisks) were my first response to this post on James Knight's blog; they originally appeared on James' Facebook discussion group. Here's the first part of James' post:

The widespread human aversion to correction is one of the most peculiar of all peculiarities. People don't like being shown to be wrong - so much so that they'd rather intransigently yoke themselves to a comfortable falsehood than open themselves up to a refreshing new fact or an illuminating experience of improved reasoning. There are multiple causes of this, with some degree of overlap - the usual offenders are:

1) Lazy-thinking - the path of least resistance is, by definition, the easiest method of approach. It takes time and effort to acquire knowledge and develop your reasoning skills, and relatively few people bother to do this with any aplomb.

2) Status and ego - some people find it hard to admit they're wrong, so would rather stubbornly close themselves off from revising their erroneous opinions.

3) Tribal identity - many views and beliefs are bound up in the identity of a particular group or allegiance, particularly religious and political views, which overwhelmingly bias individuals against changes of mind.


4) Emotional biases and confirmation biases - reasoning ability can be impaired by emotions, and confirmation bias occurs as we look to justify our views by seeking out information that supports what we already believe.

There are others too, but those are the main four, and between them they have quite a stultifying effect on human beings' ability to be correct about things. The only cure for this sort of thing is to wake yourself up to how painstakingly, ludicrously irrational this is - I mean, why *wouldn't* you want to be correct about as much as you can be? And related to that, why *wouldn't* you want to be shown an improved way of thinking about a situation or learn a new fact?


He then goes on to give us some tips on how we might go about making some course corrections. In this post I'm less interested in how to make corrections than in why this noetic inertia exists in the first place. My first reaction is that there might well be a rationale behind the apparent bullishness with which people hold their intellectual positions.

Epistemology is far from an exact science, especially when it comes to the epistemics involved in creating those all-embracing narratives which claim to be some kind of coherent theory of everything. The jump from the millimeter-by-millimeter snail's-pace progress of the formalised investigations of basic spring-extending and test-tube-precipitating science to the vision of a comprehensive world-view synthesis requires a risky leap into the unknown; I know, I'm in that line of business myself. One would think, however, that a certain amount of epistemic humility in regard to these grand world-view narratives would be in order. But no: these narratives, if anything, are often held with the utmost certainty, and it is these narratives which are frequently the rallying point for those on an authoritarian mission to convert all and sundry to their views.

Epistemology, as I've already said, is not an exact science which allows an array of accepted data points to be linked into a coherent whole with meticulous and faultless logic. "Heuristics" is the word that comes to mind here. Human beings are complex adaptive systems which use heuristics to solve the problems of knowing. Consequently, the search for truth is likely to come with a lot of blind alleys and error as a by-product. It's like solving a maze: many blind alleys may have to be searched as the price of an eventual positive result. But if there is to be any chance of a solution at all, there has to be an initial motivation to search in spite of the overhead implicit in the trial-and-error process.
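To make the maze analogy a little more concrete, here is a minimal sketch (the grid, start and goal are purely illustrative assumptions): a depth-first search that keeps count of the blind alleys it explores before a route to the goal turns up. The wasted exploration is not a malfunction; it is the price the trial-and-error searcher has to be prepared to pay.

```python
# A depth-first search of a small grid maze (0 = open, 1 = wall) that counts the
# blind alleys explored before a route is found. Grid, start and goal are
# illustrative assumptions.

def solve_maze(grid, start, goal):
    """Return (path, blind_alleys): a route from start to goal (or None),
    plus the number of dead-end cells explored along the way."""
    rows, cols = len(grid), len(grid[0])
    visited = set()
    blind_alleys = 0

    def dfs(cell, path):
        nonlocal blind_alleys
        if cell == goal:
            return path
        visited.add(cell)
        r, c = cell
        progressed = False
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 and (nr, nc) not in visited:
                progressed = True
                result = dfs((nr, nc), path + [(nr, nc)])
                if result is not None:
                    return result
        if not progressed:
            blind_alleys += 1    # a blind alley: the overhead of trial and error
        return None

    return dfs(start, [start]), blind_alleys

maze = [[0, 0, 0],
        [0, 1, 0],
        [0, 1, 0],
        [1, 1, 0]]
route, wasted = solve_maze(maze, (0, 0), (3, 2))
print(route)                              # a route to the goal
print("blind alleys explored:", wasted)
```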

Because humans and the world they inhabit are both highly complex, it is difficult to ascertain with any certainty whether what seems to be an illogical epistemic strategy may have some hidden statistical pay-off. There is, in fact, a self-referencing issue with James' point above: as we consider the whys and wherefores of human epistemic intransigence we are in effect turning the tools of epistemology in on themselves. That is, epistemology has itself become the subject of an epistemic endeavor as we seek to discover just how and why our epistemic apparatus works or, as is often the case, doesn't work. Without prejudging the question, it may be that the kind of epistemic arrogance we so often see has some kind of statistical pay-off and is not entirely maladaptive. Perhaps this epistemic arrogance is adaptive in some circumstances but not in others; consider, for example, gambling, where an entrenched belief that the chances are somehow skewed in your favour is maladaptive from a financial point of view (although perhaps you just get a kick out of the game play!).
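The gambling example can be put into numbers. Here is a minimal sketch (the stake, payout and probabilities are illustrative assumptions): the gambler's entrenched belief changes the expectation he thinks he faces, but not the expectation the game actually delivers.

```python
# The gambler's entrenched belief vs. the expectation the game actually delivers.
# Stake, payout and probabilities are illustrative assumptions.

stake = 1.0            # wagered per play
payout = 2.0           # returned on a win (stake included)
true_p_win = 0.45      # the real chance of winning
believed_p_win = 0.55  # the entrenched, skewed-in-my-favour belief

def expected_profit(p_win):
    """Expected profit per play for a given win probability."""
    return p_win * (payout - stake) + (1.0 - p_win) * (-stake)

print("expected profit the gambler believes in:", expected_profit(believed_p_win))  # roughly +0.10
print("expected profit actually delivered:     ", expected_profit(true_p_win))      # roughly -0.10
print("expected loss over 1000 plays:          ", -expected_profit(true_p_win) * 1000)
```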

Epistemology, as with other complex subjects, cannot be tied up into a closed-ended bundle of general catch-all principles. We therefore have to take the "complex adaptive system" approach in our study of epistemics: complex adaptive systems don't work just with catch-all principles, but also respond to the feedback they are getting from the instance in hand and adapt to that feedback on a case-by-case basis; this is necessary when the catch-all principles involved are either not understood or simply don't exist.
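As a rough illustration of that "catch-all principle plus case-by-case feedback" idea, here is a minimal sketch (the estimator, its numbers and the cases are illustrative assumptions, not a model of any real system): predictions start from a general rule, but each case accumulates its own correction from the feedback that case returns.

```python
# A general rule plus per-case corrections learned from feedback.
# All numbers and cases are illustrative assumptions.
from collections import defaultdict

class AdaptiveEstimator:
    def __init__(self, catch_all=10.0, learning_rate=0.5):
        self.catch_all = catch_all              # the catch-all principle
        self.learning_rate = learning_rate
        self.corrections = defaultdict(float)   # case-by-case adaptations

    def predict(self, case):
        return self.catch_all + self.corrections[case]

    def feedback(self, case, observed):
        # Nudge this case's correction toward what the instance actually showed.
        error = observed - self.predict(case)
        self.corrections[case] += self.learning_rate * error

est = AdaptiveEstimator()
for observed in (14.0, 15.0, 14.5):   # case "x" keeps coming back higher than the rule says
    est.feedback("x", observed)

print(est.predict("x"))   # has drifted from 10.0 toward roughly 14
print(est.predict("y"))   # an unseen case still gets the catch-all answer: 10.0
```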

Naturally enough, as a Christian I have at times on this blog talked about the role of faith/trust in one's epistemology; that is, if the universe has some predisposed propensity toward rational integrity, and we believe it and exploit it, then a measure of success in our epistemic endeavors is assured. Epistemic success is not something that would happen in an erratic or completely random universe, or if some will was out to mislead us and systematically skew the data samples toward error. It is, however, clear that faith and trust are not sufficient in themselves to cut the epistemic knot: witness the many Christian sects who display huge faith and trust in their world view and yet get it badly wrong; from end-of-the-world prophets to anti-science young-earthists we see groups of people who are utterly confident of their rightness, thereby inflating the language of Christian devotion and guidance until it is of very little value indeed (see here and here).

Anyway, below are the notes I compiled for the Facebook discussion group. Since they were first published I have added some enhancements. In these notes I briefly try to get a handle on the irregularities and erratics of human epistemic behavior. Epistemic intransigence may, statistically speaking, have some kind of pay-off. That reference to statistics is important, because it means the pay-off comes some of the time, but not all of the time!


***


Some quick notes here. A systems and information approach to this question of noetic inertia is the first thing I think of:

Holding out against cognitive updates may have a hidden rationality, just as there is often rational resistance to the next super-duper version of Windows!

The acquisition of information by human beings is slow and time-consuming, and the rate of "dot joining" of that information into a cognitive synthesis is probably even slower. This represents a high investment in time and processing power, which makes chopping and changing prohibitively "expensive" in terms of processing resources. E.g. if I have used the Ptolemaic model of the universe all my life and still get useful results out of it, then I am not likely to change to the Copernican solar system in a hurry.
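A minimal sketch of that cost argument (all the figures are illustrative assumptions): a better model has to pay back its one-off relearning cost through its per-use benefit, so for someone whose old synthesis still gives adequate results, or who will rarely call on the new one, staying put can be the rational choice.

```python
# Is a better model worth its one-off relearning cost? All figures are
# illustrative assumptions.

def worth_switching(relearning_cost, benefit_per_use, expected_uses):
    """True if the better model's per-use benefit repays the cost of adopting it."""
    return benefit_per_use * expected_uses > relearning_cost

# Someone who will use the model heavily for years to come:
print(worth_switching(relearning_cost=500, benefit_per_use=0.5, expected_uses=10_000))  # True
# Someone whose old model still gives adequate results a few times a year:
print(worth_switching(relearning_cost=500, benefit_per_use=0.5, expected_uses=100))     # False
```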

Each person has their own unique epistemic history. This history may not be order-independent, meaning that A+B is not necessarily equal to B+A, where the "plus" signs represent some cognitive synthesizing activity on the arrival of the information first in the order A then B, and then in the order B then A. Proprietary histories may also be caused by two observers having datasets that only partially overlap. In fact it may be impossible to achieve complete overlap, because first-person experiences may not be shareable.
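A minimal sketch of that order-dependence (the update rule and numbers are illustrative assumptions, not a claim about how cognition actually works): if each new piece of information pulls the current synthesis a fixed fraction of the way toward itself, then the later item carries more weight, and A-then-B lands somewhere different from B-then-A.

```python
# Order-dependent synthesis: each new item pulls the current belief a fixed
# fraction of the way toward itself, so later items weigh more. The rule and
# numbers are illustrative assumptions.

def synthesise(belief, new_info, weight=0.5):
    """Move the current belief part of the way toward the newly arrived information."""
    return belief + weight * (new_info - belief)

prior = 0.0
A, B = 10.0, -10.0

a_then_b = synthesise(synthesise(prior, A), B)   # -2.5
b_then_a = synthesise(synthesise(prior, B), A)   # +2.5
print(a_then_b, b_then_a)                        # same data, two different syntheses
```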

Resistance to changing an already costly cognitive synthesis reminds me of scientific resistance to observational anomalies that don't quite fit a well-established theory. Until the anomalies build up to an intolerable degree, the old theory may survive under the hypothesis that the anomalies are due to some unexplained aberration that has so far not been thought of. After all, the universe is open-ended as far as we are concerned and may throw up the unexpected. Unwillingness to change a cognitive synthesis on the basis of a few contrary "facts" may be a recognition that data can sometimes be misleading.
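A minimal sketch of that anomaly tolerance (the "theory", the observations and the threshold are illustrative assumptions): a few contrary observations are written off as aberration, and the synthesis is only abandoned once the anomalies pile up past a tolerance.

```python
# Hold the old theory until anomalies accumulate past a tolerance. The "theory",
# observations and threshold are illustrative assumptions.

def still_holding(observations, fits_theory, tolerance=3):
    """Keep the theory unless the running count of anomalies exceeds the tolerance."""
    anomalies = 0
    for obs in observations:
        if not fits_theory(obs):
            anomalies += 1
            if anomalies > tolerance:
                return False    # the anomalies have built up to an intolerable degree
    return True

fits = lambda value: value < 10      # the theory: observed values stay below 10

print(still_holding([3, 5, 12, 7, 11, 4], fits))       # True: two anomalies, written off
print(still_holding([3, 12, 11, 14, 13, 15], fits))    # False: too many to ignore
```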

Many experiences are not repeatable. For example, someone may ardently believe in ghosts or UFOs based on some vivid experience that I don't share and can't get to the bottom of; hence it may never be possible to agree, because the experience can't be reproduced and shared.

The above cognitive factors may point to deeper reasons behind the resistances you list as 1 and 4 in your post. As Pascal said, "The heart has its reasons which reason does not know."

Points 2 & 3, the tribal/social factors, are interesting and again may have their rationality in terms of group protection/benefits: given the cognitive expense of a) taking on board information and b) processing that information to arrive at a cognitive synthesis, this tribal factor might be an attempt to outsource these expensive processing activities to others in a trusted group. This very much depends on relationship bonding and trust between group members. Hence, the epistemic method here is that one gets to know and trust the group members rather than directly process the data about the subject in question! I.e. one is delegating the research processing to another trusted group member. Forming a trust bond is part and parcel of an instinctual, socially based epistemic! This one is very frustrating when you face, say, a fundie and find that he doesn't accept your views no matter how hard you reason – he simply doesn't believe you have the right to instruct him! In any case, why trust a stranger with strange ideas? One would have to have a particularly low self-esteem to go down that route without resistance!

If we can see a modicum of rationality behind even what sometimes appear to be the most outrageous and irrational beliefs, it might help us get a little less uptight about disagreement. I say that as a researcher of fundamentalism: if I got too uptight about some of the wackaloons who have popped up in my field of view, I think the men in white would have come and locked me up long ago!
