Meccano Microcosm: Obsessed with the mystery of randomness, I made this machine blending order and randomness at the end of the 1970s as an agreeable pastime. See here for the story.
I have recently compiled a book-length electronic document containing some of my earlier writings on disorder and randomness. This compilation can be accessed as a PDF here. I worked on this subject as a hobby throughout the 1970s and 1980s and I now publish what was originally a typescript. I have, however, enhanced the content as well as the format. The current publication is edition 1, but it is likely that I will re-read the document in a few months' time and produce edition 2. After a while I get bored with reading the same old text and I have to wait some weeks before the desensitization wears off and I can continue to spot errors, omissions and naiveties.
Below I reproduce the introduction to the book:
Introduction
This book deals with the subject of
randomness. It presents a personal record of my engagement with a topic which,
early in my thinking career, presented itself as something of fundamental
importance: The algorithmic nature of deterministic physical laws seemed
relatively easy to grasp, but what was this thing called indeterminism? How
would we know when it was operating?
Above all, what was this strange discipline of predictive statistics? It
was strange because no one expects its predictions to be exactly right. A naive
philosophical stance might hold that a miss is as good as a mile and therefore
count anything registering less than 100% truth as nothing short of 100% error; but
no, in statistics the concepts of approximation and nearness figure prominently
and give the lie to fundamentalist sentiments that anything less than absolute
certainty isn’t worthwhile knowledge.
In the short Sherlock Holmes story The Cardboard Box, Holmes solves a particularly
tragic crime of passion; both the killer and his victims are ordinary people
who against their better judgement are driven by overwhelming human emotions
into a conflict that leads to murder. If you got to know these people you would
find none of them to be particularly immoral by human standards. In a cool
moment they would likely condemn their own hot-blooded actions. After bringing
the matter to a conclusion in his usual consummate and controlled way, Holmes
betrays a rare moment of being completely at a loss to understand the meaning
of the affair. The details of the crime he guided us through, where all the
clues formed a rational pattern which Holmes alone completely understood, stand
in stark contrast to his utter mystification at the human significance of the
story as a whole:
“What is the meaning
of it, Watson?” said Holmes, solemnly, as he laid down the paper. “What object is served by this circle of
misery and violence and fear? It must tend to some end, or else our universe is
ruled by chance, which is unthinkable. But to what end? There is the great
standing perennial problem to which human reason is as far from an answer as ever.”
In this expression of the problem of
suffering and evil, chance and meaninglessness are seen as being related: A world ruled
by chance is purposeless, or at least one in which purpose is difficult to discern. Although
the inexorable mechanisms of determinism can themselves conspire with chance to
generate meaningless scenarios lacking in anthropic significance, the
background fear expressed by Holmes is that chance is ultimately sovereign. For
Conan Doyle’s Victorian God-fearing character this was unthinkable. Today, of
course, it is no longer unthinkable, although for many a very uncomfortable
thought. But what is “chance” and just why should it come over as the epitome
of meaninglessness and purposelessness? Since the rise of the quantum
mechanical description of reality, chance has figured very strongly in physical
science, raising profound questions about the nature and meaning (or
lack of meaning) of its manifestation as randomness in the physical world.
Arthur Koestler in his book The Roots of Coincidence spelled out
some of the paradoxes of chance events and their treatment using probability
theory. Koestler's interest in the subject arises from those
extrasensory perception experiments which make use of card-guessing statistics
and where there is, therefore, a need to determine whether the outcome of such
experiments is statistically significant. Perhaps, suggests Koestler, the
apparently skewed statistics of these experiments are bound up with the very question
of just what randomness actually means. After all, in Koestler’s opinion
randomness itself seems to have a paranormal aspect to it. To illustrate he
gives us a couple of examples where statistics has been used to make predictions:
In one case statistics was used by a German mathematician to predict the yearly distribution, between
1875 and 1894, of the number of German soldiers kicked to death by their horses. In another
case Koestler remarks on the near constancy of the number of dog-bites-man
reports received by the authorities in New York. In response to the idea that
probability theory leads to an understanding of this kind of “statistical
wizardry”, as Koestler calls it, he says this:
But does it really
lead to an understanding? How do those German Army horses adjust the frequency
of their lethal kicks to the requirement of the Poisson equation? How do the
dogs of New York know that their daily ration of biting is exhausted? How does
the roulette ball know that in the long run zero must come up once in thirty seven
times if the casino is to be kept going? (The Roots of Coincidence, Page 27)
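With hindsight (and jumping ahead to the conclusion this book eventually reaches), at least part of the answer is that no adjustment or coordination is needed: when a large number of independent opportunities for a rare event exist, the yearly counts fall into the Poisson pattern automatically. A minimal simulation in Python makes the point; the numbers of exposures, years and the kick probability below are illustrative assumptions, not the historical figures:

import math
import random

random.seed(1)

# Illustrative assumptions only, not the historical figures:
n_exposures = 2_000    # soldier-exposures to horses per year
p = 1 / 4_000          # chance that any one exposure proves fatal
lam = n_exposures * p  # expected deaths per year (here 0.5)

# Simulate many "years"; each year's toll is a sum of independent rare events.
years = 2_000
tolls = [sum(random.random() < p for _ in range(n_exposures))
         for _ in range(years)]

# Compare observed frequencies with the Poisson prediction e^(-lam) lam^k / k!
for k in range(5):
    observed = tolls.count(k) / years
    poisson = math.exp(-lam) * lam ** k / math.factorial(k)
    print(f"{k} deaths: observed {observed:.3f}, Poisson predicts {poisson:.3f}")

No horse consults the Poisson equation; the regularity is a consequence of independence and large numbers, not of coordination. Likewise the roulette ball "knows" nothing: a European wheel has thirty-seven pockets, so in the long run zero comes up once in thirty-seven spins simply because it is one equiprobable outcome among thirty-seven.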
I have to admit that for some time after
reading Koestler I too was puzzled by probability theory's
paradoxical success in predicting the unpredictable. At the time I knew how to
use probability calculus but I really had no idea how or why it worked. Later
on in his book Koestler introduces us to the mathematician G. Spencer Brown, who was
researching probability in the 1950s. Of him Koestler says:
...among
mathematicians G. Spencer Brown proposed an intriguing theory which attempted
to explain the anti-chance results in card guessing experiments by questioning
the validity of the concept of chance itself. (The Roots of Coincidence, Page 102)
Koestler tells us that Spencer Brown proposed
that ESP statistics “pointed to some anomaly
in the very concept of randomness”. Koestler goes on to quote Sir Alister
Hardy, who sponsored Spencer Brown:
…It remained for Mr.
G. Spencer Brown of Trinity College, Cambridge, to suggest the alternative and
simpler hypothesis that all this experimental work in so called telepathy,
clairvoyance, precognition and psycho-kinesis, which depends upon obtaining
results above chance, may be really a demonstration of some singular and very different
principle. He believes that it may be something no less fundamental or
interesting – but not telepathy or these other curious things – something
implicit in the very nature and meaning of randomness itself… In passing let me
say that if most of this apparent card-guessing
and dice influencing work should
turn out to be something very different, it will not have been wasted effort;
it will have provided a wonderful mine of material for the study of a very
remarkable new principle. (The Roots of Coincidence, Page 103)
Koestler says that Spencer Brown’s work petered
out inconclusively. But so long as we continue to feel that we really don’t understand
why probability theory works, Brown’s ideas remain plausible: Perhaps
we’ve got probability theory wrong and this accounts for the unfamiliar
statistics which apparently show up in paranormal research involving card
guessing and dice influencing. Whether or not there was something to Brown’s
ideas I couldn’t tell at the time I first read Koestler in the early 1970s.
But for me the “near miss” effectiveness of probability calculus was a nagging
problem that remained with me for some time. At university I had taken it for
granted that there was no mystery in probability theory, but a social science
student whose name I have long since forgotten challenged me with Koestler’s
comments. These comments made me realize that although I could use
probability calculus I didn’t really understand why it worked. However, by the
time I finished the work in this book, although inevitably there were still
questions outstanding, I felt fairly confident that the kind of thing Spencer
Brown was looking for did not in fact exist if true randomness was in operation.
In the epilogue I briefly take up this matter along with the question raised by
Sherlock Holmes.
During the mid-seventies I had dabbled a little
with quantum theory with the aim of trying to make anthropomorphic sense of it,
but after a while I shelved the problem for later. (The sense I did eventually
make of it is an ongoing story I tell elsewhere). However, in the 70s my dabblings
with quantum theory gave way to what was in fact, for reasons I have already
given, a philosophically more pressing problem, viz. the nature of randomness
and probability. In any case the probability question arose very naturally
because probability is at the heart of quantum theory; if I didn’t understand
the meaning of the language of probability in which quantum mechanics was
formulated, how could I understand the nature of quantum theory? The gradual
shift in my interests from quantum theory to probability started to produce results
immediately. I pursued the idea of taking disorder
maxima of binary sequences and rediscovered the statistics of Bernoulli’s
formula by way of a very long-winded maximization proof. But as I continued
with this line of inquiry it became increasingly apparent that probability and randomness
are not the same thing: Probability is to do with our information or knowledge
about systems and, in fact, arises in contexts where randomness is not
present (I have presented more on the subject of probability in my 1988 paper on probability). Randomness, on the other hand, is to do
with patterns, specifically, as we shall see, patterns about which practical algorithmic methods of
prediction reveal only maximum disorder.
This means that a pattern can be disordered and yet not probabilistic; for
example, if we have a book of random numbers generated, say, by some quantum
process, the pattern is going to be disordered or random, but once the book
has captured the pattern, so that it can be thought of as part of human
knowledge, the pattern is no longer probabilistic.
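To give a computational flavour of the binary-sequence work mentioned above (a quick sketch, far shorter than my original long-winded maximization proof, with the sequence length n = 20 chosen purely for illustration): among the 2^n binary sequences of length n, the number containing exactly k ones is the binomial coefficient C(n, k), so for a fair coin Bernoulli's formula assigns each k the probability C(n, k)/2^n. Both the count and the probability peak at k = n/2, the configuration of maximum disorder:

from math import comb

n = 20  # length of the binary sequences (illustrative choice)

# For each possible number of ones k, count the sequences with exactly k ones
# and compute the corresponding Bernoulli (binomial) probability for a fair coin.
for k in range(n + 1):
    count = comb(n, k)     # C(n, k): sequences of length n with exactly k ones
    prob = count / 2 ** n  # Bernoulli's formula with p = 1/2
    bar = "#" * round(100 * prob)
    print(f"k={k:2d}  C({n},{k})={count:6d}  P={prob:.4f}  {bar}")

The printed histogram peaks at k = n/2 = 10: the overwhelming majority of sequences have their ones and zeros in roughly equal measure, which is why maximum disorder and the Bernoulli statistics go hand in hand.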
As I propose in the epilogue of this book, the
reason for the close connection between probability and disordered patterns
such as those generated by quantum processes is that disorder is epistemically
intractable until it becomes a recorded, already-happened event; for, as this
work attempts to demonstrate, randomness is not practically
predictable by algorithmic means; therefore, until the random pattern is
generated and recorded, it remains unknown to those who have available only
practical methods of making predictions. A random pattern is thereby
probabilistic by virtue of its being intractable to practical algorithmic
precognition.
The bulk of the mathematics in this book was
written in the late seventies and eighties. I eventually produced a typescript
document in 1987 and the contents of this typescript provide the core of the
book. Like most of my work, all of which is the product of a private passion,
this manuscript will be of no interest to the academic community, who have
their own very professional ways of answering similar questions; the book presents the record of a science hobbyist. As such it is only of
autobiographical interest.