Pages

Thursday, September 27, 2007

The Virtual World

Why don’t I accept the multiverse theory, I have to ask myself. One reason is that I interpret quantum envelopes to be what they appear to be: namely, objects closely analogous to the Gaussian envelopes of a random walk. Gaussian random-walk envelopes are not naturally interpreted as the product of an ensemble of bifurcating and multiplying particles (although this is, admittedly, a possible interpretation) but rather as a measure of information about a single particle. ‘Collapses’ of the Gaussian envelope are brought about by changes in information on the whereabouts of the particle. I see parallels between this notion of ‘collapse’ and quantum wave collapse. However, I don’t accept the Copenhagen view that sudden jumps in the “state vector” are conditioned by the presence of a conscious observer. My guess is that the presence of matter, whether in the form of a human observer or some other material configuration (such as a measuring device), is capable of bringing about these discontinuous jumps. (Note to self: the probability of a switch from state |w⟩ to state |v⟩ is given by |⟨w|v⟩|², and this expression looks suspiciously like an ‘intersection’ probability.)
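The random-walk analogy can be made concrete with a toy calculation of my own devising (a sketch only, not anything from the physics literature): a diffusing particle’s Gaussian envelope spreads with time, and a noisy position reading ‘collapses’ it by ordinary Bayesian conditioning, using the standard product-of-Gaussians update formulas. The function names and numbers here are illustrative assumptions.

```python
import math

def diffuse(mu, var, steps, step_var=1.0):
    """Random-walk diffusion: the Gaussian envelope spreads; the mean is unchanged."""
    return mu, var + steps * step_var

def measure(mu, var, observation, noise_var=0.25):
    """Bayesian update on a noisy position reading: the envelope 'collapses'.

    Standard product-of-Gaussians formulas: the posterior variance shrinks
    discontinuously the instant the information arrives."""
    k = var / (var + noise_var)           # gain: how much the reading is trusted
    new_mu = mu + k * (observation - mu)
    new_var = (1.0 - k) * var
    return new_mu, new_var

mu, var = 0.0, 1.0
mu, var = diffuse(mu, var, steps=100)          # envelope widens: var = 101
mu, var = measure(mu, var, observation=3.0)    # locating the particle shrinks it
print(mu, var)                                 # mean near 3.0, variance below 0.25
```

The point of the sketch is that nothing physical moves when `measure` runs: the envelope is bookkeeping about a single particle, and its discontinuous contraction is a change of information, not a signal.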

The foregoing also explains why I’m loath to accept the decoherence theory of measurement. This is another theory which dispenses with literal collapses: it suggests that they are only an apparent phenomenon, inasmuch as they are an artifact of the complex wave interactions of macroscopic matter. Once again this seems to me to ignore the big hint provided by the parallels with random walk. Those parallels lead me to view wave-function ‘collapse’ as closely analogous to the change of information which takes place when one locates an object; the envelopes of a random walk can change discontinuously in a way that is not subject to the physical strictures on the speed of light, and likewise for quantum envelopes. My guess is that the ontology of the universe is not one of literal particles, but is rather an informational facade about virtual particles; those virtual particles can’t exceed the speed of light, but changes in the informational envelopes in which the virtual particles are embedded are not subject to limitations on the speed of light.

Monday, September 24, 2007

Well Versed

I have been delving into David Deutsch’s work on parallel universes. Go here to get some insight into Deutsch’s recent papers on the subject. I’m not familiar enough with the formalism of quantum computation to be able to follow Deutsch’s papers without a lot more study. However, some salient points arise. In this paper, dated April 2001 and entitled “The Structure of the Multiverse”, Deutsch says:
“… the Hilbert space structure of quantum states provides an infinity of ways of slicing up the multiverse into ‘universes’ each way corresponding to a choice of basis. This is reminiscent of the infinity of ways in which one can slice (‘foliate’) a spacetime into spacelike hypersurfaces in the general theory of relativity. Given such a foliation, the theory partitions physical quantities into those ‘within’ each of the hypersurfaces and those that relate hypersurfaces to each other. In this paper I shall sketch a somewhat analogous theory for a model of the multiverse.”
That is, as far as I understand it, Deutsch is following the procedure I mentioned in my last blog entry: he envisages the relationships among the multiple universes in much the same way as we envisage the relationships of past, present and future.

On the subject of non-locality, Deutsch, in this paper on quantum entanglement, states:
“All information in quantum systems is, notwithstanding Bell’s theorem, localized. Measuring or otherwise interacting with a quantum system S has no effect on distant systems from which S is dynamically isolated, even if they are entangled with S. Using the Heisenberg picture to analyse quantum information processing makes this locality explicit, and reveals that under some circumstances (in particular, in Einstein-Podolski-Rosen experiments and in quantum teleportation) quantum information is transmitted through ‘classical’ (i.e. decoherent) channels.”
Deutsch is attacking the non-local interpretation of certain quantum experiments. In this paper, David Wallace defends Deutsch and indicates the controversy surrounding Deutsch’s position and its dependence on the multiverse contention. In the abstract we read:

“It is argued that Deutsch’s proof must be understood in the explicit context of the Everett interpretation, and that in this context it essentially succeeds. Some comments are made about the criticism of Deutsch’s proof by Barnum, Caves, Finkelstein, Fuchs and Schack; it is argued that the flaw they point out in the proof does not apply if the Everett interpretation is assumed.”

And Wallace goes on to say:

“…it is rather surprising how little attention his (Deutsch’s) work has received in the foundational community, though one reason may be that it is very unclear from his paper that the Everett interpretation is assumed from the start. If it is tacitly assumed that his work refers instead to some orthodox collapse theory, then it is easy to see that the proof is suspect… Their attack on Deutsch’s paper seems to have been influential in the community; however, it is at best questionable whether or not it is valid when Everettian assumptions are made explicit.”

The Everett interpretation equates to the multiverse view of quantum mechanics. Deutsch’s interpretation of QM is contentious. It seems that theorists are between a rock and a hard place: on the one hand, non-locality and absolute randomness; on the other, an extravagant ontology of a universe bifurcating everywhere and at all times. It is perhaps NOT surprising that Deutsch’s paper received little attention. Theoretical physics is starting to give theorists that “stick in the gullet” feel, and that’s even without mentioning String Theory!

Saturday, September 22, 2007

Quantum Physics: End of Story?

News has just reached me, via that auspicious source of scientific information, Norwich’s Eastern Daily Press (20 September), of a mathematical breakthrough in quantum physics at Oxford University. Described as “one of the most important developments in the history of science”, the work, by my assessment of the report, uses multiverse theory to derive and/or explain quantum physics.

There are two things that have bugged scientists about quantum physics since it was developed in the first half of the twentieth century. Firstly, its indeterminism: it seemed to introduce an absolute randomness into physics that upset the classical mentality of many physicists, including Einstein: “God doesn’t play dice with the universe”. The second problem, which is in fact related to this indeterminism, is that quantum theory suggests that when these apparently probabilistic events do occur, distant parts of the universe hosting the envelope of probability for these events must instantaneously cooperate by giving up their envelope. This apparent instantaneous communication between distant parts of the cosmos, demanding faster-than-light signalling, also worried Einstein and other physicists.
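The second worry can be illustrated with a toy sketch of my own (not a physical simulation, and the function name is a made-up label): for a singlet-like pair measured along the same axis, each outcome on its own looks like a fair coin, yet the pair is perfectly anti-correlated, so learning one outcome instantly fixes the probability envelope for the distant partner.

```python
import random

def measure_singlet_pair(rng):
    """Same-axis measurements on a singlet-like pair: each outcome alone
    looks like a fair coin, but the two outcomes are perfectly anti-correlated."""
    a = rng.choice([+1, -1])
    return a, -a

rng = random.Random(0)
pairs = [measure_singlet_pair(rng) for _ in range(10_000)]
print(sum(a for a, _ in pairs) / len(pairs))   # marginal near 0: locally random
print(sum(a * b for a, b in pairs) / len(pairs))  # correlation -1.0: one outcome fixes the other
```

Of course, a shared classical variable reproduces this particular correlation too; it is the full set of correlations across different measurement axes (Bell’s theorem) that resists such an account, which is exactly the territory Deutsch’s locality argument addresses.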

Multiverse theory holds out the promise of re-establishing a classical physical regime of local and deterministic physics, although at the cost of positing the rather exotic idea of universes parallel to our own. It achieves this reinstatement, I guess, by a device we are in fact all familiar with. If we select, isolate and examine a particular instant in time in our own world, we effectively cut it off from its past (and future). Cut adrift from the past, much about that instant fails to make sense and throws up two conundrums analogous to the quantum enigmas I mentioned above. Firstly, there will be random patterns, like the distribution of stars, which just seem to be there, when in fact an historical understanding of star movement under gravity gives some insight into that distribution. Secondly, widely separated items will seem inexplicably related, like, for example, two books that have identical content. By adding the time dimension to our arbitrary time slice, the otherwise inexplicable starts to make sense. My guess is that by adding the extra dimensions of the multiverse, a similar explanatory contextualisation has finally (and presumably tentatively) been achieved with the latest multiverse theory.

Not surprisingly, the latest discovery looks as though it has come out of the David Deutsch stable. He has always been a great advocate of the multiverse. By eliminating absolute randomness and non-locality, multiverse theory has the potential to close the system and tie up all the loose ends. Needless to say, all this is likely to proceed against a background of ulterior motivations and may well be hotly contested, not least the contention that Deutsch has made the greatest discovery of all time!
Postscript:
1. The tying of all loose ends is only apparent; all finite human knowledge can only flower out of an irreducible kernel of fact.
2. Multiverse theory, unlike the Copenhagen interpretation of Quantum Mechanics, suggests that quantum envelopes do not collapse at all, but always remain available for interference. Hence it should in principle be possible to detect the difference between these two versions of Quantum Theory experimentally.
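Point 2 can be illustrated numerically with a two-path toy example of my own (the scenario and names are illustrative, not from the report): if the envelope never collapses, the two branches’ complex amplitudes add before squaring, producing interference fringes; if collapse occurs en route, the probabilities of the already-settled alternatives add and the relative phase drops out.

```python
import cmath
import math

def intensity_interfering(phi):
    """Envelope intact: the two equal-weight path amplitudes add, then we square.
    Gives 1 + cos(phi): interference fringes."""
    a1 = 1 / math.sqrt(2)
    a2 = cmath.exp(1j * phi) / math.sqrt(2)
    return abs(a1 + a2) ** 2

def intensity_collapsed(phi):
    """Envelope collapsed after the path choice: probabilities add.
    Gives a constant 1: the phase is irrelevant, no fringes."""
    a1 = 1 / math.sqrt(2)
    a2 = cmath.exp(1j * phi) / math.sqrt(2)
    return abs(a1) ** 2 + abs(a2) ** 2

print(intensity_interfering(0.0))       # 2.0: bright fringe
print(intensity_interfering(math.pi))   # ~0.0: dark fringe
print(intensity_collapsed(math.pi))     # 1.0: flat, no fringe
```

The experimental distinction mentioned in point 2 amounts to asking which of these two predictions a suitably isolated macroscopic superposition would obey.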