Re: Everett

Hal Finney (hal@rain.org)
Tue, 29 Jul 1997 11:09:28 -0700


John K Clark, <johnkc@well.com>, writes:
> In 1957 Hugh Everett solved the measurement problem but at a very high price,
> too high most thought. He said that when any particle undergoes the smallest
> detectable change (a quantum jump), the entire universe, including you,
> the observer, duplicates itself and splits in two, but we have (almost)
> no way of communicating with the other universe. If true, this would
> mean there is an
> infinite number (some say just an astronomical number) of universes.

I like to describe it in a slightly different way, which may not be exactly
consistent with Everett's argument but makes the essential insight clearer.

QM has two rules: one which describes the evolution of the wave
function over time under ordinary conditions, and the other, the
collapse phenomenon, which occurs only under poorly-defined
measurement conditions. Everett's real point is simply to take
ordinary QM and remove wave function collapse. What you find,
surprisingly, is that when QM phenomena are measured, the measuring
device goes into a superposition of states in which it displays all
the possible measurement outcomes. However, these states are decohered
to the extent that they have essentially no interaction with each
other. Therefore you can think of it as though the measuring device
(and the surrounding macroscopic universe) has now been split into
multiple parallel versions.
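
To make that concrete in modern textbook notation (a standard
illustration, not Everett's own formalism): if the system starts with
amplitudes \alpha and \beta and the device starts in a "ready" state,
pure Schrodinger evolution of the measurement interaction gives

    (\alpha\,|\uparrow\rangle + \beta\,|\downarrow\rangle)\,|M_{\rm ready}\rangle
        \longrightarrow
        \alpha\,|\uparrow\rangle|M_\uparrow\rangle
        + \beta\,|\downarrow\rangle|M_\downarrow\rangle ,

with no collapse applied anywhere. The two terms are the "parallel
versions" just described.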

The point of this interpretation is not simply to say that the universe
splits when measurements occur. It is to say, let's investigate what the
world would be like if wave function collapse didn't occur. And it turns
out, when you work out the details, that you get this effect which is
very much like a universe splitting. And further, given that the various
disconnected parts of the wave function don't interact, what observers
seem to see is wave function collapse.
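
In later decoherence language: assuming the two device states above
are orthogonal (they are macroscopically distinct), tracing the device
out of the post-measurement state leaves the system with the reduced
density matrix

    \rho_S = {\rm Tr}_M\,|\Psi\rangle\langle\Psi|
           = |\alpha|^2\,|\uparrow\rangle\langle\uparrow|
           + |\beta|^2\,|\downarrow\rangle\langle\downarrow| ,

which has no interference terms between the branches and is therefore
operationally indistinguishable from a collapse.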

So by taking out wave function collapse, we get a universe in which wave
function collapse is observed to occur. Hence the two theories make the
same predictions, but Everett's has one fewer postulate. By Occam's razor
it should be preferred. (This is actually a bit of an oversimplification,
since Everett has to make some mathematical assumptions to make his
"apparent" wave function collapse have just the right probabilities,
and so the razor argument is not that strong.)
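
(For the record, the "right probabilities" are the Born weights,

    p(\uparrow) = |\alpha|^2, \qquad p(\downarrow) = |\beta|^2,

and it is getting these, rather than a naive equal weighting of
branches, that requires the extra mathematical assumptions.)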

I like this way of looking at it better than just saying that the universe
splits on quantum events. The latter is really a deduction of the theory,
rather than a premise. And further, it is not a precise statement of
the circumstances described by the theory, which speaks in terms of
wave function components that are decoherent. Sometimes you'll hear
challenges like "Everett's theory is inconsistent with relativity, because
the whole universe changes instantaneously when a measurement occurs."
I think it is more useful to ask, "How does the quantum state function
change when a measurement-like interaction occurs, given the constraints
of relativity?"

Also, as I have described it, this is more than an "interpretation" of
QM. It is actually a different theory, with different (arguably simpler)
premises. It makes basically the same predictions, except that regular
QM has this sort of blind spot surrounding measurement interactions.
However, when measurements become so precise that complicated quantum
interactions are relevant, and the detailed predictions of wave function
evolution would contradict a simplistic wave function collapse picture,
you can be sure that conventional QM will follow the Hamiltonian
evolution process rather than the collapse rule.
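
As a toy illustration of what the Hamiltonian evolution process alone
predicts (my own sketch, not anything from Everett: a qubit stands in
for the system, a second qubit for the measuring device, and a
controlled-NOT for the measurement interaction):

    import numpy as np

    alpha, beta = 0.6, 0.8              # system amplitudes, 0.36 + 0.64 = 1

    # System qubit in superposition; device ("pointer") qubit ready in |0>.
    system  = np.array([alpha, beta])
    pointer = np.array([1.0, 0.0])
    state   = np.kron(system, pointer)  # joint state in the |s p> basis

    # The measurement interaction: a unitary (CNOT) copying the system
    # bit into the pointer.  No collapse rule is applied at any point.
    U = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0],
                  [0, 0, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    state = U @ state                   # -> alpha|00> + beta|11>

    # Reduced density matrix of the system alone (trace out the pointer).
    rho = np.outer(state, state.conj()).reshape(2, 2, 2, 2)
    rho_system = np.trace(rho, axis1=1, axis2=3)
    print(rho_system)                   # [[0.36 0.] [0. 0.64]]

The diagonal entries are the Born weights, and the vanishing
off-diagonals mean the two branches can no longer interfere: apparent
collapse, from unitary evolution alone.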

I don't believe that Deutsch's thought experiment (an artificial
intelligence run on a quantum computer) will truly be seen as
resolving the issue. Rather, everyone will agree on what this computer
will do. Some people may maintain, though, that the experiment does
not prove the existence of true parallel Everett worlds, any more than
quantum computers do; the phenomenon will instead be seen as an
artifact of the special characteristics of the quantum AI.

Hal