Re: Everett

Hal Finney (hal@rain.org)
Thu, 31 Jul 1997 22:40:09 -0700


Nicholas Bostrom, <bostrom@mail.ndirect.co.uk>, writes:
> Hal Finney <hal@rain.org> wrote (a very good account of the standard
> response by Everettians to the problem of how to make sense of the
> probabilities):
> > The idea is that as the state function evolves into a mixture of
> > non-coherent states, a measure function can be applied to the various
> > components of the mixture, based on the probability amplitudes. Everett
> > shows that, in the limit, the probability results observed from a series
> > of measurements will match the squared-amplitude of the relative state,
> > exactly as required by conventional QM; this is true except in a set of
> > branches of total measure zero. So you then only have to assume that
> > branches with amplitude zero never exist, and you derive that observed
> > probabilities will follow the predictions of conventional QM. (This is
> > the basis for DeWitt's claim, since he takes the premise as obvious.)
>
> Yes, this is what they say. The problem is that the measure doesn't
> seem to correctly represent probabilities, if the world is as the
> Everett interpretation says it is. If there is exactly one version of
> me in a world where I measure Spin up, say, and exactly one version
> of me in a world where I measure Spin down (and no other versions of
> me, in this simplified example), then why is it that *this*
> version of me usually ends up being the version of me that obtains
> the outcome that quantum mechanics says is the most probable one?

I'm not sure what you are saying here, in speaking of three different
versions of yourself (the one which measures Spin up, the one which
measures Spin down, and *this* one).

You may be asking something like the following: I am about to make
a quantum measurement, which has two possible outcomes, spin up and
spin down. We have prepared the system so that quantum theory predicts
that the spin up outcome will occur with probability 99%, spin down with
probability 1%. I make the measurement, and as we know, most of the
time it comes out spin up.

Yet, in some versions of the many-worlds interpretation, the world is
said to split into two parts when I make the measurement, so afterwards
there are two instances of me where before there was one. Since there
are two, subjectively there should have been a 50% chance that I would
find myself in one particular one of them. This contradicts the 99%
chance which I actually observe.

I may be missing your point by simplifying the argument this far,
because in this form it does not seem to me to carry much weight. A priori there
is no more reason to expect the splitting phenomenon to give equal
probabilities to finding yourself in one of the two outcomes than to
give 99% or any other probability. The mere fact that one went to two
does not imply that the two outcomes have equal measure in any sense.
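
To make this concrete (my notation, a toy version of the setup): after
the measurement the joint state of system and observer is, schematically,

    |psi> = sqrt(0.99) |up>|sees up> + sqrt(0.01) |down>|sees down>

One branch has become two, but the branches carry squared amplitudes
0.99 and 0.01, not 1/2 and 1/2. Nothing in the formalism singles out
branch counting as the right way to assign subjective probability.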

Actually, in his paper Everett did not consider these kinds of discrete
measurements, which lend themselves to such somewhat misleading (IMO)
counting arguments. He discussed the continuous case, where the measurement
had an infinite number of possible outcomes. There, it is necessary to
use an integral rather than a simple sum to describe the state function.
And it is natural then to introduce a measure function to weight the
various branches of the state function. You then derive the probability
of an observed outcome by integrating the measure function over the range
of all parameters which produce that specified outcome.
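
In symbols (a sketch in modern notation, not Everett's own): if psi(x)
is the amplitude over a continuous parameter x, the probability of an
outcome lying in a range A is

    P(x in A) = \int_A |psi(x)|^2 dx

so |psi(x)|^2 plays the role of a measure density over the branches.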

A simple counting argument over numbers of worlds can't work when every
class contains infinitely many worlds. If you want to apply the counting
argument to the continuous case, what you have to do is in effect to come
up with your own measure function, one which is not based on the amplitude
but which is a constant. It is much less clear in this case that this
measure function is a priori the right one. Rather, the measure function
to be used must be specified explicitly as part of the interpretation.

It is not a strong argument to say that the Everett interpretation must be
wrong because it uses a different measure function than the one which may
strike you as a priori correct. Bryce DeWitt thinks Everett's measure
function is a priori correct! Both views are mistaken, IMO.

> Imagine that at each point of time t we define the set Ct as
> containing all time slices of minds that exist at t. Then, as time
> goes by, the proportion of mind slices in Ct that would see quantum
> mechanical predictions verified experimentally would quickly become
> completely negligible; and yet, miraculously, it constantly turns
> out that I am in Ct! The improbability that this should happen is the
> improbability that the Everett interpretation is true, if there is no
> way to escape this conclusion.

Again, to explain how I interpret this argument, we can consider the
measurement above to be repeated many times. If you assume that one
split occurs with each measurement, then after, say, 20 measurements,
only the branch which measured spin up every time has high probability.
Yet only one of the 2^20 possible branches is this one.

As before, the reply is simply that all the branches do not have equal
measure and equal probability. We must assume that the probability,
the weight, of a branch is measured by the square of the amplitude
associated with that branch.
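
A toy calculation may make the difference vivid (my own sketch, not
anything in Everett's paper; using the 99% spin-up system from above):

# Compare branch *counts* with branch *measures* after N = 20 independent
# spin measurements, each giving "up" with squared amplitude p = 0.99.
from math import comb

N, p = 20, 0.99
for k in (20, 19, 10):                        # number of "up" outcomes
    count = comb(N, k)                        # branches with exactly k ups
    weight = count * p**k * (1 - p)**(N - k)  # their total squared amplitude
    print(k, count, round(weight, 6))

Counting branches, the k = 10 class dominates (184,756 of the 2^20 =
1,048,576 branches); by measure, the single all-up branch alone carries
weight 0.99^20, about 0.82, and the classes near k = 10 are negligible.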

As I wrote in my earlier message, this is an additional postulate which
must be added to the strict Schrodinger equation in order to derive
the observed results. I believe that it is nevertheless much simpler
and more plausible than the projection postulate of conventional QM,
which it replaces.

> > (You might think of this as indicating that the "degree of reality" of
> > a branch is proportional to the square of the amplitude.)
>
> But then I want the follower of Everett to explain to me what a
> degree of reality is, and more particularly why there is a greater
> probability that I should find myself in the world with the higher
> "degree of reality", when there are two real worlds and there is one
> version of me in one of them and one in the other.

Why ask why? This must be postulated. Is it plausible that subjective
probability follows the square of the amplitude? There are arguments
in favor of it. Everett showed that the only worlds which would observe
probabilities consistently violating the predictions of QM have amplitude
approaching zero. I think it is a plausible axiom to suggest that
worlds with amplitude zero are not experienced. But it is an axiom,
and it is not right to ask "why" it happens. You might as well ask why
the Schrodinger equation works.
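
For reference, Everett's result can be paraphrased like this (my wording,
in modern terms): for N repetitions of a measurement whose outcome has
squared amplitude p, the total measure of the branches whose observed
frequency of that outcome differs from p by more than any fixed epsilon
goes to zero as N grows,

    mu{ branches : |freq - p| > epsilon } -> 0  as  N -> infinity

which is just the weak law of large numbers, taken with the squared
amplitude as the measure.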

> If sense cannot be made of "degrees of reality" (I'm not convinced it
> can't, but it's a very deep issue) then, it has been proposed,
> perhaps we should say that there are a greater *number* of worlds
> associated with more probable outcomes. This would solve the problem
> of probabilities, but now look at the ontology you have bought into.
> You have committed yourself to an *uncountable infinity* of worlds
> existing side by side, many of them identical! (Is it even meaningful
> to postulate two distinct worlds that are *exactly* similar?) All
> simplicity and plausibility seem to have been lost at this stage.

I've seen this suggestion, but I don't think it is economical or necessary.
In my opinion the measure/probability postulate solves the problem
adequately, based on the arguments I described above.

> > The bigger problem with the conventional interpretation IMO is that it is
> > inconsistent; the two different processes produce different results, and
> > it is not clear exactly when we should use one or the other.
>
> I'm not sure exactly what you refer to as the "conventional
> interpretation". Personally, I (also?) am unhappy with
> interpretations that appeal to the notion of measurement at a
> fundamental level. But we must not forget the possibility that
> something like the GRW interpretation turns out to be correct, so it
> would be all too hasty to jump to something really weird, especially
> if it doesn't even work.

I am referring to the "Copenhagen" interpretation, which suggests that
there are two kinds of processes which govern the time evolution of
dynamical systems: in isolation, the system follows the Schrodinger
equation, which describes the deterministic evolution of the state
function; and when measured, the system's state changes discontinuously
and randomly to one of the possible measured values, with probability
based on the square of the amplitude of that value's component in the
state function.
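
In symbols (the standard textbook form of the two processes): between
measurements the state evolves continuously by the Schrodinger equation,

    i hbar d(psi)/dt = H psi

while a measurement of a quantity with eigenstates phi_k, where
psi = sum_k c_k phi_k, throws the system discontinuously into one phi_k
with probability |c_k|^2, discarding the other components.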

Hal