Re: Does the state-vector collapse?

From: Amara Graps (amara@amara.com)
Date: Wed Oct 04 2000 - 16:03:57 MDT


(Bringing this topic momentarily in from the back-burner.)

From: David Blenkinsop <blenl@sk.sympatico.ca>, Sat, 23 Sep 2000

>Amara Graps wrote:
>>
>> . . . If one adopts
>> a Bayesian approach to probability, then the Schroedinger wave
>> equation simply becomes a posterior probability describing
>> our incomplete information about the quantum system,
>> rather than wave functions that collapse in reality upon our
>> observation. It could clear up a lot of confusion.
>
>Just a minute, hold on there! As part of your discussion, you yourself
>mention the "amplitude squared" property of quantum waves, where the
>wave just *has* to be considered as having a steadily "rotating" complex
>vector amplitude. Now, it's my understanding that to get a normalized
>probability distribution for the location of the associated electron, or
>whatever, this really *does* involve squaring this complex amplitude, or
>it involves squaring the sum total of any such vectors as may be
>interfering in a given region of space. This is a situation quite unlike
>anything in regular probability theory, something quite different added
>in there, right?
>
>Say you've got a single photon waveform, and say this wave is travelling
>through two slits in a single-photon, double-slit experiment. The
>probability of the photon hitting a screen is then a kind of
>interference pattern of high and low probabilities as you move across
>the screen. The waveform itself must then actually be a real, extended
>structure of *some* kind, it seems to me, since it directly governs the
>probability of finding the photon in the various bands of the
>interference pattern.

In this case, the Bayesian approach says that the photon is
described as being in both states at once, by the superposition
principle, and that we lack further variables to describe the
scenario more completely. If we instead interpret the amplitudes in
terms of the energies we are measuring, then the amplitude squared
is the probability that, if we measure the energy, we shall find the
value corresponding to the nth state.
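
(To make that concrete, here is a little numerical sketch of my own
in Python; the numbers are invented and this is not from Caticha's
papers. The complex amplitudes get squared in modulus, and those
squared moduli are the probabilities of finding each energy value:)

  # Toy two-state system: |psi> = c1*|E1> + c2*|E2>, complex amplitudes.
  import cmath

  c1 = cmath.rect(0.6, 0.0)          # amplitude of state 1: modulus 0.6, phase 0
  c2 = cmath.rect(0.8, cmath.pi/3)   # amplitude of state 2: modulus 0.8, phase 60 deg

  p1 = abs(c1)**2                    # probability of measuring energy E1
  p2 = abs(c2)**2                    # probability of measuring energy E2
  print(p1, p2, p1 + p2)             # ~0.36, ~0.64, and they sum to 1 (normalized)

Note that the phases of c1 and c2 drop out of these single-state
probabilities; they only matter when amplitudes for indistinguishable
alternatives are added before squaring, which is what happens in your
two-slit example.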

However, I see your confusion. I didn't describe at any length where
Bayes' Theorem applies in Caticha's work, and indeed, I don't see
Bayes' theorem in his articles. I apologize for that. I know that
Caticha comes from a Bayes perspective because I met him and we
spoke at a conference on Bayes applications (MaxEnt'98), and I know
that his colleague at the conference, C. Rodriguez (see below), also
researches quantum probabilities from a Bayes perspective.

The Bayes connection in Caticha's work comes in through his
references to R.T. Cox and E.T. Jaynes, so it would have been more
accurate for me to say that Caticha subscribes to the "Bayesian
probability view": that probability is a measure of the plausibility
of an event given incomplete knowledge (which is different from the
frequentist or "sampling theory" view, in which probability is the
long-run frequency of occurrence of an event in a sequence of
repeated, sometimes hypothetical, experiments).
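
(As an aside, here is a minimal numerical sketch of what "updating a
plausibility" means in the Bayesian view, with made-up numbers of my
own; this is just Bayes' theorem, not anything specific to Caticha
or Jaynes:)

  # Bayes' theorem: P(H|D) = P(D|H) * P(H) / P(D)
  # Two mutually exclusive hypotheses about an apparatus; toy numbers only.
  prior      = {"H1": 0.5, "H2": 0.5}     # plausibilities before the data
  likelihood = {"H1": 0.9, "H2": 0.2}     # P(observed data | hypothesis)

  evidence  = sum(prior[h] * likelihood[h] for h in prior)            # P(D)
  posterior = {h: prior[h] * likelihood[h] / evidence for h in prior}
  print(posterior)   # {'H1': 0.818..., 'H2': 0.181...}, updated plausibilities

The probabilities here measure how plausible each hypothesis is,
given what we know; they are not long-run frequencies of repeated
trials.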

If you read Caticha's papers, you will see that they describe a real
experiment, so it is physical, and all propositions are testable. No
statements about the quantum particle can be made independently of
the experimental context (so you see, "priors" in the Bayesian sense
must be present). He doesn't separate the particle; describing what
it is doing between the source and the detector. He cannot say
whether the particle is a point particle or a wave or both or
neither. He is not saying that it went through either one hole or
through another, or through both holes at the same time. He has very
few assumptions: mostly that the particle is simply capable of being
emitted and detected.

So where do Bayesian probability ideas come in? Now I go to a paper
by E.T. Jaynes, titled "Probability in Quantum Theory", and I pull
some things from that paper in my description below.

Much of physical science is information that is organized in
particular ways. We have a basic question, which is:

"To what extent does this information reside in us, and to what
extent is it a property of Nature? Any theory about reality can have
no consequences testable by us unless it can also describe what
humans can see and know." Jaynes says that the proper tool for
incorporating human information into science is simply probability
theory or "Bayesian inference". Probability theory is an extension
of logic, in order to reason in situations where we have incomplete
information.

However, the amplitude Psi, according to our usual quantum mechanics
texts, is not to be interpreted as an expression of human ignorance
of the true physical state. Jaynes says that it is the way we
interpret the amplitudes that is causing the confusion. The
probabilities that we seek must be in terms of mutually exclusive
possibilities and must be represented in a "deeper hypothesis space"
that contains more parameters, for example phases. He doesn't go
into detail about which other parameters must be included; he mainly
points out that our perspective has to shift to a wider view, and he
says that we cannot hope to get our probability connections right
until we get some basic points of logic right.
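
(Here is a rough numerical way, my own sketch rather than Jaynes's
formalism, to see why phases belong in that deeper hypothesis space:
when two amplitudes for indistinguishable alternatives are added
before squaring, the resulting probability depends on their relative
phase, which a single real-valued probability cannot carry:)

  # Probability from two added amplitudes depends on their relative phase.
  import cmath

  a1, a2 = 0.5, 0.5                           # moduli of the two amplitudes (toy values)
  for rel_phase in (0.0, cmath.pi/2, cmath.pi):
      psi1 = cmath.rect(a1, 0.0)
      psi2 = cmath.rect(a2, rel_phase)
      print(rel_phase, abs(psi1 + psi2)**2)   # 1.0, then 0.5, then ~0.0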

He gives an example of such an illogical situation:

Dispersions, (delta F)^2 = <F^2> - <F>^2, are thought by some
physicists to be quantum fluctuations in F that are *real physical
events that take place constantly, whether or not any measurement is
being made.*

In basic probability theory, (delta F) represents fundamentally the
accuracy with which we are able to predict the value of F. This does
not deny that it may also be the variability seen in repeated
measurements of F; but the point is that they need not be the same.
To say that they "must be the same" is what Jaynes calls the "Mind
Projection Fallacy", which is to suppose that creations of our own
imagination are real properties of Nature, or that our own ignorance
signifies some indecision on the part of Nature. If from our
information, we are able to determine F to only 5 percent accuracy,
that does not mean that the object fluctuates by 5 percent!
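
(A small sketch of my own, with invented numbers, of how (delta F)
is computed in basic probability theory, entirely from the
distribution that encodes our information about F:)

  # (delta F)^2 = <F^2> - <F>^2, computed from our assigned probabilities.
  values = [1.0, 2.0, 3.0]          # possible values of F (toy example)
  probs  = [0.2, 0.5, 0.3]          # probabilities expressing our information

  mean_F  = sum(p * f    for p, f in zip(probs, values))   # <F>
  mean_F2 = sum(p * f**2 for p, f in zip(probs, values))   # <F^2>
  delta_F = (mean_F2 - mean_F**2) ** 0.5
  print(mean_F, delta_F)            # 2.1 and 0.7: how well we can predict F

Nothing in that calculation refers to the object jiggling around; it
only says how sharply our information pins down F.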

Jaynes has some interesting things to say in this paper about other
aspects of quantum mechanics such as the Zero-Point Energy and the
Lamb shift, and I encourage you to read his works. Much of it is
available on the Web, mostly here:

http://bayes.wustl.edu/etj/node1.html

Now back to Caticha.

Caticha's colleague, C. Rodriguez, a mathematician at SUNY Albany,
has a paper in the same text as Caticha's

Caticha, Ariel, "Probability and Entropy in Quantum Theory", in
_Maximum Entropy and Bayesian Methods_ (conference proceedings from
the MaxEnt'98 conference, Garching, Germany, July 1998), Kluwer
Academic Publ., 1999.

titled:

"Unreal Probabilities: Partial Truth with Clifford Numbers", by C.
Rodriguez (starting at pg. 247).

I will just quote from the beginning of the paper; it's highly
mathematical and tough reading, and what follows is a bit abstract,
but straightforward.

{begin quote}
Probability theory was given a firm mathematical foundation in 1933,
when Kolmogorov introduced his axioms. By defining probability as an
"uninterpreted" special case of a positive measure with total unit
mass, the subject exploded with new results and found innumerable
applications. In 1946, Cox showed that the Kolmogorov axioms for
probability are really theorems that follow from basic desiderata
about the representation of partial truth with real numbers. We owe
to Ed Jaynes the recognition of the importance of Cox's 1946 work. After
Jaynes, it became clear why the calculus of probability is so
successful in the real world. Probability works because its axioms
axiomatize the right thing: partial truth of a logical proposition
given another. Even more, the rules of probability are unique in the
sense that any other set of consistent rules can be brought into the
standard sum and product rules by a change of scale. This is in fact
Cox's main result and it makes futile the enterprise of looking for
alternatives to the calculus of normalized real valued
probabilities. It is only by allowing the partial truth of a
proposition to be encoded by an object other than a real number in
the interval [0,1] that we could find alternatives to the standard
theory of probability.

We seek to find out what happens when standard probability theory
is modified by relaxing the axiom that the probability of an event
must be a real number in the interval [0,1]. We show that, by
allowing the measure of a proposition to take a value in a Clifford
Algebra, we automatically find the methods of standard quantum
theory without ever introducing anything specifically related to
nature itself.

The main motivation for this article has come from realizing that
the derivations in Cox still apply if real numbers are replaced by
complex numbers as the encoders of partial truth. This was first
mentioned by Youssef and checked in more detail by Caticha, who also
showed that non-relativistic quantum theory, as formulated by
Feynman, is the only consistent calculus of probability amplitudes.
By measuring propositions with Clifford numbers we automatically
include the reals, complexes, quaternions, spinors and any
combinations of them as special cases.
{end quote}
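
(To give a feel for what a "calculus of probability amplitudes"
means in practice, here is a miniature version of Feynman's rules in
Python; my own sketch with invented numbers, not Rodriguez's
Clifford-number machinery. Amplitudes multiply along a path, add
across indistinguishable alternatives, and only at the end does a
squared modulus give a probability:)

  # Feynman's rules in miniature; all numbers are invented for illustration.
  import cmath

  src_to_slit1    = cmath.rect(0.7, 0.3)   # amplitude: source to slit 1
  slit1_to_screen = cmath.rect(0.6, 1.1)   # amplitude: slit 1 to a screen point
  src_to_slit2    = cmath.rect(0.7, 0.5)
  slit2_to_screen = cmath.rect(0.6, 2.0)

  path1 = src_to_slit1 * slit1_to_screen   # product rule along each path
  path2 = src_to_slit2 * slit2_to_screen
  amplitude = path1 + path2                # sum rule over the alternatives
  print(abs(amplitude)**2)                 # probability at that screen point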

Hope that helps explain a bit more.

Now (me) back to dust ....

Amara

*******************************************************************
Amara Graps | Max-Planck-Institut fuer Kernphysik
Interplanetary Dust Group | Saupfercheckweg 1
+49-6221-516-543 | 69117 Heidelberg, GERMANY
Amara.Graps@mpi-hd.mpg.de * http://www.mpi-hd.mpg.de/galileo~graps
*******************************************************************
        "Never fight an inanimate object." - P. J. O'Rourke


