Re: Spacetime/Inflation/Civilizations

From: Rafal Smigrodzki (rafal@smigrodzki.org)
Date: Thu Mar 06 2003 - 20:18:08 MST


    ----- Original Message -----
    From: "Hal Finney" <hal@finney.org>

    >
    > Suppose we are going to flip a biased quantum coin, one which has a 90%
    > chance of coming up heads. We will generate the good or bad experience
    > depending on the outcome of the coin flip. I claim that it is obvious
    > that it is better to give the good experience when we get the 90% outcome
    > and the bad experience when we get the 10% outcome. That's the assumption
    > I will start with.
    >
    > Now consider Tegmark's level 1 of parallelism, the fact that in a
    > sufficiently large volume of space I can find a large number of copies
    > of me, in fact copies of the entire earth and our entire visible universe
    > (the "Hubble bubble"?). When I do my quantum coin flip, 90% of the copies
    > will see it come up heads and cause the good experience for the subject,
    > and 10% will see tails and cause the bad experience.
    >
    > I will also assume that my knowledge of this fact about the physical
    > universe will not change my mind about the ethical value of my decision
    > to give the good experience for the 90% outcome.
    >
    > Now the problem is this. There are really only two different programs
    > being run for our experimental subject, the guy in the simulation. One is
    > a good experience and one is bad. All my decision does is to change how
    > many copies of each of these two programs are run. In making my decision
    > about which experiences to assign to the two coin flip outcomes, I have
    > chosen that the copies of the good experience will outnumber copies of
    > the bad experience by 9 to 1.

    ### One way of avoiding the conundrum is to use Rawls' veil of ignorance:
    put yourself in the position of the experimental subject, and consider one
    additional option - not flipping the coin at all. You can decide to run the
    good experience only, giving the test subject a certainty of the good
    experience (limited only by your lack of knowledge about various
    interfering factors, computer glitches, etc.). The test subject, if asked
    about the correct way to run the experiment, will insist on skipping the
    coin flip. From this it follows that you should not use the random coin
    flip as an additional source of uncertainty, because in this way you reduce
    the expected utility of your actions, both as measured by local Bayesian
    inference and as measured over the infinite number of your level I copies.
    The same reasoning applies to level II.
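
    (A toy calculation, with made-up utility numbers, just to make the
    expected-utility point concrete; any assignment that values the good
    experience above the bad one gives the same ordering:)

        # Policy A: always run the good experience.
        # Policy B: flip the 90/10 quantum coin - good on heads, bad on tails.
        u_good = 1.0    # utility of the good experience (assumed)
        u_bad = -1.0    # utility of the bad experience (assumed)

        eu_no_flip = u_good                       # certainty of the good run
        eu_flip = 0.9 * u_good + 0.1 * u_bad      # the coin-flip lottery

        print(eu_no_flip, eu_flip)  # 1.0 vs 0.8 - the flip only loses utility
        # The same 9:1 split shows up as frequencies over your level I copies,
        # so the per-copy average utility equals the local expectation.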

    In level III there is the additional wrinkle of your actions causally
    influencing the quantum evolution of the whole experimental system.
    Basically, when making the decision you decohere into three versions, or
    sheaves of histories - the no-coin-flip, the choose-90% and the choose-10%
    histories (the last sheaf of histories is equivalent to you turning into a
    sadist, I think). Your subjective experience of making the decision is the
    equivalent of a quantum process of decoherence, whose outcomes will have
    different measures depending on the properties of the system (your brain).
    If your brain is influenced by my words, on the classical level, this will
    be accompanied by an increase in the measure of branches with the
    no-coin-flip experiment in the MWI, or level III, analysis. In this case
    the conclusion is still the same as in levels I and II: you should skip the
    flip and choose good, in accordance with Tegmark's statement that level III
    does not increase the number of distinct entities (or types of particle
    configurations) compared to levels I and II. There will also be an infinite
    number of level I copies of you decohering into the three sheaves, but
    without a direct causal relationship to each other.
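
    (Another toy sketch, under the assumption that branch measures can be
    treated like probabilities for this purpose: whatever measures your
    decision process assigns to the three sheaves, the measure-weighted
    utility is largest when all the measure goes to the no-coin-flip sheaf -
    the same arithmetic as in the level I and II cases.)

        # Measure-weighted utility over the three decision sheaves.
        # The m_* arguments are hypothetical branch measures summing to 1.
        def total_utility(m_no_flip, m_90, m_10, u_good=1.0, u_bad=-1.0):
            return (m_no_flip * u_good
                    + m_90 * (0.9 * u_good + 0.1 * u_bad)
                    + m_10 * (0.1 * u_good + 0.9 * u_bad))

        print(total_utility(1.0, 0.0, 0.0))   # 1.0  - all measure on no-flip
        print(total_utility(0.0, 1.0, 0.0))   # 0.8  - all measure on choose-90%
        print(total_utility(0.0, 0.0, 1.0))   # -0.8 - the "sadist" sheaf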

    My intuition is to give the sentiences you are responsible for the best
    experience you can give them, if you decide to produce them, and the
    decision to run them should also be acceptable to the sentiences
    themselves - if you know that they will not be willing to live and be
    happy even with the best you can offer, you should not make them at all.
    As an aside to Lee, you are also not duty-bound to make them in the first
    place.

    Rafal


