RE: Spacetime/Inflation/Civilizations

From: Lee Corbin (lcorbin@tsoft.com)
Date: Fri Mar 14 2003 - 19:25:13 MST

    Wei Dai writes

    > On Tue, Mar 11, 2003 at 09:46:30AM -0800, Lee Corbin wrote:
    > > We replace *value* by *benefit* in our analysis. It's clear
    > > (or, at the rate we are making progress here, it soon will be)
    > > that you experience twice the benefit if you have twice the
    > > number of copies running in a given volume of spacetime.
    >
    > I'm not sure what distinction you're making between value and benefit.

    Values are arbitrary (in a sense), while benefit is objective.
    For any given entity, especially an evolutionarily derived one,
    whether or not that entity benefits from certain conditions is
    more or less objective. Of course, this leaves aside very
    problematic instances, but whether or not a person is *warm*
    also has its problematic cases---yet it can be truly and
    objectively determined in the vast majority of cases whether
    or not an organism is warm enough.

    Now *one* important aspect of benefit is the relation between
    a supposed benefit and the value system of the entity in question.
    In many of the problematical instances, it is exactly a conflict
    on this issue that is important. For example, is free education
    a true benefit? Or is it of any benefit? Opinions among teens
    will differ, and their own disparate value systems may ultimately
    decide. One person might conclude that a good education has very
    little to do with what he wants out of life; another might
    value it, but have so short a time horizon (i.e., discount
    the future so heavily) that it does not seem worthwhile. In
    addition, some will be, objectively speaking, making a mistake,
    as may be appreciated a few years hence when certain realities
    or data come to their attention.

    > And just how will it become clear that I experience twice the benefit if I
    > have twice the number of copies running in a given volume of spacetime?
    > Will I, after running multiple copies of myself, suddenly realize the
    > truth of this statement as an epiphany?

    ;-) No, of course not. I will use my Mexican boxcar example.
    In the summer of 1988 forty illegal Mexican immigrants were
    locked in a railroad boxcar by their coyote, and the car sat
    for several days under the hot Texas sun. By the time the
    door was opened, quite a few were dead, and the rest horribly
    dehydrated. Now
    consult your own value system on an imaginary trip to the
    vicinity where all this suffering is taking place:

    You approach the boxcar and hear the panting, heavy breathing,
    and cries for help. Would you be any less moved if you
    learned that a light year away precisely the same events
    were unfolding? I think that you, like me, would intervene
    and reduce the suffering even if it meant that a light year
    away events would unfold precisely as they did.

    From this we are led to consider our "approval function".
    Your approval of any process in spacetime, it seems to me,
    must be additive. If you do not have such an additive
    function---if identical suffering counts only once, no matter
    how many places it occurs---then there is no reason to
    intervene on behalf of the Mexicans in *this* particular
    boxcar (the one in whose proximity you are): the distant
    duplicate would already "cover" whatever you relieve here.

    One then merely applies an objectively calibrated approval
    function to one's self. I take it that speaking purely in
    terms of physics, a normal healthy Wei Dai approves of Wei
    Dai type processes having a good time, or learning something,
    or in some other way receiving benefit. In terms of physical
    processes, what you are really approving of is a class of
    information flow and calculation. In one enormous class of
    possible processes of Wei Dai type organisms, good things are
    happening to WD, and in another huge class, bad things. If
    your approval function is to be consistent---and that's really
    all we can ask in the final analysis---then from the foregoing
    it seems inescapable to me that you must approve twice as much
    of two Wei Dais receiving benefit as of one.
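
    To make the additivity claim concrete, here is a minimal sketch
    in Python (my own illustration, not anything from our exchange;
    the scores and the process names are invented):

        def total_approval(processes, approve):
            # Additivity: the approval of a collection of processes
            # is the sum over its members, counting duplicates in full.
            return sum(approve(p) for p in processes)

        approve = lambda p: {"wd_benefit": +1, "wd_suffering": -1}[p]

        one_copy   = ["wd_benefit"]
        two_copies = ["wd_benefit", "wd_benefit"]
        assert total_approval(two_copies, approve) == \
               2 * total_approval(one_copy, approve)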

    > We know that if you run the same computation twice in a row, it's usually
    > faster the second time because some of the necessary data will already be
    > in cache. So with the same resources, the number of identical computations
    > you can run can be higher than the number of different computations you
    > can run.

    A very interesting point. The problem, however, is that you are
    cheating a little by retaining intermediate stages and thus not
    re-doing the entire calculation. This is the case of lookup tables
    again. I claim, and have always claimed, that pure lookup tables
    are not conscious and can receive no benefit.
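
    For concreteness, here is a small Python sketch of the effect
    you describe, with memoization standing in for the hardware
    cache (fib is just a stand-in computation of my own choosing):

        from functools import lru_cache

        @lru_cache(maxsize=None)      # retained intermediate results
        def fib(n):
            return n if n < 2 else fib(n - 1) + fib(n - 2)

        fib(300)   # first call: the calculation is actually performed
        fib(300)   # second call: a pure table lookup, no steps re-done

    On the view above, only the first call re-does any real work;
    the second is the lookup-table case over again.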

    What one can have, of course, is a mixture, such as I think you
    are suggesting here. Suppose that a TM has only every other
    step calculated, with the other half of the steps merely looked
    up. Would one experience only half the pain, or half the
    pleasure of life? I don't know about *quantifying* it in
    exactly that way, but yes, it has to be something like half.
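
    A toy version of such a mixture, in Python; the transition
    function and the precomputed trace are invented solely for
    illustration:

        def step(state):              # stand-in transition function
            return state + 1

        # Precompute a full trace of states by honest computation.
        trace, s = [], 0
        for _ in range(10):
            s = step(s)
            trace.append(s)

        # The mixture: even-numbered steps genuinely computed,
        # odd-numbered steps merely looked up from the stored trace.
        def run_mixed(state, n_steps, trace):
            for i in range(n_steps):
                state = step(state) if i % 2 == 0 else trace[i]
            return state

        assert run_mixed(0, 10, trace) == trace[-1]   # same final state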

    > With that in mind, consider the following thought experiment. Suppose you
    > run a charity, which provides memory and computational power to those who
    > can no longer afford enough to sustain their minds. Let's say you currently
    > serve 4 people, all of whom are living in separate closed virtual worlds,

    and by that I take it you mean that these are causally
    disjoint calculations

    > and because you respect their privacy, you have no idea what's going on
    > inside these worlds. Three of them are identical and deterministic (so
    > they'll always be identical). Now suppose due to a budget crunch, you can
    > only support either 3 identical worlds, or 2 different worlds. Would you
    > always choose to terminate the one world that's different?

    Yes, provided that the lives of the inmates are worth living,
    i.e., that they are receiving benefit and not being harmed.
    And I will assume that, given your stipulation that I know
    nothing about them.

    (Conversely, suppose their lives were not worth living, perhaps,
    we conclude, because they are in terrible pain even though they
    are so delusional that they don't even realize it. In this case,
    the two different bad experiences would be preferable to the
    three identical bad experiences, ignoring, of course, that we
    should pull the plug on them altogether, if that were a choice.)
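
    The accounting behind both answers, assuming nothing beyond the
    additive counting argued for above (the per-world values are
    made up):

        # Additive counting: identical worlds each count in full.
        def total(worlds, per_world_benefit):
            return len(worlds) * per_world_benefit

        # Lives worth living (+1 each): keep the three identical worlds.
        assert total(["A", "A", "A"], +1.0) > total(["A", "B"], +1.0)

        # Lives of suffering (-1 each): keep the two different worlds.
        assert total(["A", "B"], -1.0) > total(["A", "A", "A"], -1.0)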

    (Since that paragraph contains certain incendiary elements, let me
    say that he who foots the bill has, IMO, final say in such matters.)

    > Or if you were to receive a budget increase, would you always
    > just make more copies of the person you already have, rather
    > than consider new applications?

    If it's cheaper, then by all means. I have never understood
    why the elegance of the view endorsing repeated experience
    doesn't appeal to people. (My own private theory is that they
    have not completely rooted out some feeling or notion of Cosmic
    Purpose attached to their own existences.)

    Lee


