Q: Simulation checking

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sat Jun 07 2003 - 00:16:14 MDT


    Harvey Newstrom wrote:
    >
    > You need a professional hacker! They are great for figuring out how to
    > detect, probe, evaluate, and ultimately manipulate remote unseen
    > computing forces by using ordinary communications and interactions in
    > such a way as to get unexpected results. If we are in a simulation, a
    > hacker should be able to figure it out.
    >
    > (Hmmm... unless the hackers capable of doing this are programmed to
    > disbelieve that we are in a simulation so they never try! Nah...!)

    Well, then...

    Let's suppose that we're in a simulation. Any number of simulations,
    perhaps, since you can't distinguish between the pointer states. Some
    simulations go all the way down to the quantum level; others are only as
    detailed as they need to be.

    Supposing that "this" world is a low-resolution sim, what kind of action
    would it take to force the simulator to compute you in greater detail?
    Let's say that I have a small electronic voice recorder. If I speak into
    the voice recorder, download the audio file to my computer, and examine
    its internals bit by bit, then for the simulation of my high-level
    self to be accurate and fairly representative of the real
    probabilities, the simulation must have been accurate enough to
    determine any sensitivities of the voice recorder to minor
    subtleties of tone, background noise, and so on.
    Otherwise I'd see a '0' instead of a '1' when viewing the hex dump, which
    makes a gross difference to my cognitive processing.
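
    To make the bit-by-bit examination concrete, here is a minimal
    sketch - Python, with a hypothetical filename 'recording.wav', and
    assuming 16-bit PCM samples - of pulling out the least-significant
    bit of each audio sample. Those low-order bits are dominated by
    microphone and quantization noise, which is exactly the fine
    physical detail the simulator would otherwise be free to skip:

        # Dump the least-significant bit of each sample in a recording.
        # The low-order bits are dominated by thermal and quantization
        # noise, so they depend on fine physical detail.
        import struct
        import wave

        with wave.open("recording.wav", "rb") as w:  # hypothetical file
            assert w.getsampwidth() == 2             # assume 16-bit PCM
            raw = w.readframes(w.getnframes())

        samples = struct.unpack("<%dh" % (len(raw) // 2), raw)
        lsbs = [s & 1 for s in samples]  # one noisy bit per sample

        # Any one of these bits is the '0' or '1' whose value I will
        # later check in the hex dump.
        print("".join(str(b) for b in lsbs[:64]))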

    Aside from speaking into a voice recorder, does anyone else have any
    suggestions for creating a permanent record of an event which would force
    that event to be simulated in greater detail? In particular, such
    that for the simulator to get a historically accurate probability
    distribution over the gross characteristics of the permanent record
    (its ones and zeroes, which would later be examined), the simulator
    would find it useful to simulate the original event in
    greater-than-usual detail.

    Note that we are talking about making it *useful* for the simulator to
    expend more computing power, to simulate to a greater depth, in order to
    get a *more accurate* picture of the *probability distribution* of the
    ones and zeroes in the permanent record. This means that the probability
    distribution of the permanent record depends on fine physical details in a
    way which can be usefully refined by expending more computing power.
    Assume that the initial conditions are 'spread out' evenly over
    whatever space of possibilities is permitted by the state of the
    existing low-resolution simulation at the moment it begins modeling
    in greater detail. What we want, then, is not just sensitive
    dependency on initial conditions. We want the simulator to be able
    to get a more historically realistic picture of the probability
    distribution - after dropping from the previous low resolution into
    the evenly distributed high-resolution initial conditions - by
    expending more computing power to model how those initial conditions
    develop at a finer level of detail. So what we want is a physical
    process, one which converges to some degree, that creates sensitive
    dependency in the permanent record, such that a noticeably more
    refined picture of the probability distribution of the record can be
    obtained by expending more computing power to simulate that moment
    at a finer level of resolution.
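
    As a toy illustration - Python again, with the logistic map standing
    in for the real physics, so everything here is an assumption made
    for the sake of the sketch - suppose the low-resolution sim pins the
    initial condition down only to an interval, fills that interval
    evenly, and then pays for Monte Carlo samples to refine its estimate
    of the recorded bit's distribution:

        # Toy model: a chaotic process whose recorded bit depends
        # sensitively on initial conditions known only to an interval.
        import random

        def recorded_bit(x0, steps=50):
            """Iterate the logistic map (r = 4, fully chaotic) and
            record one bit: which half of [0, 1] we end up in."""
            x = x0
            for _ in range(steps):
                x = 4.0 * x * (1.0 - x)
            return 1 if x > 0.5 else 0

        def estimate_p1(lo, hi, n_samples):
            """Estimate P(bit = 1) over initial conditions spread
            evenly across the interval the low-res state permits."""
            hits = sum(recorded_bit(random.uniform(lo, hi))
                       for _ in range(n_samples))
            return hits / n_samples

        # More compute buys a sharper picture of the distribution,
        # even though no single run is predictable.
        for n in (10, 100, 10000):
            print(n, estimate_p1(0.30, 0.31, n))

    No single run is predictable, but the estimated distribution of the
    recorded bit converges as the sample count grows - which is the
    sense in which extra computing power buys the simulator a more
    refined picture.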

    Or if that's too complicated: some way of creating a permanent record that
    sensitively depends on small details of the sim - either expensively,
    using off-the-shelf equipment, or (bonus points) using stuff lying around
    the house.

    -- 
    Eliezer S. Yudkowsky                          http://singinst.org/
    Research Fellow, Singularity Institute for Artificial Intelligence
    

