RE: Subjective counterfactuals

Billy Brown (bbrown@conemsco.com)
Mon, 5 Apr 1999 13:49:21 -0500

Eliezer S. Yudkowsky wrote:

> Now let's say we run the playback, and, at each step, consult the
> original state transition diagram to find out what the results *would*
> have been, but then discard that result and load in the tape. In other
> words, we compute each step, at each point along the recording, but we
> don't connect them causally to each other; we discard the result and
> load the recorded step, even though the two happen to be identical.
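
In code terms, the procedure is roughly this (a quick sketch in Python; transition() is just a made-up stand-in for the state-transition diagram):

    def transition(state):
        # Made-up deterministic state-transition function.
        return 2 * state + 1

    state = 1
    tape = []
    for _ in range(5):
        state = transition(state)
        tape.append(state)                # the recording

    # Playback: recompute each step, discard it, load the recording instead.
    state = 1
    for recorded in tape:
        computed = transition(state)      # recomputed, but not used causally
        state = recorded                  # happens to equal computed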

If this simulation is based on a human (or even animal) mind, the two states will almost never be the same. Most of what happens in the mind is probabilistic, not deterministic - so if you simulate the same 100ms of processing a million times, you will get a million slightly different end states.
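
To make that concrete, here is a rough sketch (Python again, with an invented step() function standing in for 100ms of processing): push the same starting state through a noisy update a thousand times and essentially every end state comes out distinct.

    import random

    def step(state, rng):
        # Invented stand-in for 100ms of processing: the same input
        # state plus a little noise on every run.
        return tuple(x + rng.gauss(0.0, 0.01) for x in state)

    rng = random.Random()
    initial = (0.5, -1.2, 3.0)            # identical starting state each time
    end_states = {step(initial, rng) for _ in range(1000)}
    print(len(end_states))                # almost certainly 1000 distinct states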

Now, I don't know whether this is a necessary property of a conscious mind or not, but it seems entirely plausible that it might be. If it is, then you need a Turing machine with a random number generator to run a mind as software, and the whole system has very different properties from a purely deterministic program. However, we don't need to get lost in metaphysics to figure out how this kind of system will behave - we just have to accept that we can't predict its future states with any great precision.
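
Put another way (again just a sketch, with an invented noisy_step() update): a probabilistic run is exactly reproducible only if the random draws themselves are recorded and replayed, which is roughly what loading the tape would have to amount to in the probabilistic case.

    import random

    def noisy_step(state, rng):
        # Invented probabilistic update: a deterministic part plus noise.
        return 2.0 * state + rng.gauss(0.0, 0.1)

    def run(seed, steps=5):
        rng = random.Random(seed)         # fixing the seed = recording the draws
        state = 1.0
        trajectory = []
        for _ in range(steps):
            state = noisy_step(state, rng)
            trajectory.append(state)
        return trajectory

    print(run(42) == run(42))             # True: same draws, same trajectory
    print(run(42) == run(43))             # False: different draws diverge at once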

I don't think this observation actually solves the problem at hand, but it does close off a lot of blind alleys.

Billy Brown, MCSE+I
bbrown@conemsco.com