Re: Qualia and the Galactic Loony Bin

Eliezer S. Yudkowsky
Wed, 23 Jun 1999 17:35:23 -0500

Darin Sunley wrote:
> The problem with classifying theories of consciousness into "causal" vs.
> non-causal is we don't have much better definitions of causal than we do for
> consciousness. (Eliezer, any idea how to build a computer that can understand
> causal relations?)

Oh, yes. All *kinds* of ideas. That's the whole problem. It's not just that I failed to come up with a definition of instantiation. I succeeded, but it was a *cognitive* definition instead of a mathematical one, complete with things like salience, relevance to a particular problem, probabilistic conclusions, and so on - it was utterly unamenable to any sort of formalization. Furthermore, the cognitive model was such a good fit to our intuitive understanding of Turing computations that I concluded that I had no reason to believe our model of Turing computations derived from anything else. I was looking at the foundations of the Turing formalism and they were cognitive elements, not formal tokens, and could never be formalized. Hence my belief that Turing computations are intrinsically observer-dependent.

I think my foremost example will continue to be a person dropping a cup. If you startled me, I'll say it was because you startled me. If you don't want to take the blame, you'll say it was because I relaxed my hand. If I grew up in zero gee, I'd say it was because of gravity. If I'm expecting a robot to dash by and catch the cup, I'll say it was "because a robot didn't dash by and catch the cup". If I'm a Power, I might just blame it all on the initial configuration at the Big Bang.
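To make that concrete, here's a toy sketch (every name in it is invented for illustration, not a real model of cognition): the event has many jointly necessary conditions, and each observer names as "the cause" whichever condition falls outside that observer's expected background.

```python
# Toy illustration: causal attribution as deviation from an expected background.
# All condition names and observer profiles are hypothetical.

conditions = {"startled", "hand relaxed", "gravity", "no robot catch"}

# Each observer takes some conditions for granted as background.
backgrounds = {
    "me":             {"hand relaxed", "gravity", "no robot catch"},
    "you":            {"startled", "gravity", "no robot catch"},
    "zero-g native":  {"startled", "hand relaxed", "no robot catch"},
    "robot-expecter": {"startled", "hand relaxed", "gravity"},
}

def perceived_cause(observer):
    # "The cause" is whatever necessary condition the observer did NOT expect.
    return conditions - backgrounds[observer]

for observer in backgrounds:
    print(observer, "->", perceived_cause(observer))
```

Same cup, same physics; the selected "cause" varies with the observer's background expectations.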

You run into the same problem every time you try to say: "This chip represents Turing machine X, in that electrical charge A represents a _1_, which was acted on by this transistor to produce an electrical charge B representing a _0_ here." But then you might as well map A to _0_ and B to _1_.
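A toy sketch of the point (the state names and mappings are hypothetical): the physical facts are just state transitions, and which computational step they "represent" depends entirely on an observer-chosen mapping.

```python
# Physics: charge state A becomes charge state B under the transistor.
physical_step = {"A": "B"}

# Two equally consistent interpretation maps for the same physical states.
map1 = {"A": 1, "B": 0}   # reads the step as 1 -> 0
map2 = {"A": 0, "B": 1}   # reads the very same step as 0 -> 1

def interpret(step, mapping):
    # Translate a physical transition into a "computational" one via a mapping.
    return {mapping[src]: mapping[dst] for src, dst in step.items()}

print(interpret(physical_step, map1))  # {1: 0}
print(interpret(physical_step, map2))  # {0: 1}
```

Nothing in the chip itself privileges one mapping over the other; the choice lives in the observer.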

And if you then say: Ah, but this transistor then acts on electrical charge B, representing a _0_, to change electrical charge A to a _0_; whereas if electrical charge B had been _1_, it would have changed B to a _0_ instead of changing A? If, in short, you have the apparently observer-independent, and perhaps even counterfactual-independent, phenomenon of a piece of data determining which previously referenced piece of data will be acted upon? If you define data, not in terms of arbitrary mappings, but in terms of how that data affects the pattern of causal connections?
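The proposal, sketched as a toy (the dynamics and labels below are invented for illustration): derive the bit labels from the counterfactual pattern itself - call B's charge "_0_" when it routes the update to A, and "_1_" when it routes the update to B.

```python
# Toy dynamics: B's charge determines which of the two charges gets changed.
def physics(a, b):
    if b == "low":
        return ("changed", b)   # low charge on B -> A is acted upon
    else:
        return (a, "changed")   # high charge on B -> B itself is acted upon

def label_by_causal_role(b):
    # Assign the bit from the causal pattern, not from an arbitrary mapping:
    # B is "_0_" iff its state routes the update to A.
    acts_on_A = physics("orig", b)[0] == "changed"
    return 0 if acts_on_A else 1

print(label_by_causal_role("low"))   # 0
print(label_by_causal_role("high"))  # 1
```

This looks observer-independent - until you ask which of the blur of physical contacts counts as "the" causal connection, which is the objection below.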

But then you might as well say that the _0_ was produced because an asteroid didn't crash into the computer, having nothing to do with any previously referenced electrical charges. There are no observer-independent "causal connections" that can be used to replace observer-dependent mappings. There is only an indistinguishable blur of causal contacts that extends throughout our entire past light cone.

You see, I really did try to define instantiation. It just can't be done.

           Eliezer S. Yudkowsky
Running on BeOS           Typing in Dvorak          Programming with Patterns
Voting for Libertarians   Heading for Singularity   There Is A Better Way