Re: Qualia and the Galactic Loony Bin

Eliezer S. Yudkowsky
Wed, 23 Jun 1999 01:41:24 -0500

John Clark wrote:
> All of the really important things in life lack definitions but that doesn't
> seem to have cramped my style much. I've never seen a definition of "qualia"
> or "consciousness" that wasn't vague, circular, or just plain insipid, yet I
> know what the words mean, I just don't know how to communicate that meaning
> by banging on a keyboard or making noises with my mouth. Definitions
> are overrated; people seldom have them or need them.

Yes, I believe we've been here before. I defined "definition" and then you claimed that I hadn't really defined it, because my explanation required a being capable of understanding definitions and was therefore circular - which, when I think about it, means that I would have had to define "definition" to a rock.

Definitions come in two categories. The first is the useless definition somebody deploys to advance an argument. The second is the "definition" that tells you how to build something. When I defined "definition", I was giving a capsule description of how to construct a mind that would accept definitions. We're looking for the definition of consciousness and qualia that will let us construct conscious, qualia-bearing minds. Do you really think that's insipid?

> >then there is no "fact of the matter" as to whether any given
> >computation is instantiated in any given system.
> If a computation (or a thought) does something like print a result or just
> open a switch then it's easy enough to tell when or if it happened. If I have
> no senses or actuators and am permanently isolated from the rest of the
> universe then it's meaningless to ask when a thought occurred, not just for
> an external observer but it's meaningless for me too. When did "green"
> happen?

Would you say that since this Universe (or Reality or whatever) has no input-output characteristics, it doesn't exist? If I toss you through an event horizon, do you stop being conscious as soon as you're cut off from the external world?

I do not understand this emphasis on input-output characteristics as an arbiter of consciousness. Hashtables are not people.
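To make the point concrete, here is a minimal, purely illustrative sketch (the parity function and table are my own invented example, not anything from the thread): two systems with identical input-output behavior but radically different internals. One computes; the other merely retrieves.

```python
def compute_parity(bits: str) -> str:
    """Actually computes the answer: XOR the bits together."""
    result = 0
    for b in bits:
        result ^= int(b)
    return str(result)

# A lookup table precomputed for every 3-bit input: pure retrieval,
# no computation happens at query time.
PARITY_TABLE = {
    format(i, "03b"): compute_parity(format(i, "03b")) for i in range(8)
}

def lookup_parity(bits: str) -> str:
    """Indistinguishable from compute_parity by I/O alone."""
    return PARITY_TABLE[bits]

# Externally identical on the whole domain:
assert all(compute_parity(x) == lookup_parity(x) for x in PARITY_TABLE)
```

If I/O characteristics were the whole story, these two would be the same system; the argument above is that they plainly are not.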

> So is an intelligent lookup table conscious?
> Certainly not; my brain is not conscious either, although both imply
> consciousness. Many (not all) of the problems surrounding this issue can
> be avoided if you think of consciousness as an adjective, not a noun.
> A racing car is not "fast"; it goes fast. It's a demonstration of speed,
> and my brain is a demonstration, the only demonstration, of a particular
> sort of consciousness, the sort that calls itself "John Clark".

Does that mean that you would exist even in the absence of a particular demonstration? This is Platonic consciousness, right? I also agree that Turing computations are Platonic objects. I further say that either all Platonic objects exist, or none do - there's no is_real() function. Do you think that every possible version of yourself already exists and is just as real as you are?

> What about the intelligent lookup table?
> That would demonstrate the consciousness of the person who made the table.

And if I keep generating lookup tables via quantum randomness for 3^^^^3 years, I do believe that I will eventually generate ones which perfectly mock your I/O characteristics. If I set up a random-lookup-table-generator and stick it in a nullentropic loop for some insane number of years, will that generate all possible conscious beings? How about the digits of pi - does that work as well?
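The random-table-generator argument can be sketched in miniature (the two-bit XOR target and the trial counter are hypothetical stand-ins for "your I/O characteristics", and a seeded generator stands in for quantum randomness): keep drawing random lookup tables until one exactly reproduces the target behavior.

```python
import random

# Hypothetical target I/O behavior: XOR of two bits.
TARGET = {"00": "0", "01": "1", "10": "1", "11": "0"}
INPUTS = sorted(TARGET)

def random_table(rng: random.Random) -> dict:
    """Draw one blind-chance lookup table: a random output per input."""
    return {x: rng.choice("01") for x in INPUTS}

rng = random.Random(0)  # seeded for reproducibility
tries = 0
while True:
    tries += 1
    table = random_table(rng)
    if table == TARGET:  # perfectly mocks the target's I/O
        break

# With 4 inputs and 2 possible outputs each, there are 2**4 = 16
# candidate tables, so a match is expected within ~16 random draws.
```

Scaled up to a lifetime of human I/O, the table count is astronomical, which is exactly why the argument reaches for numbers like 3^^^^3; the principle, though, is just this loop.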

> What if a bottle of ink were knocked over and just by blind chance it formed
> the lookup table, would it be conscious then?
> I don't know or care because that is a physically impossible situation. It's
> a situation that has never happened and almost certainly will never be seen
> by any observer in the universe, and that's as good a definition of
> "impossible" as you'll find.

I see. And if I told you that the number of actual Universes in the Reality is so large that it could only be expressed by Knuth notation, that *anything* no matter how improbable has happened at least once due to the sheer size of the Cosmic All... would that suddenly change the basic laws that operate on my finite PowerPC? I don't think you can make your philosophy dependent on the absolute size of the Universe; basic laws shouldn't be dependent on the existence, much less the nonexistence, of items not in causal contact.

> What about Moravec's idea that a brain is determined by the relationship of
> its parts and that involves an arbitrary interpretation so all possible minds
> exist?
> I'm an agnostic on the subject. All I know is that it's a tautology (and thus
> true) that intelligent behavior implies intelligence and it's an axiom that
> intelligence implies consciousness. Non-intelligent behavior, like that
> produced by a rock, may or may not imply consciousness. My hunch is not.

Tautologies and axioms are non-useful definitions. Tell me how to program a conscious computer.

           Eliezer S. Yudkowsky
Running on BeOS           Typing in Dvorak          Programming with Patterns
Voting for Libertarians   Heading for Singularity   There Is A Better Way