Wei Dai, <weidai@eskimo.com>, writes:
> On Tue, Jun 15, 1999 at 06:01:46PM -0500, Eliezer S. Yudkowsky wrote:
> > Of course not. If we can't comprehend the First Cause or qualia
> > ourselves, because our reasoning processes deal only with
> > Turing-computable ontologies, I would hardly expect us to be able to
> > explain qualia to a skeptical Turing-computable being! Remember, qualia
> > are enormously improbable; the only reason we're allowed to have them is
> > because of the enormous number of races combined with the Anthropic
> > Principle. To any skeptic, the probability that our race is
> > congenitally brain-damaged would exceed the probability of our having
> > actual qualia. You'd have to open up the neurons and demonstrate that
> > we're messing with Weird Physics, after which the whole qualia business
> > would be more plausible.
>
> Can you go through your reasons for believing that qualia are enormously
> improbable? I understand it has something to do with the seeming
> impossibility of finding a definition for "instantiation" of a
> computation, but what is the connection exactly?
I believe Eliezer's logic is that if it is impossible to define whether a computation is instantiated, then there is no "fact of the matter" as to whether any given computation is instantiated in any given system. But he takes as given, based on his personal experience, that it is a definite fact that his qualia exist. It follows that qualia cannot result merely from computation.
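Spelling the inference out a bit more formally (this is my own reconstruction and notation, not anything Eliezer has written): let $C$ stand for "qualia are purely a matter of computation", $I$ for "there is a fact of the matter about whether a given computation is instantiated in a given system", and $Q$ for "there is a fact of the matter that my qualia exist". The argument is then

(1) $C \rightarrow (Q \rightarrow I)$ -- if qualia are just computation, a definite fact about my qualia requires a definite fact about instantiation;
(2) $\neg I$ -- "instantiation" cannot be defined;
(3) $Q$ -- taken as given from direct experience;

and therefore $\neg C$. A skeptic is of course free to reject premise (3) as applied to us, which is essentially the move the skeptical Turing-computable being makes in the passage quoted above.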
Now I am speculating a bit, but I believe Eliezer's position is that qualia are not a necessary part of the mental structure of a successful organism. It should be entirely possible for intelligent beings to evolve and succeed without qualia, because all that is really needed is the ability to model the environment, extrapolate possible events, come up with plans, and so on, and these all seem to be computational tasks (hence not requiring qualia).
If this is the case, there is no reason for qualia to evolve. Given that we (or at least Eliezer!) have qualia, they must be what Gould calls a "spandrel", a sort of evolutionary by-product with no functional or adaptive significance. Since qualia are not selected for, they probably do not normally appear as a result of the evolutionary process, and so most intelligent races would not have them. As Eliezer says above, the only reason we have qualia is that the universe is so large that a few intelligent races have ended up with them by happenstance; he then invokes the anthropic principle to say that we must be among that minority, since we find ourselves to be conscious.
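To put the anthropic step in rough probabilistic terms (the notation is mine, and the symbols are placeholders rather than anything Eliezer has quantified): let $q$ be the probability that a randomly chosen intelligent race ends up with qualia as a by-product, with $q \ll 1$ since qualia are not selected for. If the universe contains $N$ intelligent races, the expected number with qualia is $Nq$, which can still be substantial if $N$ is enormous. The anthropic principle then removes any surprise at finding ourselves in that minority: the observation "we have qualia" can only be made from inside a race that has them, so $P(\text{our race has qualia} \mid \text{we observe our own qualia}) = 1$ even though the unconditional probability is only $q$. Eliezer's point about the skeptic is that an outside observer does not get to condition on our observation; it only sees our report, and weighs $q$ against the (larger) probability that an ordinary computational race is simply mistaken about having qualia.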
> Also, do you think it is possible to program a Turing-computable being to
> behave as if it has qualia even though it doesn't really? If so how can we
> tell we are not such beings? I'm thinking "I have qualia" but that is what
> such a being would be thinking as well. Conversely, how do we know it's
> not the case that non-sentient animals also have qualia, but they just
> can't communicate that fact to us?
Based on Eliezer's model, it would seem impossible to faithfully replicate all of the characteristics of qualia in a computational being. Although such a being might approximate or mimic the behavior of one that actually had qualia, there would inevitably be some subtle differences which could in principle be detected (although it might take a super-Turing being to tell the difference?).
> When you talk about non-Turing-computable "Weird Physics" are you thinking
> of physics that would be computable under a more powerful model of
> computation (for example TM with various oracles), or something entirely
> different?
I think to answer this you have to go back to his basic argument regarding the vagueness of instantiation. If you accept that argument, then augmenting the TM with various oracles would still seem to leave the problem of indeterminacy about when such an augmented TM is instantiated. Now, if the qualia are somehow in the oracle itself, so that it doesn't matter what kind of TM is attached to it, then this would be strong enough to allow for qualia. But if the existence of qualia depends on a particular program being run on the TM, in addition to any functionality in the oracle, then the difficulties with instantiating that TM crop up again, so this would not be strong enough.
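Schematically (again, my own notation, not Eliezer's): write $O$ for the oracle, $M$ for an oracle machine built around it, and $P$ for a program run on $M$. The two cases are:

Case A: qualia attach to the oracle alone, i.e. there are qualia iff some condition $\phi(O)$ holds. Nothing here refers to whether any particular $P$ is instantiated on $M$, so the indeterminacy of instantiation never enters, and there can be a definite fact about the qualia.

Case B: qualia require the oracle plus a particular program, i.e. there are qualia iff $\phi(O) \wedge \mathrm{Inst}(P, M)$. Since $\mathrm{Inst}(P, M)$ has no definite truth value (by the original argument), neither does the conjunction, and we are back to the original problem.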
Hal