John Clark wrote:
> Eliezer S. Yudkowsky <firstname.lastname@example.org> Wrote:
> >it's probably necessary on a low level for some computations, and
> >on a high level for qualia, but not intrinsically necessary to intelligence
> That does not compute. We know qualia exist, or rather, I know that
> at least one example does; it must have come about somehow.
> If the weird mysterious ineffable stuff were not necessary for
> intelligence, then Evolution would have had no way of producing it.
I tend to assume that qualia started out as a spandrel (the way bug-catchers became wings), then got tied into reflectivity, or into the "system bus" that ties the senses together. The mysterious ineffable stuff was probably just a computational speedup - Penrose's quantum computing, for example.
> We may value our internal states, but to Evolution they're a byproduct;
> behavior is the only thing that's important. Even if we were insanely
> lucky and got this weird mysterious ineffable stuff by pure
> chance, genetic drift would soon make it vanish - along with anything
> else that's irrelevant to survival, just like the eyes of fish that have
> lived in dark caves for thousands of generations.
I agree. Evidently, both the ineffable computational shortcut and qualia-based sensory integration (or reflectivity, or whatever) either promote fitness, or are hard-to-get-rid-of byproducts of an adaptation that does.
> Your only way out is to argue that it just provides a shortcut to
> intelligence that could be produced in other, more complicated ways,
Less complicated, but also less efficient.
> but that would mean it's easier to make an AI that experiences qualia than
> one that doesn't.
No; it would mean that, all else being equal, an ineffable AI is smarter or more efficient than a purely computational one. It doesn't mean you can't get equally good or better results with more computing power or better programming.
> I don't see any need for weird mysterious ineffable stuff.
Need for, no; benefit from, yes.
--
email@example.com          Eliezer S. Yudkowsky
http://pobox.com/~sentience/tmol-faq/meaningoflife.html
Running on BeOS           Typing in Dvorak          Programming with Patterns
Voting for Libertarians   Heading for Singularity   There Is A Better Way