Re: Protean Self-Transformation

Lee Daniel Crocker (lcrocker@calweb.com)
Mon, 31 Mar 1997 17:30:07 -0800 (PST)


> A computer cannot say something feels good in the way I can say such. It
> can analyze data inputted to it and come to the purely abstract
> conclusion that according to all its data this incoming data should be
> considered a "good" feeling. But it is not feeling the data. It is just
> running subroutines and more subroutines. All the computer has done is
> label the data. It has given the data a name and that is it. The
> computer can then be programmed to blow bells and whistles but the
> computer will in no way ENJOY blowing bells and whistles. It just does
> it. There is no way I could scare my computer, there is no way I could
> anger my computer, there is no way I could embarrass my computer, there
> is no way I can give my computer pleasure or pain, no matter what data I
> send it or in what fashion I handle it. How is this so difficult to
> understand?

In what objectively demonstrable way does what you have just described
not apply word-for-word to a human brain? I may concede that /your/
computer on your desk now might not experience fear or joy, but to
blindly assume that a more powerful computer /cannot/ have those
experiences in /exactly/ the way I do is to postulate that my brain is
something other than the mechanical process that occurs in it; in other
words, it is tantamount to dualism and other mystical bunk.

I know that I feel. I experience it directly. It would be outrageously
arrogant of me to assume that my particular bizarre construction of
proteins was somehow more capable of that than a similarly-powered
organism implemented in semiconductors.

-- 
Lee Daniel Crocker <lee@piclab.com>  <http://www.piclab.com/lcrocker.html>
"All inventions or works of authorship original to me, herein and past,
are placed irrevocably in the public domain, and may be used or modified
for any purpose, without permission, attribution, or notification."--LDC