On 3/13/01 8:39 PM, "Brent Allsop" <email@example.com> wrote:
> If a qualia theory is true, "zombie" or pure abstract
> simulations would be possible. In a "zombie" simulation everything is
> represented by some cause and effect mechanics, there would just be no
> actual phenomenal consciousness and hence no suffering or joy. Just
> as there is no joy of the experience of color in the representation of
> color inside a paint detecting machine.
A paint detecting machine is not intelligent, and therefore is not a valid
example. How would you know if an intelligent machine with a paint detector
could experience phenomenal consciousness?
One bias that seems to pervade your thought is treating human "phenomenal
consciousness" as the only valid instantiation, which at the very least
would appear to be a rather gross assumption. Intelligent machines by their
nature would use very different internal representations of reality than
people. Who is to say that a machine doesn't have the same phenomenal
experience from a sequence of bits in its detectors, something that you, as a
human, find to be very mechanical? Even today, if you look at sophisticated
processors you will find that rather innocuous and simple data streams can
produce a multitude of very distinctive and complicated states within the
processor. Some of these states can momentarily change the qualitative
nature of the processing and the context of incoming data. How is this
substantially different, other than being a primitive yet extraordinarily
complicated bit of finite state machinery?
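The point about internal states changing the context of incoming data can be sketched as a toy finite state machine (a hypothetical illustration only, not a model of any real processor): the same input symbols are handled qualitatively differently depending on the mode the machine is in at that moment.

```python
# Toy finite state machine: the same input symbol is processed
# differently depending on the machine's current internal state.
# Purely illustrative -- the states and symbols here are invented.

def run(stream):
    state = "normal"
    out = []
    for sym in stream:
        if state == "normal":
            if sym == "ESC":
                # A mode shift: the context of all later input changes.
                state = "escaped"
            else:
                out.append(sym.lower())
        elif state == "escaped":
            # Same kind of symbol, qualitatively different handling.
            out.append(sym.upper())
            state = "normal"
    return out

print(run(["a", "b", "ESC", "c", "d"]))  # ['a', 'b', 'C', 'd']
```

A simple data stream ("a b ESC c d") thus produces distinct internal states that momentarily alter how subsequent input is interpreted, which is the sense in which even primitive machinery already exhibits state-dependent processing.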
This archive was generated by hypermail 2b30 : Mon May 28 2001 - 09:59:40 MDT