Re: Eliezer S. Yudkowsky's whopper

From: Spudboy100@aol.com
Date: Tue Oct 03 2000 - 19:54:24 MDT


In a message dated 10/3/2000 7:33:17 PM Eastern Daylight Time, jr@shasta.com
writes:

<< No (but yes, I do care), my "premise" is that anti-AI is the threat that
extropy will one way or another overpower. Intelligence, whether augmented,
artificially evolved, generated in a vat, or distributed via the Net, shall
displace and supersede egoistic concerns and identity models.

What do you think?
(Don't hold back. Let's hear what you really think. Come on... how do you
think things will turn out?) >>

I think that technological predictions of the last 20 years have erred on
the side of over-optimism, in the same sense that we were supposed to have
colonies in space and moving sidewalks by the year 2000. Following that
conjecture, scientific advancement will continue, but never as quickly as
the futurists/transhumanists/extropians desire. I think that will allow us
time to adjust to new technological change, and allow A.I. to become kinder,
gentler, and more prosaic. We are the A.I., because it will need us to feel
things and so forth; only a human can do that, right now.

This archive was generated by hypermail 2b30 : Mon May 28 2001 - 09:50:15 MDT