Re: Eliezer S. Yudkowsky's whopper

From: J. R. Molloy (jr@shasta.com)
Date: Tue Oct 03 2000 - 17:30:11 MDT


Oh, Okay.
So, you want to know...
" ...would you care to state your opinion, once more, on A.I. and whether it's a
threat, or is that not your premise?"

No (but yes, I do care): my "premise" is that anti-AI sentiment is the threat, one
that extropy will one way or another overpower. Intelligence, whether augmented,
artificially evolved, generated in a vat, or distributed via the Net, shall
displace and supersede egoistic concerns and identity models.

What do you think?
(Don't hold back. Let's hear what you really think. Come on... how do you think
things will turn out?)

--J. R.

"In nature there are neither rewards nor punishments; there are consequences."
--Robert Ingersoll

----- Original Message -----
From: <Spudboy100@aol.com>
To: <extropians@extropy.org>
Sent: Tuesday, October 03, 2000 3:28 PM
Subject: Re: Eliezer S. Yudkowsky's whopper

> In a message dated 10/3/2000 3:33:47 PM Eastern Daylight Time, jr@shasta.com
> writes:
>
> << Who are you asking?
>
> --J. R. >>
> You
>



This archive was generated by hypermail 2b30 : Mon May 28 2001 - 09:50:15 MDT