----- Original Message -----
From: "J. R. Molloy" <jr@shasta.com>
To: <extropians@extropy.org>
Sent: Wednesday, September 06, 2000 1:23 PM
Subject: Re: Why would AI want to be friendly?
> > Does truly superior intelligence require free will?
> > ::jason.joel.thompson::
>
> The term "free will" is an oxymoron, a self-contradiction.
> You can't have willfulness and freedom from willfulness simultaneously.
I recognize the point you're making, but that's a different discussion.
Let me steer away from a debate over free will by rephrasing the question,
again:
Isn't an essential component of superior intelligence the ability to detect
and route around factors that limit its efficacy?
--::jason.joel.thompson:: ::founder::
www.wildghost.com
This archive was generated by hypermail 2b29 : Mon Oct 02 2000 - 17:37:22 MDT