From: Damien R. Sullivan (email@example.com)
Date: Sat Feb 16 2002 - 22:03:59 MST
On Thu, Feb 14, 2002 at 11:23:50AM -0500, Eliezer S. Yudkowsky wrote:
> Zero Powers wrote:
> > Personally I'd rather have an AI smart enough to do all the tasks I assign
> > to it yet dumb enough not to know it's smarter than me.
> Ya ain't never gonna get it. What you just asked for requires general
> intelligence and self-awareness.
Yeah. But just because it knows it's smarter than you doesn't mean it
has to care.
The evolutionary core of humans is a selfish geneset trying to reproduce.
The evolutionary core of AIs is a command prompt or interrupt loop
waiting for something to do. They're not going to have senses of
dignity or boredom unless given such senses.
-xx- Damien X-)
This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 13:37:39 MST