From: Eliezer S. Yudkowsky (email@example.com)
Date: Thu Feb 14 2002 - 21:03:28 MST
Zero Powers wrote:
> >From: "Eliezer S. Yudkowsky" <firstname.lastname@example.org>
> >Zero Powers wrote:
> > >
> > > The general consensus seems to be that AI
> > > won't be AI until it is sentient. But, so long as it is capable of
> > > autonomically meeting my needs, I don't want or need it to be sentient.
> > > Personally I'd rather have an AI smart enough to do all the tasks I assign
> > > to it yet dumb enough not to know it's smarter than me.
> >Ya ain't never gonna get it. What you just asked for requires general
> >intelligence and self-awareness.
> Admittedly none of the above services are as impressive as, say, reviving me
> from a vitrified state or imposing world peace or eliminating the social,
> economic and geo-political problems of the world. But then any system that
> *could* do that would be in control, and you and I would be at its complete
> mercy.
Oh. I thought you meant AI smart enough to do any task you could assign
to it. I didn't realize that you were planning to ask so little. Well,
at any rate, as your message sort of implies, there are those of us
who plan to ask for more from our AI. Every other corporation in the
world can decide to build better voice-recognition systems, if they like.
It just means that those particular corporations won't be very relevant to
how the fate of humanity plays itself out.
No matter how enormous the market is for a better voice-recognition system
or a car that drives itself, it doesn't mean that the Other Kind of AI
will have any less earthshaking an effect, even if it's just a few
projects working on it. It's true AI that's relevant, and it doesn't
become any less relevant because something that isn't real AI becomes a
big hit commercially. There isn't a fixed pool of AI-stuff in which a
big hit in modality-level AI could drown out generally intelligent AI. A
true AI has the same amount of earthshaking power whether or not there's a
big market for self-driving cars.
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 13:37:39 MST