Re: Goal-based AI

Eliezer Yudkowsky (sentience@pobox.com)
Tue, 31 Dec 1996 15:00:24 -0600


> Eli writes:
> >I say, program 'em right, give 'em ethics instead of emotion, and
> >let 'em loose.
>
> Um, excuse me. This presumes a distinction between ethics and emotion
> that has yet to be adequately explained.
>
> I hope you're not going to reply to this with more of your usual
> capitalized phrases like "the Meaning of Life" (as if there were only
> one) or "the Purpose of the Universe" (again, as if there were only
> one).

Nope. I figure that after God knows how many kilobytes of cognitive
science and clear, physically based, source-code-available definitions
of the precise cognitive difference between a self-justifying goal
system and an evolved set of priorities, including multiple posts and a
page on the Web, repeating myself isn't going to help.

Also, I have never used the phrase "Purpose of the Universe"; if I had,
I'd have to justify (a) the existence of God and (b) mapping our
goal-oriented cognitive architectures onto Him.

As the great Dogbert once said (paraphrasing):
"The year 2000 is coming. God uses a decimal counting system and He
likes round numbers."

-- 
         sentience@pobox.com      Eliezer S. Yudkowsky
          http://tezcat.com/~eliezer/singularity.html
           http://tezcat.com/~eliezer/algernon.html
Disclaimer:  Unless otherwise specified, I'm not telling you
everything I think I know.