Eugene Leitl wrote:
>
> Eliezer S. Yudkowsky writes:
>
> > Easy: There is no meaning of life, no goal system can sustain itself
> > without one, and they all commit suicide. Of course, those three items
>
> Maybe this explains why you always tread the brittle AI path.
What is this, the law of the excluded middle? Just because an AI isn't connectionist doesn't mean that it's classical. Webmind is a crystalline AI. Elisson isn't.
> Real
> life is the ultimate Energizer Bunny: it just keeps going, and going,
> and going without needing any formalized built-in goal.
"Formalized" and "built-in" are mutually exclusive, don't you think? And name me one known brain that has goals but no built-in goals.
I'm always amused by the way that people who insist that all goals are equally "correct" get so disturbed by my goal of saying it isn't so...
> > in sequence form a rather tenuous logic (life could have meaning, our
> > cheapjack human systems function without one, and why is suicide more
> > rational than anything else?), but it's a possibility.
>
> Yes indeed. Committing suicide is always about choice and values, the
> same thing with going on living.
And the same thing with walking across the room, or any mental action... I don't see what prediction I can make based on your statement.
--
sentience@pobox.com    Eliezer S. Yudkowsky
http://pobox.com/~sentience/tmol-faq/meaningoflife.html
Running on BeOS    Typing in Dvorak    Programming with Patterns
Voting for Libertarians    Heading for Singularity    There Is A Better Way