Re: FAQ Additions (Posthuman mind control)

den Otter (neosapient@geocities.com)
Thu, 25 Feb 1999 00:46:41 +0100



> From: Eliezer S. Yudkowsky <sentience@pobox.com>

> den Otter wrote:
> >
> > Morals are subjective, but the most practical (=rational) course of action
> > would be not to create AIs with SI potential and work on uploading instead.
> > Again, our prime directive should _always_ be survival. Survival is the
> > prerequisite for _all_ other actions. Any philosophy that does not value
> > personal survival [in an optimal state] above everything else is by definition
> > irrational. Thus follows that transhumanism (with an immortalist element)
> > is the best philosophy currently available to us.
>
> den Otter, according to your philosophy, Newton should have forgotten
> about all that silly "physics" stuff and pursued alchemy or theology,
> which (at the time) were widely considered the most plausible courses to
> immortality.

Well, first of all, this philosophy is meant for the present and the future, not the (distant) past. People back then simply didn't have the means to make truly rational choices and, more importantly, to save themselves. It would only have been frustrating to be an atheistic immortalist in those days, though had the outlook been more widespread we'd probably be well past the Singularity by now. Had Newton followed my philosophy, he would probably still have done good scientific work, only aimed at life extension instead of "general" physics. But this is irrelevant, as the philosophy is for the 20th century and up.

Anyway, you seem to disagree that survival is the prerequisite for all other actions, and thus a rational prime directive; or is it something else you object to? Any goal, no matter how grand, is useless [to you] once you're dead. You won't be watching from afar and enjoying your work, as many seem to (subconsciously) think.

> By your standards Gilgamesh behaved more ethically than
> Socrates.

A somewhat strange comparison, IMO. What exactly do you mean here?