Re: FAQ Additions (Posthuman mind control)

Eliezer S. Yudkowsky (sentience@pobox.com)
Wed, 24 Feb 1999 15:34:00 -0600

den Otter wrote:
>
> Morals are subjective, but the most practical (=rational) course of action
> would be not to create AIs with SI potential and work on uploading instead.
> Again, our prime directive should _always_ be survival. Survival is the
> prerequisite for _all_ other actions. Any philosophy that does not value
> personal survival [in an optimal state] above everything else is by definition
> irrational. It thus follows that transhumanism (with an immortalist element)
> is the best philosophy currently available to us.

den Otter, according to your philosophy, Newton should have forgotten about all that silly "physics" stuff and pursued alchemy or theology, which (at the time) were widely considered the most plausible routes to immortality. By your standards, Gilgamesh behaved more ethically than Socrates.

-- 
        sentience@pobox.com         Eliezer S. Yudkowsky
         http://pobox.com/~sentience/AI_design.temp.html
          http://pobox.com/~sentience/sing_analysis.html
Disclaimer:  Unless otherwise specified, I'm not telling you
everything I think I know.