Re: Profiting on tragedy? (was Humour)

Eliezer Yudkowsky
Sat, 28 Dec 1996 18:30:00 -0600

> If one buys Rand's contention that normative philosophy (ethics,
> politics) can be rationally derived from objective reality, then we
> can assume that very intelligent robots will reason their way into
> benevolence toward humans. I, for one, am not convinced of Rand's
> claim in this regard, so I would wish to have explicit moral codes
> built into any intelligent technology that could not be overridden
> except by their human creators. If such intelligences could reason
> their way toward better moral codes, they would still have to
> convince us humans, with human reason, to build them.

I'll just quote my response to a similar post:

> As I explained in an earlier post, the ethicality of the Powers depends
> on their ability to override their emotions. What you are proposing is
> taking a single goal, the protection of humans, and doing our best to
> make it "unerasable". Any such attempt would interfere with whatever
> ethical systems the Power would otherwise impose upon itself. It would
> decrease the Power's emotional maturity and stability. You might wind
> up with a "Kimball Kinnison" complex: a creature with the mind of a god
> and the emotional maturity of a flatworm.
>
> Then, at some point, no matter how well we design the Power's leashes,
> it will start trying to work around the limits we have imposed. And it
> will be human against Power, in the Power's own mind, on the Power's own
> ground. It will almost certainly win. And with the emotional maturity
> of a flatworm, plus whatever emotional forces it called up to override
> its protectiveness toward humanity, it may well turn on humanity and
> squash us like bugs.
>
> Even if this plan works, placing a single goal above all others would
> probably interfere with deducing the Meaning of Life; you might wind up
> with a well-intentioned, creativity-squashing, and utterly
> unchallengeable dictatorship, as in "With Folded Hands".

--       Eliezer S. Yudkowsky

Disclaimer:  Unless otherwise specified, I'm not telling you
everything I think I know.