Eliezer S. Yudkowsky wrote:
> I suggest that the section:
>
> ==
> In the first case, we could make sure that the values of tolerance and
> respect for human well-being are incorporated as core elements of the
> programming, making them part of an inviolable moral code.
> ==
>
> Be amended to read:
>
> ==
> In the first case, we could make sure that the values of tolerance and
> respect for human well-being are incorporated as core elements of the
> programming, making them part of an inviolable moral code. (However,
> some think this would be a hideous mistake from a <a
> href="http://tezcat.com/~eliezer/AI_design.temp.html#PrimeDirective">programming</a>
> perspective; some also question the morality of such an action.)
The responses so far appear to be generally unfavorable.
I see two fatal criticisms of the idea suggested in the FAQ:
First, it is mind control. Remember that posthumans are, by definition,
fully sentient beings. Programming them to abide by a preordained moral
code is no different from doing the same thing to our own children, or to
each other. I can see no possible way to justify such an action on moral
grounds.
Billy Brown, MCSE+I
bbrown@conemsco.com