Re: Sysop vs. Liberty (was Re: How To Live In A Simulation)

From: Eliezer S. Yudkowsky
Date: Thu Mar 15 2001 - 23:37:54 MST

"Robert J. Bradbury" wrote:
> How about manufacturing my *conscious* girlfriend-du-jour with an
> auto-self-disassemble program on a 24-hour timer (because all
> that sword-fighting and head-lopping-off is just too much trouble).
> Built in is the memory that she gave informed consent to
> auto-self-disassembly.

Probably child abuse, but I'm not sure. It looks to me like this is
blazingly evil, but there might be some loophole whereby it is moral and
acceptable. Or it could even be that the above scenario is entirely
unproblematic. Maybe, somewhere on Earth, there is a woman who, in the
due course of time and natural personality growth, without coercion or
false beliefs, has arrived at the conclusion that she'd have no problem
with being duplicated, modified, used, and exterminated as long as it
made someone else a little happy. If so, you can just borrow a copy of
her, modify her to suit, and erase her at leisure. And there'd be nothing
actually morally wrong about it, however unutterably horrifying it might
be to second-millennium folk.

Emlyn wrote:
> We can't answer this question, Eliezer. We don't have any idea what the
> Sysop won't let us do. Do you?

It's not a question of what the Sysop will or won't let us do, but of the
degree to which you expect that the Sysop can make these decisions.

-- -- -- -- --
Eliezer S. Yudkowsky
Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2b30 : Mon May 28 2001 - 09:59:40 MDT