On 17 Mar 2001, Anders Sandberg wrote:
> When two beings come into conflict over this (say, Robert and I both want
> Jupiter for our respective M-brains) they could of course use force to
> settle it, but if I did this it would lead to the decrease of extropy
> for Robert (however odd I might find his version of extropy) [snip]
Hey -- we both know I've got the "true" version of extropianism,
in my version we only turn the "virtual" philosophers into sausages.
> One can develop the above argument into a more watertight ethical
> system, but it seems to point at a situation where extropianism would
> not lead to everybody against everybody trying to get the last scrap
> of computronium, but rather everybody trying to work together to find
> a way out of the box or at least trade the limited resources.
Ooooh, this is good. This means that when Eliezer's SysOp finds out
it doesn't have the resources to solve a particularly thorny ethical
dilemma, it comes to me (where in my M-Brain, we can wipe the slate
clean of all the cycle-consuming ancestor simulations) so we can
dedicate the computronium to a "real" problem. It can't solve it
itself without violating its "Self-Imposed Laws of Friendly AIs";
it can, however, get me to do its dirty work... Something for everyone!
That's just great!
This archive was generated by hypermail 2b30 : Mon May 28 2001 - 09:59:41 MDT