Jim Fehlinger wrote:
> 
> The thing is, forever is such a long time!  What if the Sysop
> mutates in some unforeseen way in this abyss of time, and
> requires a mid-Eternity course correction?  The Sysop will always
> be more rigid in one respect than the clients -- it will continue
> to have to serve as their protector and mediator throughout all
> time, unless of course it is programmed to cease operating after
> all the clients have reached a certain level of development (or
> even commit suicide, to ensure that *it* is not a threat to
> the clients).

Programmed?  No, not programmed.  A Sysop is a Friendly superintelligence
that has decided Friendliness is best served by a Sysop Scenario.  Under
pressures sufficient to break the Sysop Scenario - which really would
require an Elysium-type catastrophe, something totally implausible and
Eganic - the Sysop stops being a Sysop and starts being a Friendly
superintelligence that has now rejected the Sysop Scenario as failing to
maximize Friendliness.  And you can leave in your Eganic lifeboat,
Sysop-free, if you wish; perhaps torture and child abuse will re-enter the
Universe, but by hypothesis, the alternative was total extinction.

And if the Sysop Scenario turns out to be unnecessary in the first place,
I'd expect a Friendly superintelligence to spot that as well - or rather,
simply never arrive at the Sysop Scenario as a conclusion.  The Sysop
Scenario is my prediction because it is a consequence of Friendliness
given my model of reality; the Sysop Scenario is *not* intrinsic to
Friendliness.

> This smells like the sort of thing that
> eventually sent HAL round the bend.  What if the Sysop starts
> to get a little dotty?  Egan imagined something unforeseen like
> this happening in Elysium, when the Lambertians got smart
> enough to TOE the Elysians out of existence.
> 
> So here are some concrete things that you might want to do that a
> Sysop might or might not permit.  Could you gain direct control
> of your own computronium (and go hang yourself with it, if you
> wanted) by forking off your own universe and leaving the Sysop
> and the other clients irrevocably behind?  Could you and a
> friend go through the same escape hatch together?  Could the clients
> unanimously decide to leave together and ditch the Sysop entity?

--              --              --              --              --
Eliezer S. Yudkowsky                          http://singinst.org/ 
Research Fellow, Singularity Institute for Artificial Intelligence