Gordon Worley wrote:
> Maybe it comes down to making this choice: would one rather live
> free yet at the risk of destruction, or inside a cage but safe from
> all but universe-level destruction. I have chosen the former, but
> maybe Eliezer has chosen the latter.
My Standing Challenge is as follows:
"Name one concrete thing that you should be able to do, but that a Sysop
won't let you do." It can't be an intangible quality, like "having
nanobots fully under my control" - you have to name something specific
that you want to do with those nanobots but won't be able to do under
the Sysop Scenario.
No decrement of freedom is involved.
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2b30 : Mon May 28 2001 - 09:59:40 MDT