Re: How To Live In A Simulation

From: Robert J. Bradbury (bradbury@aeiveos.com)
Date: Thu Mar 15 2001 - 13:02:50 MST


Reliberion's Prince of Darkness (aka Eliezer) wrote:

> If this is a simulation, I don't expect that I have much free will.

Of course you have free will; what would be the point of a simulation
if you didn't? [Now of course, if you are a zombie in the simulation
in which *I* have the free will, then you don't...]

> I tend to deprecate the "amoral posthumans simulating whole civilizations"
> hypothesis - (a) I don't think it's true,

Oh, well, that argument wins in my book! Let me write that down;
I'll have to use it more often. ;-)

> (b) our civilization currently seems to be on track for either
> extermination or Friendliness (neither future allows the amoral
> simulation of whole civilizations),

What makes you say that? We are back to the fundamental problem of
what you can do with sub-SIs. The Aristoi are a good example, in
which the elite SIs can do pretty much whatever they want with
sub-SIs. They generally treat them benevolently because that is
their self-imposed code; it isn't built into the 'hardware',
however (and they can reprogram the hardware).

As was pointed out, you don't need to simulate an entire civilization;
you only need to simulate those parts of it where you want to allow
degrees of freedom. As I sit here, I have no indication that Eliezer
or Spike or Greg or Max even exist. They simply get created on
demand by the Blue People when I need to interact with them.
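
Tangentially, the created-on-demand trick is just lazy evaluation.
Here's a minimal sketch in Python of what I mean (every name in it
is my own hypothetical illustration, not anybody's actual design):

    class LazyWorld:
        """Only entities the observer has interacted with ever get computed."""

        def __init__(self):
            self._instantiated = {}  # name -> entity state, created lazily

        def interact(self, name):
            # Instantiate the entity the first time it's needed; afterwards,
            # reuse the cached instance so its state stays consistent.
            if name not in self._instantiated:
                self._instantiated[name] = {"name": name, "history": []}
            entity = self._instantiated[name]
            entity["history"].append("interaction")
            return entity

    world = LazyWorld()
    world.interact("Eliezer")         # "Eliezer" exists only from here on
    print(len(world._instantiated))   # 1 -- the rest was never computed

The point being that the cost of the simulation scales with what the
observer actually touches, not with the size of the "civilization".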

> and (c) if it is true, there's not much I can do about it.

Pshawww... You could do something like what Allen Tough is doing with
his "Invitation to ETI" (http://members.aol.com/WelcomeETI/hello.html).

While I'm not very optimistic about its probability of success, the
point has been discussed in the SETI community for years: if we
"do nothing", our chance of success is zero (or nearly zero).

The question then becomes determining what motivations the SIs have
in running the simulation, and what could motivate them to end it
by "uplifting" us (presumably our desired outcome).

>
> The Sysop Citizenship Rules, I expect, disallow simulation of single
> unconsenting individuals even as personally experienced virtual realities
> - never mind the simulation and extermination of entire civilizations!

How do you even know or enforce what someone else does with their
computronium? It's the point I made on nanodot about attempting
to regulate nanotechnology -- you can't verify compliance because
you would have to disassemble everything to the atomic level to
determine that there isn't any illegal tech present. Any rules have
to be voluntary, and any voluntary rules that are universally adopted
require that each of us have access to the same complete set of
logical data (and experiences) from which those conclusions are
derived. I'm not sure how much data this would be, but it sounds
like I have to give up the individuality of being "me" so that I can
adopt the universal data set that creates self-enforced sysop
citizenship rules. I'm currently unconvinced that I'd be willing
to make that transition.

I'll just take myself as-is, go to a nearby brown dwarf, and transform
it into Real-World Simulation Land, where carnage, suffering and death
(virtual, of course) rule the day. It isn't *real* carnage, suffering
and death; it's only *virtual*. Where is the harm in that? Yes,
you can argue that the people in the simulation don't know it's
virtual, but their opinions don't count -- they aren't real either.

> I know (or rather, I recall) that during most of my life I would not have
> consented to remaining unknowing in the simulation [snip]

It didn't "really" happen; you were created as-is yesterday with
the memories that it happened, because those memories provide
the ethical perspective that drives you to do the work you are
now doing. Since that work may produce some good in the future,
instilling a historical framework that drives the simulation forward
is a necessary condition.

> But, again, I think this *is* the real world.

Argue for your limitations and they are yours... (R. Bach, Illusions).

Robert
