Re: Simulated Misery was Re: Merciful Retroactive Abortions

From: Robert J. Bradbury (bradbury@aeiveos.com)
Date: Sun Mar 18 2001 - 20:12:25 MST


I made some off-the-cuff comments on misery being necessary for evolution.
 
J. R. Molloy responded:

> You assume that misery will occur. Why? Surely beings smart enough to
> self-evolve can figure out how to avoid misery, as another list member has
> already indicated.

You can avoid misery if you disconnect the key pathways involved in
determining that "this life sucks". Now that would be an interesting
simulation: the sim-director receives the feedback that the simulant
feels the sim sucks, but the simulant doesn't feel it at all.

I'd have to argue that this results in a really screwed-up sim. It
has to rapidly decay into a Monty-Pythonesque sketch --
  A: "O.K. I'll just go on now."
  B.K.: "Come back 'ere and fight, you bugger!"
  A: "But you've got no arm."
  B.K.: "That! It's nothing but a flesh wound." etc.

[I've posted the URLs in a previous msg.]

You can avoid "misery" but can you avoid "simulated" misery?

How do you determine which path to take to a higher level if there
are not, along some paths, some real losers?

You can tip-toe through the tulips and make sure nobody "dies"
(heaven forbid), but then the slightest level of "the world didn't
turn out the way I wanted" whining becomes the ground-zero
standard for "suffering".

STRONG ASSERTION: "Eliminating someone from the game -- e.g.
painlessly erasing a conscious, self-aware being -- is less
immoral than continuing to run a conscious, self-aware being in
a simulation in which they *know* they are being given a
less than equivalent share of the computronium pie."

In one case you eliminate the suffering; in the other you
prolong it!

It's looking to me like it becomes immoral to create backup
copies and *ever* allow them to run. You can create the
copies, but you can't run them forward in a simulation.

QED: if you are going to screw up your life by changing
the dials on your internal prioritization/sensitivity
network, you *must* do it without a parachute.

Thus we are back to: do you value "consciousness",
or do you value evolution (extropicization)?

[As Anders has pointed out, there is probably room in the
Universe for both perspectives -- but to maintain that,
it would seem you have to adopt an
  - "I do not know",
  - "I do not want to know", and
  - "it would be immoral of me to discover that I knew"
    (what you are doing with your computronium)
perspective.]

Mind you -- if I construct my computronium to run with sufficient
encryption, it is unlikely that you could discover that I'm
running "copies" that bite the dust in reality. So you can safely
proceed to use my computational services, unless you want to take
the "moral" high ground that I could not derive the results
I am able to produce unless I was offing copies. It sounds
a bit like today's debate for/against eating genetically
engineered foods.]
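
To make the indistinguishability point concrete, here is a toy
sketch in Python. It is purely illustrative: the provider names and
the hashing "task" are invented for this example, and nothing here
is real cryptography. The point is only that two providers can be
byte-for-byte identical from the outside, whether or not one of them
spins up and erases ephemeral copies internally.

  # Two "computronium providers" with identical observable behavior.
  # A client who sees only inputs and outputs cannot tell whether
  # short-lived internal "copies" were run and erased.
  import hashlib

  def provider_no_copies(task: bytes) -> bytes:
      # Computes the result directly; no copy is ever instantiated.
      return hashlib.sha256(task).digest()

  def provider_runs_copies(task: bytes) -> bytes:
      # Instantiates an ephemeral "copy", uses it, then erases it.
      copy_state = bytearray(task)           # the short-lived copy
      result = hashlib.sha256(bytes(copy_state)).digest()
      for i in range(len(copy_state)):       # "off" the copy
          copy_state[i] = 0
      return result

  task = b"derive the results"
  # From the client's side the transcripts are identical:
  assert provider_no_copies(task) == provider_runs_copies(task)
  print("outputs identical; the internal copy left no trace")

Since the two transcripts match exactly, a client who refuses to use
copy-running computronium has no evidence to act on -- which is the
force of the encryption argument above.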

Robert


