Re: Ethics of being a Creator

Henri Kluytmans (hkl@stack.nl)
Fri, 24 Apr 1998 17:08:18 +0200


Anders Sandberg wrote:

>That sounds nice, but it might not be easy to do.

Indeed, but I didn't say it would be easy.

>First, some of the entities might suffer a gradual decline (like
>humans with Alzheimer's disease), would it be nice to save backup
>copies of their final, degraded state or should one save copies
>from an earlier state? If so, why just one, why not several?

When the creator has enough power to simulate a universe, he must
also have enough power to merge the backup copies of earlier states
with the later backups of the degraded states, and to construct
from those multiple states a "healthy" state containing all memories
of the being. Of course, one first has to define what a "healthy"
state is.
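
Just to make the idea concrete, here is a toy sketch in Python
(every name and the degradation flag are my own invention, not a
real procedure): treat each backup as a set of time-stamped
memories, and keep, for every memory, the most recent copy that is
not marked as degraded.

  def merge_backups(snapshots):
      """Merge time-ordered backup snapshots into one "healthy" state.

      snapshots: list of dicts, oldest first; each dict maps a memory
      id to a (timestamp, content, degraded) tuple.  What counts as
      "degraded" is exactly the definition left open above.
      """
      healthy = {}
      for snap in snapshots:                       # walk forward in time
          for mem_id, (t, content, degraded) in snap.items():
              if degraded:
                  continue                         # skip corrupted memories
              prev = healthy.get(mem_id)
              if prev is None or t > prev[0]:
                  healthy[mem_id] = (t, content)   # keep newest intact copy
      return healthy

The merged state then contains all memories of the being, drawn from
whichever backup still held them intact.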

>Second, it might not even be clear to the creator which entities
>are sentient and which are not (there might be a lot of
>intelligence hidden in the apparent chaos of type III CAs,
>invisible until you look at the principal Fourier components).

OK, this may be the most difficult part. It is indeed possible
that the creator will not be able to detect all sentient beings,
but I'm fairly sure he will be able to detect most of them.

If sentient beings develop that are also able to self-reproduce,
it is very likely they will transform their environment considerably.
This process is exponential. (Look at how human beings have already
transformed our own environment, the Earth, and how we could likely
transform our whole galaxy within a million years.) This reasoning
presumes that self-reproducing sentient beings will at some point
develop a technological civilization; I think, however, that this
is quite likely.

At some point in time the technological civilization will have
transformed its environment to a considerable degree; it should then
be quite feasible to detect this pattern in the simulation. The
creator then only has to play back the simulation and extract all
the sentient beings of the earlier generations of this civilization.
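
As a rough illustration of what such a detector might look like
(Python with numpy assumed; the thresholds and names are invented
for the example): compare each frame of the simulation against an
untouched baseline world, and flag the moment the transformed area
starts spreading at a sustained, roughly exponential rate.

  import numpy as np

  def transformed_fraction(frame, baseline):
      """Fraction of cells that differ from the untouched world."""
      return float(np.mean(frame != baseline))

  def detect_civilization(frames, baseline, growth=1.3, streak=5,
                          floor=1e-4):
      """Return the index of the first frame where the transformed
      area has grown by at least `growth` x for `streak` consecutive
      steps (a crude test for exponential spread), or None."""
      prev, run = None, 0
      for i, frame in enumerate(frames):
          f = transformed_fraction(frame, baseline)
          if prev is not None and prev > floor and f >= growth * prev:
              run += 1
              if run >= streak:
                  return i   # rewind from here to extract earlier beings
          else:
              run = 0
          prev = f
      return None

From that index the creator would play the simulation back and pull
out the earlier generations, as described above.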

Of course, sentient beings that never develop a technological
civilization are much harder to detect.

And how does one determine whether a being is sentient enough to
be worth restoring?

>Third, the afterlife is rather underdetermined: how to
>keep the entities from pain *there*?

Once the entities are restored in a separate simulation, there is
no longer any reason not to communicate with them. So the creator
could ask the sentient beings themselves whether they are happy or
not, or even give them the capability to alter themselves into a
state they individually prefer.

>Even when the creator has no idea about what sentient beings will
>emerge? If you create a world top-down, in the genesis fashion ("Let
>there be animals, to the following specifications... Let there be
>chemistry to implement the animals, to the following
>specifications...") it is fairly easy to say you are responsible for
>the entities you create (but note that they might have free will,
>which might give them some responsibility themselves).

Indeed.

>The unpredictability of Turing machines and complex systems is IMHO
>the basis for our free will, and puts a limit on the amount of
>responsibility we can place on any creator.

So you're saying that a creator could have a certain degree of
responsibility for the sentient beings created in his simulation.

I'm not saying a creator must be held responsible, only that he
could be held responsible. So then we are saying the same thing.

>> By the way, at Transvision98 we will show a television documentary
>> which will illustrate exactly this issue: "A creator of a simulation
>> containing sentient beings is being held responsible" :->

>I look forward to it!

Me too.

=======================================================================>
>Hkl ------> Technology & Future at http://www.stack.nl/~hkl
Transcedo --> Dutch Transhumanist Society http://www.dse.nl/~transced
Because the future is where we will spend the rest of our lives ...
You see things and ask "Why?" ; I dream things and ask "Why not?"