A number of folks are understandably squeamish about running (non-cruel)
simulations of others, e.g.,
Eliezer S. Yudkowsky wrote:
> Would you like to find out that you, yourself, are
> simply a modeled intelligence in someone's imagination?
> That you therefore have no citizenship rights?
and Robert J. Bradbury wrote:
> In short -- what do you do in the simulation where you get to
> the point where you cannot explore any further without 'offing'
> someone against their will?
and finally samantha wrote:
> Once you have turned the corner and created self-aware
> intelligent beings I don't think you can morally any longer
> claim they exist only for your own purposes.
The flaw here is that running someone is not an either/or
proposition. There is always a question of, for example,
the speed at which the simulation is run. If a simulation
you have created is no longer suitable for your purposes,
you might merely assign resources so that it gets one
second of run time after a thousand years, then another
after two thousand more years, then after three thousand,
and so on. The intervals lengthen without bound, but the
grants never stop, so the series of allocations does not
converge: it still bestows on your creations unlimited
total run time, merely at an ever-slower rate.
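The arithmetic behind the schedule above can be sketched in a few lines of Python. This is an illustrative toy (the function names, units, and the specific 1-second-per-interval grant are my own framing of the example in the text, not anything from the original post): the k-th grant is one second of run time delivered after an additional k * 1000 years of real time, so total run time grows without bound even as the rate of subjective time per real year tends to zero.

```python
def total_run_time_seconds(grants: int) -> int:
    """Total simulated seconds after `grants` allocations.

    Each allocation grants 1 second, so the running total is
    1 + 1 + 1 + ... = grants -- a series that never converges.
    """
    return grants


def real_years_elapsed(grants: int) -> int:
    """Real years elapsed after `grants` allocations.

    The waits are 1000, 2000, 3000, ... years, so the total is
    1000 * (1 + 2 + ... + grants) = 1000 * grants * (grants + 1) / 2.
    """
    return 1000 * grants * (grants + 1) // 2


# The rate of subjective time per real year shrinks toward zero
# (1 second per k*1000 years for the k-th interval), yet the
# cumulative run time keeps climbing forever.
for n in (10, 1000, 10**6):
    print(n, total_run_time_seconds(n), real_years_elapsed(n))
```

The point the code makes concrete: slowing a simulation down, even drastically and progressively, is categorically different from terminating it, since every being inside still receives unbounded run time.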
Ethically, there is nothing wrong with making a simulation
containing emulations of conscious beings, and then running
it at whatever speed you want. Or shutting it down. The
only moral prohibition is, simply put, "Don't ever be cruel".
This archive was generated by hypermail 2b30 : Mon May 28 2001 - 09:59:41 MDT