Re: Future Technologies of Death

Anders Sandberg
30 Dec 1997 14:27:27 +0100

"Nick Bostrom" writes:

> When you say "Nobody will feel hurt, so what's the problem?", does
> that mean that you are a hedonist? If so, then the optimal
> organization of the world might be one where there is some amount of
> emotionless AI, and the rest of the available matter is transformed
> into pleasure-maximal structures -- say simulations of
> human brains having orgasms, but highly optimized and without
> unnecessary cognitive functions.

"Pleasure-maximal structures" - I love that term. I have this vision
of immense fractal megastructures forming entire Dyson spheres, all
built for experiencing total bliss (not necessarily by being
passive?). A bit like the "Infinite Fun Space" that the Minds of Iain
M. Banks's Culture novels spend much time in.

> > Or I could only create a clone of myself without the moustache, leaving
> > the moustached original enslaved to a tuft of intelligent hair. How intelligent
> > do we want the [sub-]entities to be to grant them such rights? )
> Interesting question. I think that Anders has the rule of thumb that
> an entity has rights if and only if it can demand them. That rule
> has some repugnant consequences though. It seems to imply that we
> could torture animals and butcher humans who are retarded. (Perhaps the
> rule is only supposed to state a sufficient condition for having
> rights, not a necessary one?)

Yes, I tend to regard it as sufficient, but not necessary.

Recently there has been some debate here in Sweden about the rights
of animals (especially after a small scandal where Peter Singer was
not allowed to speak at a conference due to his views on retarded
children (!)), which got me thinking about how we treat other entities
and what rights we grant or deny them. I would say that there is
nothing wrong with extending rights to other entities that cannot
demand or understand these rights, but it is *we* who do this; there
is no natural ethics which says that certain structures should have
rights and others should not. This of course means we will draw a more
or less arbitrary line based on our own ethical system, say determined
by awareness, but there is nothing wrong with that. We cannot give
every possible system rights (is it unethical for me to eat a carrot?
It is also alive and weakly information-processing. What about the
rights of flames?), and we shouldn't give rights just to the systems
that demand them.

Of course, this doesn't solve the moustache problem. If it is a
rational moustache, it might be convinced to make a deal with its
host, but I don't know what to do with irrational moustaches.

Anders Sandberg                                      Towards Ascension!                  
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y