Re: Future Technologies of Death

Yak Wax
Thu, 1 Jan 1998 14:04:37 -0800 (PST)

Or maybe we could all get over the need for revenge, and reduce the
crime rate to *zero*.


Nick Bostrom <> wrote:
> Anders Sandberg <> wrote:
> > "Martin H. Pelet" <> writes:
> >
> > > When AI systems become available, you will of course have the
> > > ability to read their whole minds directly, which would solve the
> > > problem above, but this method would violate their rights.
> >
> > Not necessarily. If you started to examine my innards without
> > permission, that would be a violation of my rights. But I can give a
> > doctor at least a temporary permission to examine my internal state,
> > and that does not violate my rights.
> We might be able to do even better. Have a special-purpose machine
> that scans your mind and gives as output a binary answer to whether
> you are responsible or not. Then the memory of this machine is
> automatically erased and only the output remains. The subject to be
> investigated can be allowed to make any arrangements necessary to
> verify the validity and confidentiality of the procedure.
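The machine described above, which reads the full internal state but lets only a single yes/no answer survive, can be modeled as a small sketch. The names `scan_mind` and `is_responsible` are hypothetical stand-ins, and in real software dropping a reference does not guarantee physical erasure the way the hypothetical machine would:

```python
def one_bit_verdict(scan_mind, is_responsible):
    """Model of the hypothetical verdict machine: read the whole
    mind state, reduce it to one bit, and let every intermediate
    value go out of scope (the 'automatic memory erasure')."""
    state = scan_mind()              # full internal state, held only here
    verdict = bool(is_responsible(state))
    del state                        # only the single bit survives
    return verdict
```

Note that `del` merely removes a reference in Python; a faithful implementation of the thought experiment would need hardware-level guarantees that no copy of the scanned state persists.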
> Suppose such a machine existed for deciding with 100% accuracy
> whether somebody had committed a certain crime. It would then be hard
> to argue that the law-enforcement agency did not have the right to
> apply this technique to any subject it wanted to (provided that it
> was quick and had no side effects). For to refuse the law-enforcement
> agency the unlimited right to use this technique would be equivalent
> to refusing to let it maximise its detection rate of criminals, even when
> no externalities were involved. What legitimate reason could anybody
> have for not wanting the law-enforcement agency to scan his mind with this
> device in order to find out if he had committed a crime?
> Or do we perhaps *want* a certain failure rate in the law-enforcement
> system, so that if the political system leads to really screwed-up
> laws, there will at least be some chance of escaping them and perhaps
> starting a revolution? Is criminality a little bit like random noise,
> so that a certain amount is beneficial since it makes the system less
> likely to get stuck in a "local maximum"?
> ________________________________________________
> Nick Bostrom
> *Visit my transhumanist web site*
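The closing analogy, that a little random noise keeps a system from getting stuck in a local maximum, is the same idea behind stochastic hill climbing. A minimal sketch (the test function `f` and all parameters are illustrative): a purely greedy search stays on the first bump it finds, while occasionally accepting a "bad" move lets it find the higher one.

```python
import random

def f(x):
    # Two bumps: a local maximum at x = 0 (height 1) and the
    # global maximum at x = 3 (height 2).
    return max(1 - x * x, 2 - (x - 3) ** 2)

def hill_climb(f, x, noise=0.0, steps=20000, seed=1):
    """Greedy uphill search that, with probability `noise`, also
    accepts a random downhill move -- the 'beneficial noise'.
    Returns the best point seen."""
    rng = random.Random(seed)
    best = x
    for _ in range(steps):
        cand = x + rng.uniform(-0.5, 0.5)
        if f(cand) > f(x) or rng.random() < noise:
            x = cand
        if f(x) > f(best):
            best = x
    return best
```

With `noise=0.0` and a start at `x=0.0` the search never leaves the local maximum; with some noise it can cross the valley and reach the higher peak, which is the sense in which a nonzero "failure rate" stops the system from freezing at a local optimum.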
