Re: Attacks (was Re: Why would AI want to be friendly?)

From: Eugene Leitl (eugene.leitl@lrz.uni-muenchen.de)
Date: Mon Oct 02 2000 - 01:01:48 MDT


Samantha Atkins writes:

> Is it a more acceptable violation of individual freedom to kill someone
> outright because they are dangerous to others? Is it more acceptable to
> force people to change regardless of what sort of world they prefer to

Come, let's go hang out with my droogies at the Korova milk bar. We'll
have a nice frontal lobotomy afterwards.

> live in? Is it more acceptable to impose an ultra-hitech future on
> those simply in no way able to deal with it? I can see that it is not
> palatable to put people into a VR involuntarily. But that violation may
> be smaller than the larger violations that would occur otherwise. It is
> that possibility I want to raise.

Indeed, if the powerful faction has advanced sufficiently that it
needs to restructure the Earth, would you rather die than be translated
into a character in an artificial reality environment?

I don't think the violation is extreme, if continuity is preserved,
and if people are free to leave the "home sweet home" sandbox any time.
 
> How is the criminal not a threat to anyone? What if I don't want to be
> fragged even if the friendly SI will resurrect me instantaneously?

Huh? You can generate a private artificial reality for every user. If
someone feels comfortable machine-gunning down random pedestrians, he
can do that, on dummies, forever. As long as no one gets hurt...

> There is plenty of need to discourage criminal abuses regardless of
> whether there is an SI. Or do you want the human race to go totally
> infantile where nothing is real, nothing is at stake and nothing can
> really be changed at all?
 
It doesn't require an SI to recognize that our current makeup requires
constant challenges (boy, I wish it weren't so, these 2000 cal gym
sessions are sure boring).
 
> I don't propose to "target" anyone. I simply am floating the idea that
> the ultimate in freedom with infinite room to grow is to be able to live
> within whatever world-constraints one most wishes and see how that is.
 
If I were an SI, I would probably leave everything as is, only make
death a transition into the next world. (Of course, those who have
been tortured to death probably *will* complain...)

> What do you think humans will do exactly once your sort of SI is a
> reality?

Die, of course, as the atoms in their bodies are being
absorbed. Because only a god could create such a strange thing as a
SysOp. There is no traversable path from the foothills to the Olympus
mountaintop, at least none that people and their handiwork can take.



This archive was generated by hypermail 2b30 : Mon May 28 2001 - 09:50:14 MDT