Re: No AI for Nano/No Nano for copyloads

From: Peter C. McCluskey
Date: Thu Jul 20 2000 - 10:21:58 MDT

(Robin Hanson) writes:
>Hal Finney responded:
>>Paperwork and lobbying won't necessarily do the job if people are morally
>>opposed to the project. Look at the genetically modified food issue
>>we have been discussing.
>Yes, but is it so clear what the directions of people's "moral" dispositions
>are in this case? To march against violation of cryonics patients, people
>would have to accept cryonics patients as being alive, and therefore that they
>have been killing most of elderly humanity for many decades.

 Here's an argument I won't be surprised to see demagogues make:

     We don't know yet whether it is possible to truly revive cryonics
   patients. We need to do more research before doing anything irreversible
   to them. It is cruel to create an uploaded being that contains some of
   those cryonics patients' humanity, but is missing a soul [or qualia,
   adequate sensory input, some possibly important memories, the vital force,
   etc.].

 If uploading threatens to wipe trillions of dollars off the value of human
labor, it is hard to see what could counter that motivation for prohibition
if it were happening today.

> To march
>against upload working conditions, people would have to accept machines as
>conscious, as well as think of upload VR palaces as miserable places. There
>seems to be far more basis to let people think of uploads as inhuman as there
>ever was to think of black slaves as inhuman.

 I.e. it would probably end up like what the animal rights movement would
be if animals were qualified to do many human jobs. I'd guess that if animals
suddenly became able to replace 30 or 40 percent of human labor, that the
animal rights movement would succeed in creating enough moral revulsion
toward animal labor to outlaw virtually all animal labor worldwide.
 I can't figure out how long such a prohibition would remain enforceable.

>I wasn't talking about secrecy, just keeping it out of people's faces.
>People seem to accept "taking advantage" of others in all sorts of ways as
>long as it isn't vivid to them. People give to panhandlers and say no to
>mailed charity letters. People happily buy products from foreigners working
>in conditions they would be outraged to see their neighbors working in.

 If many other people are already buying those foreign products, each
additional consumer realizes that he can't have much effect on the
conditions those foreigners are working under and won't be ostracized
for making purchases that differ from his peers. Consumers who are
thinking of being among the first to buy a controversial product have
more trouble rationalizing away moral concerns.

>>Plus, what are the odds that the destructive upload is going to work,
>>first time? Most things don't. A number of people are going to be
>>killed before you get an upload that works. The moral opposition to
>>such an effort would be overwhelming, especially when it comes from
>>people who will be put out of work if it succeeds.
>Again to get upset about the destruction of a frozen head you had to
>accept it as alive in the first place. And the risk of a destructive

 I.e. you are claiming that people would never get upset over things
like digging up graves?

>scan failing to pick up the relevant info is different from the risk
>of any one simluation not being tuned right.

 It should be easy to create controversy over what info is relevant.

(Robin Hanson) writes:
>Robert Bradbury responded:
>>All you need is the greens convincing people that "services
>>provided by *real* people" are better" and you are in a situation
>>where the market demand for upload slave labor goes soft.
>This is crazy. Why don't people solve other problems by just
>convincing everyone to prefer third world labor, rain forest labor,
>disabled labor, animal labor, or "socialist" labor?

 Those particular examples are weak because it is doubtful whether there
is much motivation for such rules.
 I look at movements to repudiate slavery, child labor, human cloning,
and genetically modified foods and see a much more mixed picture than you do.
 I expect that the important factors affecting the enforceability of such
rules are how easy it is to observe violations and how deep the
pockets of the typical violator are. The first upload attempts will make
big headlines unless unusual efforts are made to keep them secret. It will
be much harder than with racism or child labor for a company that creates
the first upload to confuse consumers by denying any violation or claiming
it was unintentional.
 For these reasons, I expect that social pressure will delay uploading,
although I can't predict by how much. It seems likely that advances in
AI, even if they haven't come close to the Turing-test level, will
replace steadily increasing fractions of human labor in ways that will
make it obvious that preventing uploads will not protect human wages.
 And I hope that people will realize that a powerful AI that has been
created without human instincts will be more dangerous to humans than an
upload which retains most human instincts, and that it would be safer
to have the power of those AIs balanced by comparably powerful uploads.

Peter McCluskey | The US Idea Futures Exchange: speculate on
                | political, financial issues at

This archive was generated by hypermail 2b29 : Mon Oct 02 2000 - 17:34:56 MDT