Re: Why Not a Planet Of The Apes?

The Low Golden Willow (phoenix@ugcs.caltech.edu)
Wed, 11 Jun 1997 00:31:42 -0700 (PDT)


On Jun 10, 9:33am, Robin Hanson wrote:
} The Low Golden Willow writes:

} >Proposal: most animals, for one reason or another, just aren't
} >domesticable. ... If it takes 300 years to breed an animal to a
} >useful state, who's going to keep at it for that long?
}
} This is quite possible, but we don't really know it takes that long,
} and new high tech might help speed things up.

You're changing the question in mid-stream. Genome matching might be
usefully feasible now, or in ten years; it hasn't exactly been around
for anyone trying to domesticate nasty animals in the past. Considering
the 5-10 year generation time for apes, and their low utility and high
danger to begin with, it seems very plausible to me that it was never
economical to domesticate them. "We don't know how long it takes"?
Perhaps, but I think believing an aggressive species could be noticeably
domesticated in less than 10 generations is like John Clark's claim that
godhood will be feasible seconds after the first nano-assembler. And 10
generations for an ape is the better part of a century. And I suspect
it would take more generations... you need to be improving docility and
immunity and utility simultaneously. You might be able to get a few
docile animals quickly, but the gene pool will be small. If you try to
get a pool of animals who don't get sick too easily outside of their
natural environment, you're spending a lot of time on aggressive,
non-useful (low-return) animals. I don't buy it.
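
To put rough numbers on that (a back-of-envelope sketch only; the 5-10
year generation time is from above, but the 10-30 generation range is
just my guess, not data from any real breeding program):

    # Back-of-envelope: how long a primate domestication program might run.
    # Assumed (illustrative only): 5-10 years per ape generation, and
    # somewhere between 10 and 30 generations of selection needed.
    for generations in (10, 20, 30):
        for years_per_gen in (5, 10):
            print(f"{generations:2d} generations x {years_per_gen:2d} yr/gen"
                  f" = {generations * years_per_gen:3d} years")

Even the optimistic corner of that table is half a century; the rest runs
into multiple centuries.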

The fact that 21st century gengineers could produce a domesticated zebra
doesn't change the lack of success modern "scientific" breeders have had
so far, or the lack of success anyone had in the past -- when it would
have been rather more useful!

} >I see no reason to assume an AI, human-level or not, would be as
} >intractable as most animals. (Apart from the non-existence problem
}
} Possibly. But this sounds a lot like wishful thinking.

Why? Why should an AI be hard to "domesticate"? There are evolutionary
reasons for lots of the factors behind an animal being too vicious or
panicky or territorial; there'd be no reason for AIs to have these
traits. I think a major predator of chimps is other chimps, as with
humans; AIs won't have evolved any defensive or offensive strategies.

And about breeding, you write:

} it stands to reason that great further improvements will be possible
} when we can cheaply read out the entire genome of each individual in a
} breeding program, match them with detailed behavioral data, and
} even splice together desired trial genomes.

It stands to reason that all of these abilities apply at least as naturally
to AIs as to animals, and for some versions far more easily.

} I doubt if we've tried anywhere near as hard to domesticate smart primates
} as we've spent on A.I. research.

Is it really rational to spend as much on primate domestication as on AI
research? Yes, primates are farther along, but they're still far from
ideal, and they're hard to work with. There aren't all that many of them
to work with, for one thing. AI has a long road, but seems able to move
very quickly. At least once it finds the road.

As for hypothetical primate compensation... you pay your labor force
what it takes to get them to work, not what you think is fair. Unless
they get seriously uplifted they won't be taking greenbacks; they'd just
try to eat them. Bananas. Access to females. Not getting beaten. Or
being bred to a state where they just love to serve us, like dogs.

Now, how about AI compensation? Why should we treat an AI as a citizen
if it doesn't ask to be treated as a citizen? If it's perfectly happy to
fulfill my every wish, to pilot its fighter-body in a kamikaze attack as
enthusiastically as any glory-seeking adolescent male?

The ethics of designing an AI that way are left up to other debaters.

Merry part,
-xx- Damien R. Sullivan X-) <*> http://www.ugcs.caltech.edu/~phoenix

"Does it occur to you, the fallibility of CIT thinking? Flux-thinking.
You have prophetic dreams, remember? You can dream about a man drinking
a glass of milk. A week later you can see Yanni drinking tea at lunch
and if seeing him do that has a high shock-value, you'll super the
dream-state right over him, you'll swear you dreamed about him doing
that, exactly at that table, and even psychprobe can't sort it out after
that." -- C.J. Cherryh, _Cyteen_, Grant ALX

(speaking of such issues...)