Re: Goo prophylaxis

Nicholas Bostrom
Sun, 31 Aug 1997 15:31:03 +0000

Carl Feynman wrote:

> At 02:29 PM 8/29/97 +0000, Nicholas Bostrom wrote:

> >... The superintelligence
> >realizes this and decides to junk the other computers in the
> >universe, if it can, since they are in the way when optimising the
> >SI's values.
> Throughout the 'goo prophylaxis' thread, you seem to completely ignore the
> possibility that trade may be more profitable than conquest, that employment
> may be more profitable than enslavement, and that symbiosis may be more
> profitable than extermination. I don't see why the principle of comparative
> advantage should not continue to apply even in a world with vast disparities
> in material power between different actors.

Ok, let's make a quick cost-benefit analysis for an SI deciding
whether or not to destroy another budding SI that refuses a
negotiated merger.

Destruction (Kill the tyrant in the cradle)

Costs:

(1a) Some resources have to be deployed for a brief time while the
rival is destroyed.

(1b) If the rival has been allowed to reach a fairly advanced
stage, then the operation might incur some damage that will take a
little while to repair.

(2) We also lose all the labor output that the rival could have
produced and from which we could have benefited through trade.

Benefits:

(1) Instead of having to share the universe with our rival, we get it
to ourselves. This means we gain 0.5*(resources in the part of the
universe that will ever be colonized). This is a *huge* benefit.

(2) We eliminate the risk that the rival will one day try to destroy
us. Thus, until we meet advanced extraterrestrial civilizations, we
get the benefit of total safety from external threats.

Costs (1a) and (1b) are negligible compared to benefit (1). Cost (2)
is also smaller than benefit (1), because we can use the resources we
conquer to produce the same amount of output as our rival would have
produced, and that output will now be ours; we won't need to buy it
through trade.
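A back-of-the-envelope version of this comparison can be put in numbers. All figures below are illustrative assumptions of mine (none appear in the post), with resources normalized so the colonizable universe totals 1.0:

```python
# Toy payoff comparison for the cost-benefit argument above.
# Every number here is an illustrative assumption, not a claim:
# resources are normalized so the colonizable universe totals 1.0.

UNIVERSE = 1.0        # total resources that will ever be colonized
ATTACK_COST = 0.001   # cost (1a): resources tied up destroying the rival
REPAIR_COST = 0.002   # cost (1b): damage to repair after the operation
TRADE_GAIN = 0.1      # cost (2): net gain forgone by not trading

# Destroy: take the whole universe, minus the (small) operation costs.
destroy_payoff = UNIVERSE - ATTACK_COST - REPAIR_COST

# Coexist: keep half the universe plus the assumed gains from trade.
trade_payoff = UNIVERSE / 2 + TRADE_GAIN

# Destruction dominates unless the trade gain approaches half the
# universe's resources or the operation costs become comparably huge.
print(destroy_payoff, trade_payoff, destroy_payoff > trade_payoff)
```

Under these assumptions the conclusion is insensitive to the exact values: only a trade gain on the order of half of everything, or catastrophic operation costs, flips the comparison.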

Even without benefit (2), the benefit side far outweighs the cost
side. The only plausible considerations that could change this would
be either a balance of terror involving total mutual annihilation, or
the inclusion of specific sorts of strong ethical motivations.

> --CarlF
> PS. What a great thread!
Yes, isn't it!

Nicholas Bostrom

*Visit my transhumanist web site at*