> On Sep 5, 4:54pm, "Nicholas Bostrom" wrote:
>
> } Yes, that would be a very bad outcome. In the EoC, Drexler mentions
> } the possibility that a state might choose to get rid of its people and
> } replace them with obedient AIs. This is a real danger. -Another
>
> Actually I thought this was a ridiculous idea, as stated. A single
> dictator, or cabal, might try this, but this is a subset of "nanopower
> tries to take over the world". Even in massively totalitarian states
> there is a strong connection between the rulers and a large mass of the
people -- in fact, one might argue that this is especially the case in
modern totalitarian states. Hitler might have replaced Germans with AIs if
> you let the idea sit in his cracked head for long enough, but he
> wouldn't have right away. I doubt the Chinese rulers would replace the
> Chinese. I don't think these people think entirely in terms of slave
> labor, and replacing one's own people by robots would frighten or creep
> out most members of the human race. The world is not run by psychotic
> extropians.
And then there's the psychological factor. Dictators don't get their kicks
from telling obedient AIs what to do; they get their kicks from breaking the
free will of _real_ people.
--Wax