RE: Yudkowsky's AI (again)

Nick Bostrom (bostrom@ndirect.co.uk)
Sun, 28 Mar 1999 00:50:24 +0000

Billy Brown wrote:

> However, I also don't think there is much chance of the Singularity being a
> bad thing, from the human POV. I've heard lots of scary-sounding "What if"
> stories on this topic, but nothing that even comes close to making sense.
> If IE is really so easy that a seed AI can become a Power all by itself in a
> short period of time, its going to go from nanotech to something more exotic
> before we even notice the change (femtotechnology? reality engineering? who
> knows?). I won't pretend to know what it will actually do at that point,
> but I can't see it being concerned about something as prosaic as its supply
> of atoms.

Why not? If it is better off (however slightly) with these atoms than without them, then in this scenario we could all be dead, unless we have been wise enough to make sure that the Power is ethical.

Nick Bostrom
http://www.hedweb.com/nickb
n.bostrom@lse.ac.uk
Department of Philosophy, Logic and Scientific Method
London School of Economics