RE: Singularity?

Eugene Leitl
Tue, 31 Aug 1999 22:33:37 -0700 (PDT)

Billy Brown writes:

> Eli isn't the only one. I figure that whether or not this scenario happens
> will be determined by the laws of physics, so I'm not worried about causing a

"Will be determined by the laws of physics" is a remarkably contentless statement. Everything (apart from Divine Intervention, which most people here believe doesn't exist) is determined by the laws of physics. So what?

> disaster that would not otherwise have occurred. I am, however, very
> concerned about the potential for a future in which AI turns out to be easy,
> and the first example is built by some misguided band of Asimov-law
> enthusiasts.

Fortunately, whoever believes in fairy tales like Asimov's laws is (due to obvious extreme incompetence) quite unlikely to bootstrap the first AI.

To make this somewhat less noisy: I think rushing AI is at least as bad as rushing nano. Relying on the best-case scenario (where the first transcendee deliberately holds back *all* the horses to allow everybody to get on the bus) is foolish at best. To begin with, the perpetrator might not even be human.

And say hello to oblivion,