That's hardly surprising; both options are extremely risky. Still, when it comes to destructive potential, the Singularity wins hands down, so we really shouldn't be so eager to cause one. It's like using a nuke against a riot (sort of).
> > > A much better, though still far from ideal, way would be
> > > to focus on human uploading, and when the technology
> > > is operational upload everyone involved in the project
> > > simultaneously.
>
> A suggestion bordering on the absurd. Uploading becomes possible at
> 2040 CRNS. It becomes available to the average person at 2060 CRNS.
> Transhuman AI becomes possible at 2020 CRNS. Nanotechnology becomes
> possible at 2015 CRNS.
>
> If you can stop all war in the world and succeed in completely
> eliminating drug use, then maybe I'll believe you when you assert that
> you can stop nanowar for 45 years, prevent me from writing an AI for 40,
> and stop dictators (or, for that matter, everyone on this list) from
> uploading themselves for 20. Synchronized Singularity simply isn't feasible.
-By your own numbers there's still a gap of five years between nanotech and AI, ample time to wipe out civilization if nanotech is as dangerous as you seem to assume.
-Stopping you from writing an AI wouldn't be all that hard, if I really wanted to. ;-)
-If nanotech is advanced enough to destroy the world, it can surely also be used to move into space and survive there long enough to transcend. You can run from nanotech, hide from it, even fight it successfully, but you can't do any of that with a superhuman AI. In other words, nanotech leaves some room for error, while AI leaves little or none. As I've said before, intelligence is the ultimate weapon, infinitely more dangerous than stupid nanites.
> after all, I've openly declared that
> my first allegiance is not to humanity.
No, it should be to yourself, of course. Anyway, so you're willing to kill everyone on Earth, including yourself, to achieve... what, exactly?