Eliezer S. Yudkowsky wrote:
> I really do think that when all is said and done, predicting our
> treatment at the hands of the Powers is a fifty-fifty coinflip. I just
> don't know. What I do know is that the near-future parts of the
> probability branches indicate that, after the preconditions are taken
> into account, this coinflip chance is larger by an order of magnitude
> than all the other happy outcomes put together.
I think it's interesting how one can arrive at the same conclusion from a completely different direction.
Personally, I would give us a very good chance of surviving the emergence of ultratechnology even without an early seed AI success. The kind of technology progression we would get in that situation looks like something humans could cope with, and the emergence of Powers would be gradual enough that there would never be a single invincible individual.
However, I also don't think there is much chance of the Singularity being a bad thing, from the human POV. I've heard lots of scary-sounding "What if" stories on this topic, but nothing that even comes close to making sense. If IE is really so easy that a seed AI can become a Power all by itself in a short period of time, it's going to go from nanotech to something more exotic before we even notice the change (femtotechnology? reality engineering? who knows?). I won't pretend to know what it will actually do at that point, but I can't see it being concerned about something as prosaic as its supply of atoms.
If, OTOH, IE is not that easy, then there is never going to be a single Power. Instead, we'll get a society of different kinds of transhuman minds working to improve themselves as a group. That effectively puts us back in my first scenario, but with a faster rate of change and even less chance of disaster.
So, whichever way it works out, anything we can do to speed up progress (especially progress on IE) is a good thing. The longer we take to reach practical immortality, the more people will die before we get there.
Billy Brown, MCSE+I