Xiaoguang Li wrote:
> here's my understanding of Eliezer's argument for actively
> pursuing the creation of a power:
> destruction is easier than construction: given humanity's
> lackluster record with past technological advances, nanowar (read:
> armageddon) is virtually certain.
> evolution is messy: even if humanity survives the advent of
> nanotech and gives rise to transhuman powers, these powers would have no
> more chance of being benevolent than powers designed de novo. the
> dichotomy -- if transhuman powers retained evolutionary baggage such as
> emotional attachments or moral inhibitions, then they are prone to
> inconsistencies and self-contradictions and are therefore untrustworthy; if
> they did not, then initial conditions are insignificant, and there's no
> functional difference between transhuman powers and powers that arise
> otherwise. the caveat -- AI is easier than IA, so this scenario would
> require active suppression of technology, the ills of which are
> power before nanowar: creation of a power designed de novo before
> nanowar will result in a singularity -- that is, all bets are off (read:
> no more house advantage against the survival of humanity).
So far, so good.
> now for my doubts. does the creation of a power really increase
> the chances of our survival? it seems that the odds of humanity
> surviving a singularity are significantly lower than a coin flip. given
> the enormous difference between the creation and its creator, it seems
> most likely (> 99%) that humanity would not be significant to a power.
> that is, perhaps less significant than virion particles are to a human.
I honestly don't know. I really don't. I certainly wouldn't put the probability at lower than 10%, or higher than 60%... but, ya know, I could be missing something either way.

I think we're missing a fundamental part of the question. There isn't *any* outcome I can see for the Singularity, not one of your three possibilities, that could conceivably have happened a few hundred thousand times previously in the Universe... which is what I would regard as the absolute minimum number of intelligent species to expect, given that some galaxies are already so old as to be burned out.
What I am fairly certain of, and what my intuitions agree with, is this: