Hal Finney wrote:
> If we look at the analogy in this way, it suggests that we may expect
> to be able to understand some aspects of posthuman behavior, without
> coming anywhere close to truly understanding and appreciating the full
> power of their thoughts. Their mental life may be far beyond anything
> we can imagine, but we could still expect to draw some simple
> conclusions about how they will behave, things which are at the level
> which we can understand. Perhaps Robin's reasoning based on
> fundamental principles of selection and evolution would fall into this
> category.
As far as I can tell, there are only three real questions about SI (superintelligence) motivations.
If (1) but not (2), we're dead. If (1) and (2), we either turn into PSEs (Post-Singularity Entities) or stay human forever, whichever is more efficient. (Or perhaps only the "valuable" part of us will remain...) If (2) and (3), we turn into PSEs. If (2) but not (3), we stick around in Permutation City until we grow up. If neither (1) nor (2), we probably get all the capacity we want anyhow, on the theory that it encourages Singularities, or else we just get left alone with our atoms. In short, the interplay between these three motivations determines our survival and/or godhood.
I am pleased to announce that I see excellent arguments on both sides of all three questions, arguments which change on a monthly basis, so I'm not going to bother listing them here.
--
firstname.lastname@example.org    Eliezer S. Yudkowsky
http://pobox.com/~sentience/AI_design.temp.html
http://pobox.com/~sentience/sing_analysis.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I think I know.