Re: SI: Singleton and programming

Nick Bostrom (bostrom@ndirect.co.uk)
Sun, 22 Nov 1998 23:32:42 +0000

I'm still looking for a really good definition of what a singleton is (maybe I will finally have time to finish that paper this Christmas holiday?). However, I can say this about the concept I had in mind:

  1. It does not imply a unity of mind. The singleton could have one unitary mind, or it could contain lots of independent minds (e.g. human minds).
  2. It has more to do with global coordination and efficiency. Robin Hanson's paper about burning the cosmic commons in a Darwinian race to colonize space depicts a scenario that is incompatible with the singleton hypothesis, since such a race would be globally wasteful.
  3. You may ask, efficient for what? On this the singleton hypothesis is silent. One can imagine a large number of global goals, any of which could be adopted by a singleton (e.g. the goal of allowing humans and posthumans to freely pursue their goals without being coerced).

And finally, why bother about what happens after the singularity (if indeed there will be one)? Eliezer thinks that it can take care of itself. Well, I think that for all we know, several different post-singularity paths may be possible, and which one is actually realized may depend on human choices made between today and the final moments before the singularity. We therefore want to understand what the possibilities are, so we can try to bring about the one we like best.

Nick Bostrom
http://www.hedweb.com/nickb
n.bostrom@lse.ac.uk
Department of Philosophy, Logic and Scientific Method
London School of Economics