Re: Singularity: Just Say No!

Anders Sandberg
17 Jan 1999 13:55:33 +0100

"Chris Wolcomb" <> writes:

> Rather than re-post your thoughtful comments, I'll just respond by
> saying that you are as refreshing as cool lemonade on a hot dry day.
> Thanks. :-)

Be careful about saying things like that, my ego may grow so much that fusion processes start in the core :-)

> The Future is up for grabs, people! Nothing is written in stone. It
>would be very refreshing, to echo Anders' sentiments, to see some
>rational, pro-active descriptions and strategies for evolving into
>something other than a borganism.

It is a bit ironic that borganisms are so often suggested, since they appear to be harder to implement than individuals. In a very complex individual the concept of self will likely be rather intricate (in some sense we are borganisms already: collective minds made up of semi-independent brain systems), but connecting minds that evolved to be individual in a useful way is likely quite hard; it is probably easier to extend them instead.

I agree that we need some different scenarios to look at. A few scenarios were mentioned in the technology thread a few weeks back, based on the relative arrival of AI, nanotech and hardware; while they just ended up in the big Singularity discussion, I think the approach is sound. We could list the potential technologies and methods available, figure out an approximate dependency tree (e.g. "if we have reliable nanotech, massively parallel CA-style computers become easy and cheap") and discuss the consequences. Most likely this will get extremely complex, so we should either try to run it as a scenario planning project or find some useful hypertext/critlink model to keep track of all the links.
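A minimal sketch of the dependency-tree idea, in Python: each technology maps to the set of technologies it depends on, and a topological sort yields one arrival order consistent with those dependencies. The specific nodes and edges below are illustrative assumptions taken loosely from the example in the paragraph above, not a real forecast.

```python
# Sketch: technologies as nodes, "X depends on Y" edges, walked in
# dependency order. The entries are illustrative assumptions only.
from graphlib import TopologicalSorter

# Map each technology to the set of technologies it depends on.
dependencies = {
    "reliable nanotech": set(),
    "cheap massively parallel CA-style computers": {"reliable nanotech"},
    "advanced AI": {"cheap massively parallel CA-style computers"},
}

def arrival_order(deps):
    """Return one arrival order consistent with the dependency tree."""
    return list(TopologicalSorter(deps).static_order())

print(arrival_order(dependencies))
```

Even a toy graph like this makes the consequences easier to discuss: any scenario in which "advanced AI" arrives before "reliable nanotech" would contradict the assumed edges and can be flagged automatically.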

Anders Sandberg                                      Towards Ascension!                  
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y