Re: CRAP: RE: Kingdomality

Anders Sandberg (asa@nada.kth.se)
30 Jun 1999 00:09:19 +0200

Here is an initial sketch for a transhumanist profile test. It is not a personality test, but rather a kind of attitude test: what kind of transhumanist development does one strive for?

Dimensions:

	Individuality (Individual - Borg): Is your ego and its
	persistence paramount, or are you unafraid of dissolving it
	into a borganic nirvana?
	
	Enhancement level (Natural - Artificial): How far are you
	willing to go when modifying yourself?
	
	Rationality (Bottom up - Top down): How willing are you to
	accept emotional, intuitive and other pre-rational influences
	on your thinking, or should it be controlled and logical?
	
	Nature (Steward - Extropian, in Jaron Lanier's notation):
	Should nature / the universe be left largely as it is for its
	own sake, or should we change things?
	
	Change (Caution - Fast forward): Should development be tightly
	controlled and cautious, or is the best approach to let things
	flow (or even to speed them up)?

I'm not entirely happy with squeezing multidimensional views into scalars (most of these dimensions are really multidimensional in themselves). But this is, after all, just a rough approximation of real views, trying to capture the overall drift of our ambitions.

Using a Myers-Briggs-like system we end up with 2^5 = 32 views.
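
As a sanity check on that arithmetic, here is a minimal Python sketch that enumerates every profile. The letter pairs are my reading of the pole names, in the order used by the five-letter codes listed further down:

    from itertools import product

    # One letter per pole, per dimension, in the order used by the
    # five-letter codes below.
    DIMENSIONS = [
        ("I", "B"),  # Individuality: Individual / Borg
        ("N", "A"),  # Enhancement:   Natural / Artificial
        ("B", "T"),  # Rationality:   Bottom up / Top down
        ("S", "E"),  # Nature:        Steward / Extropian
        ("C", "F"),  # Change:        Caution / Fast forward
    ]

    codes = ["".join(combo) for combo in product(*DIMENSIONS)]
    assert len(codes) == 2 ** 5  # 32 distinct profiles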

OK, here are some questions for the different dimensions.

Individuality:

	Do you regard the survival of your self as the single most
	important goal of your existence?
	
	Do you value community over individuality?
	
Enhancements:
	
	Would you retain a biological characteristic (e.g. sweating or
	sleeping) even if it could be removed or enhanced, and even if
	keeping it had some small negative impact on you?
	
	Do you value being postbiological over being biological?
	
Rationality:
	
	Do emotional reactions threaten rational thinking?
	
	Would you edit away unnecessary emotions?
	
	Would you sacrifice some of the rigor of your thinking for
	greater creativity?
	
Nature:
	
	Should we leave Mars as it is, or should we terraform it?
	
	Does biodiversity have a value in itself and not just for
	utility?
	
Change:
	
	Should we speed the development of radical technologies such
	as AI, nanotechnology or life extension as much as possible,
	regardless of the risks?
	
	Are there technologies that are so risky that their
	development should be strongly curtailed?
	

I'm sure you can figure out much better questions. We need several for each category and direction to get any precision.
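
One way the scoring could work, as a rough Python sketch (the -2..+2 answer scale, the keying and the data layout are placeholder assumptions, not a worked-out instrument): key each question toward one pole of one dimension, sum the keyed answers per dimension, and let the sign of each sum pick the letter:

    # Hypothetical scoring sketch; question list and keys are placeholders.
    # Each question is keyed to one dimension: +1 pushes toward the first
    # pole letter, -1 toward the second. Answers use a -2..+2 agree scale.
    QUESTIONS = [
        ("Do you regard the survival of your self as the single most"
         " important goal of your existence?", 0, +1),          # I(ndividual)
        ("Do you value community over individuality?", 0, -1),  # B(org)
        # ... several more per dimension and direction ...
    ]

    POLES = ["IB", "NA", "BT", "SE", "CF"]

    def profile(answers):
        """Map one -2..+2 answer per question to a five-letter code."""
        totals = [0] * len(POLES)
        for (text, dim, key), answer in zip(QUESTIONS, answers):
            totals[dim] += key * answer
        # Ties go to the first pole letter.
        return "".join(POLES[d][0] if totals[d] >= 0 else POLES[d][1]
                       for d in range(len(POLES)))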

Here are the categories, and some ideas for names:

BATEF - The Borg Collective
IATEF - Power
BNTEF
INTEF
BABEF
IABEF
BNBEF
INBEF - Prometheus
BATSF
IATSF
BNTSF
INTSF
BABSF
IABSF - Jupiter Brain
BNBSF - Teilhard de Chardin's Omega Point
INBSF
BATEC
IATEC
BNTEC
INTEC
BABEC
IABEC
BNBEC - Stapledonian minded planet
INBEC - Greek god
BATSC
IATSC
BNTSC
INTSC - Spock
BABSC
IABSC
BNBSC - Culture
INBSC - Aristoi
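
For completeness, the named cells reduce to a small lookup over the computed code (a sketch; names copied from the table above, the remaining 22 codes left unnamed for now):

    # Names copied from the table above; unnamed codes fall through.
    ARCHETYPES = {
        "BATEF": "The Borg Collective",
        "IATEF": "Power",
        "INBEF": "Prometheus",
        "IABSF": "Jupiter Brain",
        "BNBSF": "Teilhard de Chardin's Omega Point",
        "BNBEC": "Stapledonian minded planet",
        "INBEC": "Greek god",
        "INTSC": "Spock",
        "BNBSC": "Culture",
        "INBSC": "Aristoi",
    }

    def archetype(code):
        return ARCHETYPES.get(code, "(unnamed)")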


Anders Sandberg                                      Towards Ascension!
asa@nada.kth.se                            http://www.nada.kth.se/~asa/
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y