Seriously, if we wanted to do this properly we would create a large questionnaire with questions like "Would you replace the personality you have with a more efficient one?", "I want to be immortal", "The individual is the basis for all ethics" etc, let many people take the questionnaire and then do a factor analysis afterwards to see what main factors influence our thinking. But that requires more work than I have time for, even if it would be *very* interesting.
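Just to sketch what I mean (a toy example with made-up data, using principal-component extraction as a crude stand-in for a full factor analysis with rotation; all names and the planted two-factor structure are hypothetical):

```python
import numpy as np

# Hypothetical data: 200 respondents answer 6 questionnaire items.
# We plant two latent attitudes: one driving the first three items,
# one driving the last three, plus some response noise.
rng = np.random.default_rng(0)
n_respondents = 200
latent = rng.normal(size=(n_respondents, 2))
loadings = np.array([
    [1.0, 0.0],
    [0.9, 0.1],
    [0.8, 0.0],
    [0.0, 1.0],
    [0.1, 0.9],
    [0.0, 0.8],
])
answers = latent @ loadings.T + 0.3 * rng.normal(size=(n_respondents, 6))

# Eigendecompose the correlation matrix of the answers; the number of
# eigenvalues above 1 (the Kaiser criterion) suggests how many main
# factors influence the responses.
corr = np.corrcoef(answers, rowvar=False)
eigvals = np.linalg.eigvalsh(corr)[::-1]  # sorted descending
n_factors = int(np.sum(eigvals > 1.0))
print(n_factors)  # the two planted attitude dimensions dominate
```

With real survey data one would of course use a proper factor analysis (with rotation, and a more careful retention criterion than Kaiser's), but the idea is the same: let the correlations between answers tell us how many independent dimensions the transhumanist "memespace" actually has.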
"Eliezer S. Yudkowsky" <firstname.lastname@example.org> writes:
> Anders Sandberg wrote:
> > Individuality (Individual-Borg): is your ego and its
> > persistence paramount, or are you unafraid of dissolving it
> > into a borganic nirvana?
> This divide confuses a personal preference for individuality with
> persistence of mortal priorities. I'd like to be an individual -
> telepathic, but individual - but that doesn't mean my persistence is "paramount".
Yes. So there really should be a mental dimension (how important is the persistence of my ego?) and a social dimension (how much should individuals be merged?).
> > Enhancement level (Natural - Artificial): How far are you
> > willing to go when modifying yourself?
> Anyone who's a transhumanist wants to modify themselves. I'd augment
> this with "Augment: Body-Mind - which do you want to augment?"
> Questions like "Would you like to be able to do the stuff seen in _The
> Matrix_?" and "Would you like to be able to download the Internet into
> long-term memory?"
As I was thinking, it would distinguish people who "just" want to become immortal, happy, wealthy and very smart from those of us who want to become jupiter brains or intergalactic search engines. "Weak vs. Strong posthumanity".
> > Rationality (Bottom up - Top down): How willing are you to
> > accept emotional, intuitive and other pre-rational influences
> > on your thinking, or should it be controlled and logical?
> Well, you have to be able to put in a * for "I don't care" or "I think
> this distinction is falsified by cognitive science." I'd have no
> problem whatsoever with leaving this off the list entirely. (There is
> no distinction, not in my AI, not in humans. "Logic" is built from
> intuitions, "intuitions" are built from logic. There are only
> problem-solving methods.)
Maybe. I'm not that happy with it either, but I wanted something to distinguish between the Spock meme/classic AI thinking and fluffy connectionism. But it is a bit loose.
> Here are other dimensions you might break it into, although I have the
> feeling that all of this stuff is so intertwined...
That's why we really need a factor analysis.
> Goals (complex dimension): Which of these is most important to you:
> Truth, Intelligence, Joy, God, Freedom, Law.
> Individuality (Singleton - Collective): How much does the idea
> of telepathy appeal to you? How important is maintaining
> continuity with your mortal self?
> Morality (Arbitrary - Externalist): To what degree are you willing
> to sacrifice your mortal quirks and priorities if it turns out to
> be the logically correct thing to do?
> Augmentation (Body - Mind): Which is more important to you:
> Being able to explore the galaxy without a spaceship, or being
> able to hold the whole Internet in your short-term memory?
Good ideas. I'll see if I can come up with a 2.0 version of my writeup.
Anders Sandberg Towards Ascension! email@example.com http://www.nada.kth.se/~asa/GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y