Anders Sandberg wrote:
>
> This of course doesn't mean we need to all dress up in suits and ties,
> but we better get our act together. If there is anything that really
> worries me it is uneducated transhumanists. Not in the sense of not
> having academic degrees, but in the sense of people making claims
> without any support, loudly asserting various notions and claiming them
> to be transhumanism without exploring how well they fit with reality,
> transhumanist thinking and other known information. That kind of
> behavior weakens transhumanism, and provides something that is easy
> for opponents to attack ("Mr X, the transhumanist, said that the goal
> of transhumanism was to wipe out anybody who won't become
> superintelligent").
>
> Fortunately, I think this is curable. Both by holding "internal"
> courses on our own subjects for transhumanists (this is a good
> activity for our organisations - informal courses/seminars/discussions
> on (say) how nanotech really would work or the realities of economics)
> and by having us, as transhumanists, realise the need for
> self-transformation and self improvement. After all, we want to grow
> ourselves as people, and one of the best ways of doing that is
> educating ourselves and becoming better communicators.
Good intentions - check. Good idea on general principles - check. Will solve stated problem - nope.
The problem is, well, Gaussian. The bell-curve thing. You will never, ever be able to eliminate the transhumanist kooks, or even educate them, because there will always be a residuum of complete morons who combine your Web pages with their private beliefs and go on to wreak havoc in the media, immune to all reasoning or volume control.
I don't know if this problem has a transhumanist solution, but I do think it has a Singularitarian solution. You cannot out-extremist an Externalist Singularitarian. I can - in practice it would take training, experience, and opportunity, but I'm talking about possibilities - go on the Jerry Springer show with a raving, high-volume lunatic who equates the Singularity with the Kali Yuga, and overshadow him without raising my voice.
"If computing power doubles every eighteen months, what happens when computers are doing the research? If technology is produced by intelligence, what happens when intelligence is enhanced by technology?" In less then twenty years, all life as we know it will have come to an end, for causes that lie completely within the laws of physics. Or the Q&A sessions: "What if your AI wipes out humanity?" "If a greater-than-human intelligence judges humanity to be unnecessary, I do not consider myself competent to challenge it." Et cetera. I've found that I can shock the living daylights out of most "mundanes" in a few minutes. I don't think a mock-Singularitarian could.
In the end, I don't think the lunatics will have the intelligence or the comprehension-of-the-future to generate the alienness or the future shock that surrounds the Singularitarian meme. They can't compete on shock value, and in the end, that's all junk television cares about. I believe that my rational discussion of the technological end of the world will shock newscasters more than any apocalyptist cult can, because cults can steal the terminology and the surface outcomes, but they can't steal the alienness.
For you transhumanists, I would simply recommend putting up one person as "leader"; I'd say Max More. The tendency to give more media time to loonies can be counteracted by the tendency to give more time to leaders. If there's a clear reference point that mediafolk can understand - that means "Max More big kahuna", not "The Extropian Principles are available on the web at..." - then the principles tend to get less distorted. Same reasoning as with Eric S. Raymond. He didn't invent open source, but if the open-source guys hadn't had an archetypal figurehead to wave in front of the media, ESR knew damn well that Microsoft would be labeling Windows 2000 "open-source" inside of a month. So ESR elected himself as media spokesperson and did a fairly good job.
I think that Singularitarianism should be able to get by without an IAer per se as a *literally* archetypal figurehead; maybe Mitchell Porter or Marc Stiegler or someone else will wind up doing it. If someone else wants the limelight, and the byline on the Web pages, I'm all for fading into the background and working on AI instead, saving my mutant-supergenius status as a trump card. Realistically, though, the research guys often do wind up doing the PR.
--
sentience@pobox.com          Eliezer S. Yudkowsky
http://pobox.com/~sentience/tmol-faq/meaningoflife.html
Running on BeOS    Typing in Dvorak    Programming with Patterns
Voting for Libertarians    Heading for Singularity    There Is A Better Way