"Image problem" is the wrong phrase, perhaps, Eli. I didn't use that phrase
and wouldn't have.
My point is: Transhumanists **should**, on the merits of their ideas, be
widely recognized as intellectual and cultural leaders, but, at this stage,
they are generally not. One of our goals should be to change this.
If they were, then their opinion on things like human cloning would be taken
more seriously by the public at large.
Perhaps my intuition as to how this goal might be achieved is faulty; that's
quite possible. But anyway, my goal is not so much to solve an "image
problem" as to project transhumanists to a leadership position on the world
stage.
As for my article in FAZ, I would say that the attitude YOU are displaying
in this e-mail exchange, my friend, is pretty close to the attitude I was
writing about in that article. Basically, Eli, as much as I like and admire
you, the attitude "Let others worry about the starving people in the Sudan;
my job is to save them by building a Friendly AI that will produce the
Singularity" sort of gets on my nerves. Perhaps this is purely my own human
irrationality. Something in my heart tells me that we'll be better
people -- and produce friendlier AIs -- if we allocate some of our personal
energy to tangible compassion in the here-and-now as well as trying to help
people through long-term technological projects. But this isn't the kind of
thing for which one can make a logical argument.
So, I'll shut up now ;>
> -----Original Message-----
> From: email@example.com
> [mailto:firstname.lastname@example.org]On Behalf Of Eliezer S. Yudkowsky
> Sent: Saturday, April 28, 2001 7:29 PM
> To: email@example.com
> Subject: Re: Maximizing results of efforts Re: Mainstreaming
> Ben Goertzel wrote:
> > It would appear that you entirely missed my point, which was that if
> > transhumanists want to improve their public image, then they should do
> > things that are associated with goodness in the public mind.
> Transhumanism doesn't *have* a public image problem with people thinking
> it's "uncharitable". Implausible, yes, but not uncharitable.
> One person hanging out with a refugee from the Soviet Union, getting
> ticked off at his extreme anti-sharing attitude, mistakenly associating it
> with transhumanism in general, and writing an article in FAZ, is not an
> "image problem". In fact, I'd go so far as to say that this would need to
> happen at least three times before it became an "image problem".
> -- -- -- -- --
> Eliezer S. Yudkowsky http://singinst.org/
> Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2b30 : Mon May 28 2001 - 10:00:00 MDT