Zero Powers wrote:
>
> > From: "Eliezer S. Yudkowsky" <sentience@pobox.com>
>
> >I think the material world is inherently neutral, which in some ways is
> >almost as bad as active hostility. I think that a lot of the problem can
> >be ameliorated by putting at least one layer of abstraction between
> >yourself and physical reality.
>
> Somebody archive that post. When SIAI has achieved its goals and Eli is
> remembered as the father of our post-technological society, generations of
> robots will remember his words and weep with gratitude.
You know, that's really not what this is about. I know it sounds amusing,
and I know that you're mostly joking, but even so that's not what this is
about. Getting a place in the history books isn't a good reason to do
something. The real lives that are touched by the Singularity are
greater, by a factor of six billion, than the one life that might be
touched by getting a place in the history books.
Even that doesn't really cover it. Getting a place in the history books
is never going to be as much fun as the smallest round of genuine
intelligence enhancement. Getting a place in the history books is the
kind of thing that people have been trying to do since before the
invention of writing. The benefits involved in the Singularity are of a
vastly higher and stronger order than the kind of material and
psychological benefits involved with having a place in the history books.
It's not just whether the desire for fame is right or wrong, but whether
"fame" and the desire for it is a temporary pre-Singularity facet of human
psychology that is overwhelmingly outgunned by the kind of forces bound up
in the Singularity. What I'm trying to say is that asking whether the
world will be grateful is not just a case of doing things for the wrong
reasons; it seems like a failure of future shock. I would rather have an
extra ten IQ points than any amount of fame, because intelligence has more
substance. Being motivated by the image of cheering crowds cuts against
the grain of the whole Singularity, and it cuts against the grain of
rationality; it means that you're being controlled by the thoughts that
generate a certain kind of pleasantness, instead of making decisions so
that your actions generate the most good.
Last but not least, there are already more people at SIAI than just me.
Brian and Sabine, Dmitriy and the entire Flare crew, all our donors... I
think that people have been associating SIAI too strongly with Eliezer.
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence