Re: Posthuman Politics

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Oct 16 2001 - 14:13:05 MDT


Eugene Leitl wrote:
>
> On Mon, 15 Oct 2001 hal@finney.org wrote:
>
> > I wonder if there is a correlation between your own age and the number
> > of years you expect until some kind of singularity. Maybe young
> > people expect greater rates of change than people who have been adults
> > for many decades.
>
[To which Dan Clemmensen responded that he is in his fifties, and expects
the Singularity sooner than I do. So much for that. Thanks, Dan. --ESY]
>
> Additional advantage of putting the date on 2040 or 2050 is that you can
> very easily avoid looking foolish by escaping into the home for the
> elderly, or into the dewar.

Yes, this is a very important point. Some people can't bear the thought
of looking foolish. The really important thing is not to be wrong, of
course, but I too don't want to look foolish, since it might affect my
ability to gather support for causes I'm involved with. But being shown
to be publicly wrong isn't such an absolutely unbearable thought that it
drives my whole behavior. A personal inability to emotionally accept the
possibility of public embarrassment is a very poor foundation on which to
build a Singularity strategy. It leads to planning for 2040 when
2005-2020 covers nine-tenths of the probability spectrum. I'll amend this
to 2005-2025 in the absence of an organized drive for the Singularity, but
that's it.

Unless the social forces overwhelm and *destroy* the technological forces
- and this does not mean wealth distribution inequities, regulations,
strangulation of capital markets and the other usual stuff; it means a
Butlerian Jihad or a nuclear war - projecting a Singularity in 2050 is
absurd. It is beyond the fringes of probability. It is not worth
planning for even on a "targets of opportunity" basis. If it happens in
2050, it doesn't happen. I say it publicly. I expose myself to the test
of fire. If I'm wrong, I'm wrong.

-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
