Re: Eliezer S. Yudkowsky's whopper

From: Andrew Lias (anrwlias@hotmail.com)
Date: Wed Oct 04 2000 - 11:02:59 MDT


Spudboy100@aol.com wrote:

>I think that technological predictions for the last 20 years have erred on
>the side of over-optimism, in the sense that we were supposed to have
>colonies in space and moving sidewalks by the year 2000. Following that
>conjecture, scientific advancements will continue, but never as quickly as
>desired by the futurists/transhumanists/extropians.

I'll agree that these sorts of predictions will almost inevitably be wrong,
but I also think that it's just as dangerous to predict that these accounts
are overly optimistic as it is to declare that they are overly pessimistic.

Technological development is a funny thing. Some things that seem like a
slam dunk (e.g., space colonies) prove unviable in the predicted timeframe,
either because technology doesn't advance fast enough or because other
issues retard it. On the other hand, some technologies spring up and bite
us long before we ever expect them.

Personal computers are a great example of the latter. Every science fiction
writer of the Golden Age was envisioning moon colonies (if not actual
cities) by 2000, but none of them pictured computers as anything less than
gigantic.

The simple fact is that the future is unknown and, beyond a very limited
range, unknowable. Making predictions is a risky business at best. The
strong AI community has repeatedly embarrassed itself by making
over-confident estimates about when strong AI would show up. As a
consequence, a lot of people outside of the AI community consider the whole
venture to be a joke, regardless of the merits of any particular line of
research.

I think that we all agree that we are approaching a point where an explosion
of intelligence will become a viable possibility. I think, however, that
it's rather foolish to assert that it will happen before a given time. It
*might* happen by 2020. It also *might* happen in 2520. It also *might*
happen tomorrow, if some obscure line of research that nobody is aware of
hits the jackpot. We simply don't know what factors may advance or retard
the date.

My own concern is that there is a reasonable possibility that we'll hit the
Singularity in my lifetime and that it may take a form that excludes my
interests. My primary interest is that the human race doesn't fall into (or
get dragged into) a non-continuitive extinction (i.e., an extinction where
we simply cease, rather than transcending to something more intelligent and
capable). My primary concern is that the only thing we can control, when it
comes to such a singularity, is the initial conditions. Unfortunately, I
can think of all too many scenarios where I, personally (to say nothing of
the species as a whole), either get left behind or destroyed.

It is my hope that we will be able to see *just* far enough ahead that we
don't simply blunder into the damned thing (ask me about my monkey scenario!
;-). One thing that seems certain to me is that there is a lot of unfounded
speculation regarding the morality and rationality of post-organic
hyperintelligence. It seems that the only sane position to hold in an arena
that harbors such beings is to *be* such a being. Even that might be
presumptuous: we are assuming that amplified intelligence is a good thing
to have, but it's possible that hyperintelligences are prone to fatal or
adverse mental states that only manifest beyond a certain level of
complexity. After all, we do know that certain pathological mental states,
such as a desire for self-martyrdom, only show up at human levels of
intelligence.

Frankly, the notion that we are approaching a Singularity scares the hell
out of me. If I thought there were some viable way to prevent it, or even
to safely delay it, I'd probably lobby to do so. I'm not convinced that
there are any such options. As such, my personal goal is to be an early
adopter and hope that self-amplification isn't the mental equivalent of
jumping off a cliff.


