Re: Colossus and the Singularity

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu Jan 25 2001 - 23:02:19 MST


SPOILER ALERT

Jim Fehlinger wrote:
>
> I'm thinking
> particularly that the unfolding of events in this movie fits Eliezer Yudkowsky's
> characterization of the Singularity as being initiated by a runaway positive
> feedback loop. I hadn't fully appreciated this before. However, in the movie
> it's an accidental Singularity, not anticipated or planned either by Colossus'
> designer Charles Forbin or the computer's primary user, the U. S. government.

> The subject matter of this movie is probably too dark ever to be
> successfully parodied on _Mystery Science Theater 3000_

Actually, the Singularity Institute's Board of Directors just recently
watched this movie. All three of us. Silhouetted against the screen.

Leaving aside such MST3K-type material as watching a punched-tape computer
depicted as an AI (roughly equivalent to seeing a propeller airplane
flying through space), I would not say that I was sympathetic to Forbin
through this movie. It looked like something that would appear on a Fox
video special called "World's Most Predictable Disasters".

To surf out the Singularity, you accept the fact of a hard takeoff, design
for it, and plan for it. You don't build a huge self-improving AI and
then act all *shocked* when the AI improves itself beyond recognition.
That's just blind gibbering incompetence.

To recall some of the MST3K-type comments heard:

> DR. BLAKE MILLER: Well, we're still boss.
>
> FORBIN: Are we?

Eliezer (in Forbin's voice): "That didn't take long, even by our
standards."

> CLEO MARKHAM: It's not your fault!

Eliezer (exasperated): "Yes it is!"

> Perhaps the most reasonable interpretation is that Forbin is
> a deeply divided man, unable to get past his ambivalence toward
> the long-term implications of his chosen career (an ambivalence something
> like Bill Joy's angst, or as professed by Hugo de Garis and Kevin Warwick).

People who enjoy marinating in the melodramatic angst of being deeply
divided shouldn't become neurosurgeons, generals, futurists, or Friendly
AI programmers.

This is a high future-shock profession, and part of the job is being, if
you'll pardon a "Men In Black" metaphor, the one asking the aliens for
their passports while everyone else is running around screaming.

-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
