Paradox--was Re: Active shields, was Re: Criticism depth, was Re: Homework, Nuke, etc..

From: John Marlow (
Date: Thu Jan 11 2001 - 21:45:14 MST

Okay, call me self-aggrandizing, but this has for some
time been my take on entrusting our fates to machines:

Marlow's Paradox:

“We cannot entrust our fate to machines without
emotions, for they have no compassion; we cannot
entrust our fate to machines with emotions, for they
are unpredictable.”

Basically, we can't trust machines.

Right now the armed parties are spread across the
globe. True, the US could hammer the rest of the world
combined--but would suffer immense damage in doing so.

Handing all of the major hardware to a single, INHUMAN
party is insanity. Absolute, unmitigated lunacy.
Anything purely logical would exterminate us as
unpredictable and dangerous. Anything emotional is
itself unpredictable and dangerous.

To top it off, no nation is going to willingly fund
and implement an impartial, all-powerful AI. Any nation
funding or implementing such a thing will make damned
sure it IS partial--to them. And probably that it has
backdoors as well.

john marlow

--- "Eliezer S. Yudkowsky" <>
> John Marlow wrote:
> >
> > Well, uhmm--creating something smarter than we are,
> > and then handing it all the weapons, doesn't strike me
> > as a particularly bright idea. To say the least.
>
> It strikes me as a brighter idea than handing all the weapons to a human.
> We *know* what humans do with weapons.
>
> Humanity has to deal with greater-than-human intelligence one of these
> days. If transhumanity is benevolent, we're home free; if not, we're
> screwed; that's it. The future looks like this:
>
> Nanotech followed by >H:
> Two risks: A successful active shield can't build a Friendly AI.
> >H followed by nanotech:
> One risk: A successful Friendly AI can help deal with nanotech.
>
> -- -- -- -- -- -- -- --
> Eliezer S. Yudkowsky
> Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2b30 : Mon May 28 2001 - 09:56:18 MDT