Re: Eugene's nuclear threat

From: J. R. Molloy (jr@shasta.com)
Date: Mon Oct 02 2000 - 20:04:07 MDT


Eugene Leitl wrote,

> I want to achieve transcension by not dying, thankyouverymuch.

Don't we all.
"I don't want to achieve immortality through my work. I want to achieve it
through not dying."
-- Woody Allen

> Duh. Do you think I'm a monster?

Nooo... You're the monster slayer.

> Sure, but no matter what you do, a few of the bright babies will wind
> up as pathological individuals (blame evolution) and will use their
> brightness to destroy themselves and us in the process.

And they would destroy AIs too.

> To show you all how truly evil I am, I propose an early screening
> program, identifying such people (thankfully, brilliant sickos are
> quite rare) and locking them up where they can't hurt themselves or
> others.

Do you have a preference as to the facility that will contain your "truly evil"
self?

> Sounds like a good plan. But people don't buy into delayed
> gratification, so it has to work in small increments.

Oh yeah, right. That explains why bankers rule the world. They just don't buy
into delayed gratification. And of course no one ever puts off gratifying their
desire to own a Mercedes until they finish medical school. That's why no one
ever finishes medical school.

> Our firmware is not made to cooperate in large groups or to deal with
> extreme threats. We're not smart and rational enough for that. If
> there ever was a use for human germline engineering, it's to boost our
> EQ and IQ.

Best argument I've read here in favor of developing AI, which is what
boosting human EQ and IQ would lead to.

--J. R.

It takes time to ruin a world, but time is all it takes.
-- Fontenelle


