RE: Yudkowsky's AI (again)

Lyle Burkhead (lybrhed@earthlink.net)
Mon, 29 Mar 1999 14:44:13 -0800

Eliezer writes,

> Business: I believe we do have some high-financial-percentile folks
> reading this list. I would like to see you post... a list of
> what you're interested in funding (... Extropian business ideas? ... )

I tried that almost three years ago. No response. Extropianism isn’t about making money.

In another post Eliezer writes,

> The most realistic estimate for a seed AI transcendence
> is 2020; nanowar, before 2015. The most optimistic estimate
> for project Elisson would be 2006; the earliest nanowar, 2003.

To which den Otter replies

> Conclusion: we need a (space) vehicle that can
> move us out of harm's way when the trouble starts.
> Of course it must also be able to sustain you for
> at least 10 years or so. A basic colonization of Mars
> immediately comes to mind.

You have no idea how bizarre this discussion appears to an outsider. You guys are as out of touch with reality as the Scientologists. Maybe more so.

> So only the first human to Transcend winds up as a Power.

And it’s not going to be you. Nobody needs to sabotage your efforts. You are not capable of making an effort. You are too far out in fantasyland to have any effect on the physical world.

Anders writes

> I think a lot of the discussions about the emergence
> of Powers and SI are hampered by a lack of
> what Lyle Burkhead calls 'calibration' - are these ideas
> really checked against reality? Technological development
> does not necessarily jump, there are always economics involved,
> humans often act together in a social manner, if somebody
> has technology/knowledge X then it is very likely that
> many have it or are close to it too, Hollywood memes
> are not necessarily true and so on. I think I need to write
> something like www.geniebusters.org for uploading and the brain :-)

to which Billy Boy replies,

> Maybe you should - although I hope you'd do
> a better job of it than he did. Mr. Burkhead seems to be
> fond of making sweeping statements about what is possible
> in fields he obviously doesn't know anything about...

This kind of thinking weakens you; it is not the way to see reality clearly. On a battlefield, in business, or anywhere else, the one who sees clearly wins. Our way of thinking (“calibration”), exemplified by the geniebusters site, strengthens us. It does lead to clear perceptions.

We will meet in a few years -- bishop takes pawn, knight forks queen and rook, checkmate.

Lyle