Re: Why would AI want to be friendly?

From: J. R. Molloy (jr@shasta.com)
Date: Mon Sep 25 2000 - 20:38:04 MDT


Matt Gingell writes,

> The best bad option is better than the rest, I suppose, and if seed
> AI ends up being the way that works then it certainly can't hurt to
> program some affection into it. But it's rather like thinking a
> friendly, symbiotic, strain of bacteria is likely to eventually
> evolve into friendly people. The first few steps might preserve
> something of our initial intent, but none of the well-meaning
> intermediates is going to have any more luck anticipating the
> behavior of a qualitatively better brain than you or I.

We really can't *know* whether a friendly strain of bacteria is likely to
eventually evolve into friendly people without actually conducting an
experiment along those lines. Plainly, such an experiment with artificial
digital life forms would take minutes instead of millennia, so that's not a
problem.
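To give a sense of what such an in-silico experiment might look like, here is a
minimal sketch (all names and parameters are invented for illustration): a
population of digital organisms is seeded with a high "friendliness" value,
then subjected to many generations of selection that ignores that trait, so we
can watch whether the seeded disposition persists or drifts away.

```python
import random

def evolve(generations=1000, pop_size=100, mutation=0.05):
    """Track a seeded 'friendliness' trait across generations of drift."""
    # Each organism is reduced to one number in [0, 1]; seed everyone friendly.
    pop = [1.0] * pop_size
    for _ in range(generations):
        # Survival here is random, i.e. unrelated to friendliness, so any
        # change in the trait comes purely from mutation and drift.
        survivors = random.sample(pop, pop_size // 2)
        # Each survivor leaves two mutated offspring, clamped to [0, 1].
        pop = [max(0.0, min(1.0, s + random.gauss(0, mutation)))
               for s in survivors for _ in range(2)]
    # Mean friendliness of the final generation.
    return sum(pop) / len(pop)
```

A run of a few thousand generations finishes in well under a minute on any
modern machine, which is the point of the paragraph above: the question can at
least be probed empirically, on fast timescales, rather than argued from
intuition alone.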

--J. R.

"A diamond is a chunk of coal that made good under pressure."
Anonymous
[Amara Graps Collection]



This archive was generated by hypermail 2b29 : Mon Oct 02 2000 - 17:39:04 MDT