Re: Why would AI want to be friendly?

From: J. R. Molloy (jr@shasta.com)
Date: Fri Sep 29 2000 - 15:36:26 MDT


Eugene Leitl writes,

> An immature upload is a full-detail emulation of a human's low-level
> neuronal processes (at, say, compartmental emulation level) based on a
> data set more or less immediately derived from digitized
> neuroanatomy. Apart from being potentially immortal and having
> a radically expanded control of itself and its (rendered)
> surroundings it is very human.

It is very human and very hypothetical, since nothing like this presently
exists, right?

> A mature upload is something which has made extensive edits to the
> above dataset (or is derived from the above original dataset in a
> Darwinian fashion), and is encoded in a fashion which runs more
> efficiently on the (computronium) hardware (both the hardware and the
> encoding will have to undergo several optimization steps for this to
> succeed). It is much faster than an immature upload (operates on a
> time scale 10^3..10^6 times faster than we do), and requires a fraction of its
> resources to operate.

Since "Computronium" is a hypothetical substance in which each processing cell
of a CA is reduced to atomic scale and arranged in a crystal lattice, we can rule
this out as a viable element of any actual AI that could currently evolve from
genetic programming, right?
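
For reference, "CA" here means cellular automaton: a grid of cells, each
updated in parallel by the same local rule. A minimal sketch in Python, purely
illustrative (the rule, grid size, and step count are arbitrary choices of
mine, not anything proposed in this thread):

    # Minimal 1D cellular automaton (Rule 110), illustrative only.
    # Each array entry is one abstract "processing cell"; computronium is the
    # hypothetical physical substrate that would pack such cells at atomic scale.
    RULE = 110  # chosen only because this rule is known to be computation-universal

    def step(cells):
        """Advance the CA one generation, with wrap-around boundaries."""
        n = len(cells)
        return [
            (RULE >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
            for i in range(n)
        ]

    if __name__ == "__main__":
        cells = [0] * 31 + [1] + [0] * 31  # single live cell in the middle
        for _ in range(20):
            print("".join(".#"[c] for c in cells))
            cells = step(cells)

Whether anything like this could ever be run on atomically dense hardware is,
of course, exactly the point in dispute.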

> Another definition is that a population of mature AIs can't get
> snuffed out by AIs created by evolutionary algorithms. They can
> persist in a stable equilibrium with them (nice trick, that).

Well, they *could* persist if they had any chance of becoming real (nice fantasy,
that).

> There is supposedly a traversable development trajectory from immature
> to mature. Since it's unlikely any of us will be a player in that
> game, we will be essentially dependent on their freely flowing milk of
> posthuman kindness for 1) protecting us from AIs, 2) not snuffing us out
> instead of AIs

Now I see why you don't trust AI. You're afraid it might unravel your
suppositions and push your face into the reality that life is not a game.

> > (And what's wrong with letting the Amish go extinct?)
>
> In principle, nothing. However, being human, I would object.
> They should be given a choice; those who reject the choice I would
> still upload by force (now things do become ethically iffy), erasing
> their short-term memory of what just happened and giving them an illusion
> of subjective continuity of existence. As long as I can spare the
> resources, they would be free to pursue what they did before, of
> course being given opportunities to leave their self-imposed seclusion
> at any step of the game, and certainly if/when I can't afford the
> resources.

That's very generous of you. I suppose you would also upload Hitlerists,
Stalinists, mass murderers, child molesters, and other offshoots of human
deviation and defective brain function (nice collection, that).

> > So we should speed the occurrence of mature uploads.
>
> Yes. I think in 10-15 years we should be able to make individually
> realistic computational models of nematodes, provided we establish and
> fund a large project now. After that, you have to scale this up to
> higher animals in a series of steps. (E.g. C. elegans,
> D. melanogaster, <some suitable intermediate here>, M. musculus,
> H. sapiens).

Thank you for providing yet another excuse for me to get drunk. I knew you were
intelligent, but I had forgotten that intelligence often degenerates into
insanity (as it has in your case).

> Once we can simulate nematodes, perhaps people will start seeing value
> in developing means for vitrifying brains of terminal and freshly dead
> people right in the here and now. Meanwhile, several thousand people per
> hour will continue dying irreversibly while we
> continue waffling about AI, SI, nanotechnology, and properties of
> metallic xenon and unicorns, and whether their horns are really
> Mohs>10.

Okay, now I better understand why the military has such a huge investment in AI
(this just confirms what I already knew). The world has filled up with unhinged
minds and loose cannons of speculative thought. Sane people will need all the AI
we can muster in order to properly manage the onslaught of crazed engineers who
have no clue about their own detachment from reality.

"When we remember we are all mad, the mysteries disappear and
life stands explained." --Mark Twain


