Re: Uploads and betrayal

Charlie Stross (charlie@antipope.org)
Mon, 29 Nov 1999 09:58:03 +0000

On Sun, Nov 28, 1999 at 03:00:48PM -0600, Eliezer S. Yudkowsky wrote:
> > and you have a pretty strong argument
> > for extinction. Now, where does that 70% figure come from??
>
> So why are we still here? This has to have happened at least once
> before in the Universe.

So postulate that it happened, but it's still outside our light cone. (Cf. collapse of the vacuum state.)

> > > Trust me: I don't think I'm infallible.
> >
> > But nonetheless you are prepared to act as if you *were*
> > infallible...The moment you activate your ASI Golem, you
> > and you alone will have passed judgement on the world,
> > using your finite wisdom.
>
> Like I said in the other post, the SI passes judgement. I'm not the
> smartest being in this chain of causality.

Hate to butt in so late, but I find the whole framework of this debate somewhat disturbing.

Even without invoking silliness like Cartesian dualism or Roger Penrose's theories, it's important to bear in mind that we don't *yet* know enough to know whether uploading is even possible. F'r'ex, suppose significant computational work is being done by the microtubule skeleton within each neuron; all we get to see (and base our estimates of the complexity of the brain on) is a signal emerging from a much more complex system, while critically important things like long-term memory are encoded internally. _That_ sort of mistake could jack up the computational complexity of the human brain by, say, ten or twelve orders of magnitude, and I wouldn't rule it out until we _know_ how long-term memory is encoded. (I'm not current with cutting-edge neurobiology, to say the least, but I'm under the distinct impression that this isn't a cut-and-dried issue.) More to the point, such an "oops" mis-estimate jacks the intelligence of an SI relative to ourselves _down_ by an equivalent degree, because the same hardware then buys proportionately fewer mind-equivalents.
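Just to make that arithmetic concrete, a quick back-of-envelope sketch -- every figure below is a round-number illustration, not a measurement, and the tubulin numbers in particular are Penrose/Hameroff-school guesses that may well be nonsense:

import math

# Illustrative round-number assumptions, NOT measurements:
NEURONS             = 1e11   # neurons in a human brain
SYNAPSES_PER_NEURON = 1e3    # synapses per neuron
FIRING_RATE_HZ      = 1e2    # effective signalling rate per synapse
TUBULINS_PER_NEURON = 1e7    # tubulin dimers per neuron (Hameroff-style guess)
TUBULIN_SWITCH_HZ   = 1e9    # hypothetical switching rate per dimer

# Conventional estimate: count synaptic events as the unit of computation.
synaptic_ops = NEURONS * SYNAPSES_PER_NEURON * FIRING_RATE_HZ

# "Oops" estimate: count tubulin state changes inside every neuron as well.
microtubule_ops = NEURONS * TUBULINS_PER_NEURON * TUBULIN_SWITCH_HZ

print("synaptic estimate:    ~1e%d ops/s" % round(math.log10(synaptic_ops)))
print("microtubule estimate: ~1e%d ops/s" % round(math.log10(microtubule_ops)))
print("gap: ~%d orders of magnitude" %
      round(math.log10(microtubule_ops / synaptic_ops)))

The exact exponents don't matter; the point is that an error hiding below the level we can observe from outside the neuron scales with the neuron count.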

Then we have the question of _why_ an uploaded SI might take action detrimental to not-yet-uploaded people. If Moravec (in Robot) is correct about the computational complexity of a human mind, there isn't going to be any immediate resource contention issue; there'll be more than enough computronium to go round until the SI(s) have elaborated themselves to make use of resources far beyond anything one human mind might need. This _will_ take time. I'd therefore expect there to be a window of opportunity between uploading becoming possible and competition among SIs for computational resources impacting un-uploaded people. (A window that may well last for many years -- consider the timescale for converting the whole solar system into an MB, for example.)
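To put very rough numbers on that window: the ~1e14 ops/s per mind is Moravec's figure from Robot; everything else below (the hardware base when uploading first works, the capacity of a fully converted solar system, the doubling rate) is a guess of mine, purely for illustration:

import math

# Illustrative figures; only the first is Moravec's, the rest are guesses.
HUMAN_MIND_OPS      = 1e14   # ops/s for one human-equivalent mind (Moravec, Robot)
EARLY_UPLOAD_HW_OPS = 1e18   # guess: total hardware available when uploading first works
MATRIOSHKA_OPS      = 1e42   # guess: whole solar system converted to computronium
DOUBLING_TIME_YEARS = 1.5    # Moore's-law-style doubling, if it holds that long

# How many human-scale minds could day-one hardware support?
minds_at_start = EARLY_UPLOAD_HW_OPS / HUMAN_MIND_OPS

# How long before SIs could plausibly saturate a solar-system-scale substrate?
doublings = math.log2(MATRIOSHKA_OPS / EARLY_UPLOAD_HW_OPS)
years_to_saturation = doublings * DOUBLING_TIME_YEARS

print("minds supportable at the start: ~%g" % minds_at_start)
print("doublings to fill the solar system: ~%d" % round(doublings))
print("window before serious resource contention: ~%d years" % round(years_to_saturation))

Even if those guesses are off by several orders of magnitude either way, the number of doublings involved means the window is measured in decades, not months.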

Where's this paranoia coming from?