Re: Upload rehearsal?

Anders Sandberg (nv91-asa@nada.kth.se)
Fri, 8 Nov 1996 17:56:49 +0100 (MET)


On Wed, 6 Nov 1996, Crosby_M wrote:

> In these recent discussions, no one appears to have considered the 'alternative'
> of 'ER', the type of things that Alexander Chislenko has written about
> ENHANCED REALITY at http://www.lucifer.com/~sasha/EnhancedReality.html

Actually, this is a very good point, both as a rehearsal and as a way
to start your gradual uploading (I plan to put together my own first
system quite soonish, in fact).

> The BIG problem with this scenario, however, is that it might kill the
> sacred cow of "H. accelerans". The need (under ER) for real-world
> interaction and the desire to make use of *all* available substrates for
> data storage, communication and computation makes it unlikely that
> gradually uploaded, or augmented, minds would be much faster than analog
> meat minds (at least initially). Also, an ER-based mind would likely want
> to use distributed sensing and computing components.

I don't think H. accelerans becomes impossible in this scenario, just
different. The brute-force HA would simply be a human with a
significantly accelerated mind, while a person using advanced ER could
have a myriad of search agents, remembrance agents, planners,
visualizers and interfacers running very quickly, feeding him with
information and suggesting what to do. If the user took the plunge, the
agents could even (assuming sufficient sophistication) act for him,
creating a fast, dynamic exoself where the human is the main core for
processing values and long-term goals (supported by a motivational
co-processor?). This is actually quite similar to how consciousness
seems to act: a slow, rather limited control system that influences the
much faster and more powerful unconscious subsystems.
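
As a purely hypothetical sketch of that structure (every name below is
my own invention, not any real system), it might look roughly like this
in Python: several fast agents push suggestions into a shared queue,
while a slow core only occasionally wakes up, ranks the suggestions
against its values, and decides what to endorse.

    import queue
    import random
    import threading
    import time

    suggestions = queue.Queue()

    def agent(name):
        """A fast subsystem: constantly proposes actions with a guessed utility."""
        while True:
            utility = random.random()
            suggestions.put((name, "proposal-%d" % random.randint(0, 99), utility))
            time.sleep(0.01)   # agents cycle ~100x faster than the core

    def core(values):
        """The slow core: wakes rarely, ranks proposals against long-term values."""
        for _ in range(5):
            time.sleep(1.0)    # deliberate, slow evaluation
            batch = []
            while not suggestions.empty():
                batch.append(suggestions.get())
            if batch:
                # endorse the proposal whose estimated utility best fits the core's values
                name, proposal, utility = max(batch, key=lambda s: s[2] * values["curiosity"])
                print("core endorses %s from %s (utility %.2f)" % (proposal, name, utility))

    for n in ("searcher", "rememberer", "planner"):
        threading.Thread(target=agent, args=(n,), daemon=True).start()

    core({"curiosity": 1.0})

The point of the toy is only the asymmetry: the agents generate far more
options than the core ever looks at, and the core's job is just to apply
its values to whatever has accumulated since it last woke up.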

> BUT, a serious
> disadvantage is that a gradual uploading approach, such as that proposed by
> Algimantas Malickas, might not be able to capture *all* of the memories and
> emotional/value preferences that might be buried in deep neural structures
> because, as Malickas admits:
> <The gradual uploading would be uploading of mind functions, not uploading
> of morphological brain architectures.>

Yes, this may be a serious problem (especially if you have the "wrong"
views on personal identity; people like me, who identify with their memes
and behavior, have less trouble with this). If the brain dies or is removed
before enough has been transmitted, we are left with a diffuse system with
no motivation and no initial memories. What can be transmitted seems to be
mainly memes and whatever we can encode.

-----------------------------------------------------------------------
Anders Sandberg Towards Ascension!
nv91-asa@nada.kth.se http://www.nada.kth.se/~nv91-asa/main.html
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y