>From: "Michael S. Lorrey" <retroman@turbont.net>
>
> > But what else (if anything) will motivate them? Will there be any
>"good"
> > for them other than information? Any "evil" other than ignorance? Will
> > they care at all about such trivialities as emotion, fairness,
>compassion
> > and pain? Whether or not I want to survive the ascendancy of strong EI
>will
> > depend largely upon this question. Unfortunately I'll never know the
> > answers unless and until I survive to that time. Or unless I am
>persuaded
> > by the musings of this list. I can't wait to hear your thoughts.
>
>Based on my arguments above, I think that since uploaded humans will
>continue to think of themselves as human, their motivations will be very
>similar to what they are now; there will merely be increased growth and
>maturity in an uploaded human's thinking. Because the uploaded human will
>have greater access to information and a greater capacity to make rational
>decisions, human society will become closer to the Bayesian ideal, so less
>strife and stupidity will of course occur (except by those who refuse to
>augment themselves, of course), although I don't know if it will disappear
>entirely.
So do you anticipate that strong EI (or AI if you prefer) will not precede
uploaded human minds? It seems to me (granted I am *not* a scientist) that
we are much closer to EI than to uploading brain patterns. It also seems to
me that once EI becomes strong enough it will be able to "take over." At
that point we may very well be unable to upload our minds without the
permission of the EI running the show. From your comments it seems that you
feel strong EI will only come after, or even as a result of, human mind
uploading?
-Zero
This archive was generated by hypermail 2b29 : Thu Jul 27 2000 - 14:04:03 MDT