>From: Christopher McKinstry <email@example.com>
>Just one more point before I go off to read your 'Friendly AI'... if
>Kurzweil is right, and in the future I can scan my personality into a
>computer, the event will create an instant conflict simply because the
>copied version of myself will fight to the death not to be turned off by
>the original version. I would be just as friendly as I am now to my
>fellow 'virtual' humans (as long as I could verify they were virtual),
>but I would see 'real' humans as potentially very dangerous to my
>continued consciousness. No matter how friendly I am, I would have a
>very strong objection to my reality being externally controlled. Bad
>things will happen unless we can develop some form of protocol of trust,
>which I am not confident we can.
I'm a couple of days behind in reading this list, so the points below may
already have been raised. If so, apologies.
1. Why would the original you be a threat to the virtual you? What would
the original have to lose by the continued existence of the virtual?
2. Even if the original were highly motivated to exterminate the virtual,
wouldn't that be murder?
Life is good. Refuse to die.
This archive was generated by hypermail 2b30 : Fri Oct 12 2001 - 14:39:43 MDT