From: Charles Hixson (charleshixsn@earthlink.net)
Date: Tue Jun 17 2003 - 11:55:50 MDT
Brett Paatsch wrote:
>Rafal Smigrodzki wrote:
>
>>### UFAI.
>>
>
>Unfriendly AI? That *is* interesting.
>
>....
>
>Not being an AI enthusiast of the pedigree of certain others
>on this list, I wonder:
>
>1) What is the probability of General AI in the next 20 years,
>of *either* the friendly or unfriendly variety? (I'm thinking
>about the massive parallelism of brains, and that maybe a
>subjective experience is a necessary prerequisite for an "I",
>and might not be so trivial to engineer.)
>...
>
>
Actually, a UFAI would not necessarily need to be a general
intelligence. One can conceive of an appliance that delegates, say,
long-term planning to an external source, and is merely the total
manager of manufacturing, distribution, etc. for a major economy (or
even a relatively minor one). That external source of long-term
planning would be a center of vast power, and thus of vicious
political fights which might well erupt into such things as, say,
murder. Once the most ruthless entity had secured control, it would
first take steps to secure its position. This might well lead to it
making more enemies in the process. To improve its standing, it might
seek to seize resources from weaker neighbors. Etc.

One could easily extend this scenario until the controller became more
paranoid than Stalin, and more ruthless. He would, of course, take
steps to secure himself against any foreign enemies... and then to
destroy them. Etc.

It's an old pattern in humanity, but if the power of the dictator were
derived from his control over the machine, and the machine were
self-maintaining (and able to manage local matters without needing
external direction), then the pattern could be taken to a whole new
level of dystopia. The only salvation for humanity might be that
dictators need people to push around. But what orders would he leave
with the machine for when he finally died?

A general-intelligence UFAI could be as bad, but I doubt that it could
be much worse. Concentrations of power are becoming deadly menaces to
the survival of humanity, largely because they *DO* tend to attract
lunatics... and not all lunatics gibber at you. You don't necessarily
recognize them in advance.