Jon Reeves insightfully writes:
> The extermination (or enslavement) of several billion people would surely
> require an expenditure of a considerable amount of time and energy - even
> for an SI. What motivation could it have for doing so?
Some of the scientists who build SIs may feel the project entails too much risk.
To satisfy the expectations of such SI workers, the SI may kill everyone...
leaving the queasy workers for last.
> It seems to me that most sapients consider diversity to be a very important
> thing - why would an A/SI not think the same? To keep humans around would
> require very little expense to the SI, while at the same time providing it
> with an enormous pool of intellectual diversity.
No, because the SI would find about as much stimulating diversity in human
relations as humans find in poetry written by monkeys.
> The notion that an SI would just wake up and decide to wipe out humanity
> strikes me as absurd - to do so would require tremendous arrogance, and I
> would hope that something supposedly more intelligent than us would realise
> how stupid it was being.
There's nothing inherently stupid about wiping out Homo sapiens sapiens
(especially if your friends need the space).
--J. R.
"Something beckons within the reach of each of us
to save heroic genius. Find it, and do it.
For as goes heroic genius, so goes humankind."
--Alligator Grundy, _Analects of Atman_