Zero Powers wrote:
>
> Seems to me any super-intelligence would have to have virtually zero
> curiosity to sit around granting human wishes and obeying human orders for
> lack of having any other interests worth pursuing. And, it seems like a
> super-intelligence without curiosity would be impossible to engineer.
> Unless it has some internal motivation to seek out knowledge on its own, all
> its knowledge would have to be force-fed.
Sure. The motivation to seek out knowledge is so that said knowledge can be
used to better grant human wishes.
But if you mean: does high intelligence automatically bring a surge of
"pleasure" every time you see a new solar system? I would have to say no.
I don't think that minds-in-general need pleasure in the first place.
Just goals and the ability to model reality are enough.
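
To make the "goals plus a world model" point concrete, here is a minimal
sketch (in Python; the toy domain and all names are my own illustration,
not anything from the original post) of an agent that plans purely by
checking a goal predicate against outcomes predicted by its model of
reality. Note that no reward or "pleasure" scalar appears anywhere in the
loop:

    # Minimal sketch: a planner driven by a goal predicate and a world
    # model alone.  No reward, utility surge, or "pleasure" signal exists
    # anywhere in this loop.  (Toy domain and names are hypothetical.)
    from itertools import product

    def plan(state, goal, model, actions, max_depth=5):
        # Enumerate action sequences shortest-first; simulate each with
        # the world model and return the first whose predicted outcome
        # satisfies the goal predicate.
        for depth in range(1, max_depth + 1):
            for seq in product(actions, repeat=depth):
                s = state
                for a in seq:
                    s = model(s, a)      # predict, don't act
                if goal(s):
                    return list(seq)
        return None                      # goal unreachable within horizon

    # Toy domain: walk along the integer line to reach position 5.
    model = lambda s, a: s + a           # deterministic transition model
    goal = lambda s: s == 5              # a predicate, not a reward
    actions = [-1, +1, +2]

    print(plan(0, goal, model, actions))  # -> [1, 2, 2]

The instrumental knowledge-seeking described above fits the same frame:
if the model were uncertain, gathering data to improve it would simply be
another subgoal in service of the goal predicate, not an end pursued for
its own felt satisfaction.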
--
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence