Any system that can't think its way out of a local optimum has
insufficient reflectivity or self-awareness to be successful as a seed AI.
If the system can't say, "Gee, I guess I'm at a local optimum, better
think my way out of it," then it probably lacks the basic cognitive
faculties to understand its own design at all, much less improve it.
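
As a toy illustration only (nothing here is from the original post): a plain
hill climber is "stuck" in exactly the sense described above unless something
in its loop notices the stagnation and acts on it. The Python sketch below
uses a made-up hill_climb_with_escape() with arbitrary patience and kick
parameters; detecting a run of failed steps and deliberately jumping away is
the crudest possible stand-in for "noticing a local optimum and thinking your
way out of it" -- nowhere near real reflectivity, but it marks the gap being
pointed at.

    import random

    def hill_climb_with_escape(f, x0, step=0.1, kick=2.0,
                               patience=25, iters=2000):
        """Maximize f by local random steps; after `patience` failed
        steps in a row, treat that as being at a local optimum and
        jump to a new region instead of climbing further."""
        x, val = x0, f(x0)
        best_x, best_val = x, val      # remember the best point ever seen
        stuck = 0
        for _ in range(iters):
            cand = x + random.uniform(-step, step)
            cand_val = f(cand)
            if cand_val > val:
                x, val, stuck = cand, cand_val, 0
            else:
                stuck += 1
            if val > best_val:
                best_x, best_val = x, val
            if stuck >= patience:
                # "Gee, I guess I'm at a local optimum,
                #  better think my way out of it."
                x = x + random.uniform(-kick, kick)
                val = f(x)
                stuck = 0
        return best_x, best_val

    if __name__ == "__main__":
        import math
        # A bumpy objective with many local maxima.
        bumpy = lambda x: -0.1 * x * x + math.sin(5 * x)
        print(hill_climb_with_escape(bumpy, x0=4.0))

Without the stuck-counter branch, the climber simply sits at whatever local
peak it first reaches; the point of the quote is that a seed AI needs at
least that much self-monitoring, and in fact vastly more.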
--
sentience@pobox.com          Eliezer S. Yudkowsky
          http://pobox.com/~sentience/tmol-faq/meaningoflife.html
Running on BeOS              Typing in Dvorak           Programming with Patterns
Voting for Libertarians      Heading for Singularity    There Is A Better Way