"Robert J. Bradbury" wrote:
> If a huge fraction of what I do is *not* a brute force pruning
> of a search tree, then I don't know what it is. So in that
> respect Deep Blue and I are quite similar.
The vast majority of what you do is not a brute-force pruning of a search tree. It is either conscious, self-aware cognition; the execution of reflexes derived from previous cognition; or the execution of instincts, emotions, and other perceptual and cognitive processes which, though "autonomous" in the sense of not requiring conscious direction, are nevertheless vastly more sophisticated than Deep Blue. For one thing, they have a *lot* more raw power.
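For readers who haven't seen it, the "brute force pruning of a search tree" at Deep Blue's core is essentially minimax search with alpha-beta pruning: look ahead some number of plies, score leaf positions with a static evaluation function, and cut off any branch that provably cannot affect the result. A minimal sketch (the function and parameter names are illustrative, not Deep Blue's actual code):

```python
def alphabeta(state, depth, alpha, beta, maximizing, evaluate, moves, apply_move):
    """Search `depth` plies ahead, pruning subtrees that cannot
    change the outcome. `evaluate` is the static position scorer;
    `moves` lists legal moves; `apply_move` returns the next state."""
    legal = moves(state)
    if depth == 0 or not legal:
        return evaluate(state)          # static position evaluation
    if maximizing:
        best = float("-inf")
        for m in legal:
            best = max(best, alphabeta(apply_move(state, m), depth - 1,
                                       alpha, beta, False,
                                       evaluate, moves, apply_move))
            alpha = max(alpha, best)
            if alpha >= beta:           # opponent would never allow this line
                break                   # prune the rest of the subtree
        return best
    else:
        best = float("inf")
        for m in legal:
            best = min(best, alphabeta(apply_move(state, m), depth - 1,
                                       alpha, beta, True,
                                       evaluate, moves, apply_move))
            beta = min(beta, best)
            if alpha >= beta:
                break
        return best
```

Note how little "cognition" lives here: all the chess knowledge is packed into `evaluate`, exactly the "How am I doing?" component discussed below.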
> The nature of much of what Deep Blue did would have required
> little cognition. For the purpose for which it was designed,
> the cognition part can be done by the position evaluation software.
> Essentially the "Where am I?" or "How am I doing?" parts of
> its code. Now if you had Deep Blue A & Deep Blue B
> playing each other, and in each you created a feedback loop
> between the assessment of the board positions and the parameters
> involving search depth, pruning, and selection criteria
> then I would argue that you have something very close to "AI".
I'm not sure how you're defining "AI" here, but such a process certainly wouldn't be "intelligent". It would not be creative, self-modelling, or capable of representing general content. It wouldn't have goals or a world-model except in the same way a thermostat does. Deep Blue doesn't know that it knows.
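To make the thermostat comparison concrete, the proposed feedback loop amounts to something like the following sketch (purely hypothetical names and thresholds, not anyone's actual architecture): the engine deepens its search when its evaluation of the position drops, and searches shallower when comfortably ahead.

```python
def adjust_search_params(evaluation, params):
    """Hypothetical feedback loop: tune search depth and pruning
    from the current position's evaluation (in pawns). Losing ->
    search deeper and prune less; winning -> conserve time."""
    if evaluation < -1.0:
        params["depth"] = min(params["depth"] + 2, params["max_depth"])
        params["prune_margin"] *= 1.5   # prune less aggressively
    elif evaluation > 1.0:
        params["depth"] = max(params["depth"] - 1, params["min_depth"])
    return params
```

Whatever its practical merit as an engineering trick, this is setpoint comparison in the same sense as a thermostat's: a number crosses a threshold and a parameter moves, with no representation of goals, of the world, or of the system itself.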
--
email@example.com         Eliezer S. Yudkowsky
http://pobox.com/~sentience/tmol-faq/meaningoflife.html
Running on BeOS           Typing in Dvorak          Programming with Patterns
Voting for Libertarians   Heading for Singularity   There Is A Better Way