From: "Harvey Newstrom" <mail@HarveyNewstrom.com>
>Eliezer S. Yudkowsky wrote,
> > That said, their AI theory most certainly appears to suck, and I would
> > put
> > their chance of passing the Turing test in ten years at zero, unless the
> > organization shifts to a completely new theory.  My guess is that the
> > incoherent babblings they've achieved are every bit as dumb as they
> > appear, and not simple cases of something that could scale up to greater
> > complexity.
>
>I was disturbed by their examples, as well.  They claimed the examples were
>nonsensical in the sense of a child's ramblings.  That did not appear to me
>to be the case.  Children do not randomly make non sequiturs.  They play
>games, have imaginary friends, and change topics frequently.  But they do
>not accidentally pull up the wrong information from their brain and spew it
>out in answer to the wrong question.  I also didn't understand the concept
>of the computer wanting bananas on a trip.  Unless this is an android that
>simulates eating and has taste sensors, this is meaningless.  How did the
>computer "learn" to "like" bananas?  If it can't eat, how does it "like"
>them?  This does not compare to a child who likes bananas.  I found the
>examples to be more counter-examples of AI.  (Unfortunately!)
The impression I got was that it wanted to take bananas for the monkeys, 
because it had learned that monkeys like bananas.  I could be wrong tho.
-Zero
"I'm a seeker too. But my dreams aren't like yours. I can't help thinking 
that somewhere in the universe there has to be something better than man. 
Has to be." -- George Taylor _Planet of the Apes_ (1968)
This archive was generated by hypermail 2b30 : Fri Oct 12 2001 - 14:40:12 MDT