Re: Chunking intelligence functions

From: Anders Sandberg (asa@nada.kth.se)
Date: Thu May 10 2001 - 02:40:38 MDT


torsdagen den 10 maj 2001 07:57 Eliezer S. Yudkowsky wrote:
> Spike Jones wrote:
> > > "Eliezer S. Yudkowsky" wrote: ...Minsky and
> > > Papert killing off the entire field of neural networks...
> >
> > Eliezer, I'm not up to speed on the state of the art in
> > neural nets, but I did notice there was not much said
> > about it in recent years after it seemed so promising
> > 12 years ago. Could you give us a one-paragraph
> > summary on your comment?
>
> No, this was long before then. I think there was a paper in 1967 and a
> book in 1969, if I recall correctly. Essentially, Minsky and Papert
> proved that a two-layer Perceptron (which was what a neural network was
> called back then) couldn't compute the XOR function. Which killed off the
> entire, then-nascent field of neural networks. Eventually someone noticed
> that a three-layer neural network (with a total of five neurons) could
> easily compute the XOR function and the entire field came back to life.

Yes, the book _Perceptrons_ was published in 1969 and really put a damper on
the hype. However, Minsky and Papert did show in the book that three-layer
networks could do XOR; it was just that they were pessimistic about finding a
training rule for such networks.

(Why care about XOR? Because it demonstrated that there were simple operations
the networks could not perform; the book gives several other examples of more
behaviorally relevant tasks, such as parity and certain pattern-recognition
problems, that simple perceptrons cannot do.)
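The five-neuron XOR network Eliezer mentions (two inputs, two hidden units, one
output) can even be written out by hand. A minimal sketch; the particular
weights and thresholds here are my own choice for illustration, not taken from
the book:

```python
def step(z):
    # threshold unit: fires iff weighted input reaches the threshold
    return 1 if z >= 0 else 0

def xor_net(x1, x2):
    h1 = step(x1 + x2 - 0.5)   # hidden unit 1 computes OR(x1, x2)
    h2 = step(x1 + x2 - 1.5)   # hidden unit 2 computes AND(x1, x2)
    # output unit: OR-but-not-AND, i.e. XOR
    return step(h1 - h2 - 0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_net(a, b))
```

No single threshold unit can do this, since XOR is not linearly separable;
the hidden layer is what buys the extra expressive power.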

> I am oversimplifying slightly - what really brought the field back was
> backpropagation, which is what enabled the training of multilayer networks
> - but the fact still remains that Minsky and Papert's eulogy was
> incredibly premature and did a lot of damage. Nothing remotely on the
> scale of Margaret Mead, though.

Actually, I think it did a good thing. It made the field more cautious and
less hyped. The book is good science and hence far from the claims of Mead.
It is the rest of the field that should be blamed: in the face of adversity
it simply caved in, and did not meet the challenge Minsky posed of finding a
multilayer training algorithm.
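For concreteness, here is a minimal sketch of the kind of multilayer training
rule that was missing: backpropagation on a 2-2-1 sigmoid network learning
XOR by gradient descent. This is my own toy illustration, not the historical
formulation; all names (w_h, w_o, lr) and parameter choices are assumptions.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# hidden layer: 2 neurons, each with 2 input weights + a bias
w_h = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
# output neuron: 2 hidden weights + a bias
w_o = [random.uniform(-1, 1) for _ in range(3)]

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def forward(x):
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_h]
    y = sigmoid(w_o[0] * h[0] + w_o[1] * h[1] + w_o[2])
    return h, y

def total_error():
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

err_before = total_error()
lr = 0.5
for _ in range(20000):
    for x, t in data:
        h, y = forward(x)
        # output delta: error times sigmoid derivative y*(1-y)
        d_o = (y - t) * y * (1 - y)
        # hidden deltas: error propagated back through the output weights
        d_h = [d_o * w_o[j] * h[j] * (1 - h[j]) for j in range(2)]
        w_o = [w_o[0] - lr * d_o * h[0],
               w_o[1] - lr * d_o * h[1],
               w_o[2] - lr * d_o]
        for j in range(2):
            w_h[j][0] -= lr * d_h[j] * x[0]
            w_h[j][1] -= lr * d_h[j] * x[1]
            w_h[j][2] -= lr * d_h[j]

err_after = total_error()
print("error before: %.3f  after: %.3f" % (err_before, err_after))
```

The point is only that the gradient through the hidden layer is perfectly
computable; nothing in the 1969 result forbade a rule like this, it just
took the field until the 1980s to adopt one.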



This archive was generated by hypermail 2b30 : Mon May 28 2001 - 10:00:04 MDT