> I do see the force of the claim, but it might still be wrong, since
> emergence of effective AI might, after all, depend on ramping up the
> substrate technologies by brute force via Moore's law.
> Damien Broderick
Emergence of AI will depend, I think, on heuristics and programming. More than
half the humans on Earth have IQs in the 100 range or above, but only a tiny
percentage of humans can do calculus. This means the bottleneck is in the
software, not the hardware. It's easy to breed Homo sapiens, but difficult to
teach them to run factories, design space shuttles, and write code. Similarly,
supercomputers can outperform humans in raw computing power, but no one yet
knows how to teach them to learn on their own. Recent research suggests that
massively parallel neurocomputers may figure out for themselves how to learn
what they need in order to be considered intelligent:
"We are looking into principles of learning and self-organization that allow a
system to develop and become intelligent by itself. A major premise in biology
is that there is structure in the world and this structure is recognized by the
brain. Based on this structure, the brain can bootstrap itself, through learning
and self-organization, to become better and better.
"This is the process that we would like to understand. It's clear this can't
happen from nothing, so what must be happening is that the genome is giving us
some information about how to learn -- what the right learning algorithms are
for the problems that the system is facing. And we have to figure out what the
basic ideas or these biases are which come from the genetic coding. But from
there onward we believe that the brain can self-organize.
"This is a different way of thinking. It is basically, in the end, a huge neural
network that self-organizes automatically to reach this level of intelligence
and competence that we see in biological systems."
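The self-organization the quote describes can be illustrated with a toy example. The sketch below is not from the post itself; it is a minimal 1-D Kohonen self-organizing map in Python, chosen as one well-known instance of a network that, starting from random weights, adapts itself to the structure of its inputs with no external teacher:

```python
import numpy as np

# Toy illustration (not from the original post): a 1-D Kohonen
# self-organizing map. Weights start random; repeated exposure to
# structured input alone reshapes them -- a small instance of a
# system "recognizing structure in the world" by self-organization.

rng = np.random.default_rng(0)

n_units = 10
weights = rng.random(n_units)  # random initial weights in [0, 1)

def train(w0, steps=2000, lr=0.5, radius=2.0):
    w = w0.copy()
    for t in range(steps):
        x = rng.random()                 # sample from the "world": uniform [0, 1)
        bmu = np.argmin(np.abs(w - x))   # best-matching unit
        # Gaussian neighborhood: units near the winner also move toward x
        d = np.abs(np.arange(len(w)) - bmu)
        h = np.exp(-(d ** 2) / (2 * radius ** 2))
        decay = 1.0 - t / steps          # learning rate decays over time
        w += lr * decay * h * (x - w)
    return w

trained = train(weights)
print(trained)
```

After training, the weights typically spread out across the input range and tend toward a topologically ordered layout, without any of that order being programmed in explicitly. The learning rule, neighborhood width, and decay schedule here stand in for the "biases from the genetic coding" mentioned in the quote: a small amount of built-in structure from which the rest is learned.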
This archive was generated by hypermail 2b30 : Mon May 28 2001 - 09:50:15 MDT