> Humans, when they are first created, need spoon-feeding. They need a great
> deal of repetition of information to spot the basic patterns in it and learn to
> recognise things around them. Later on, they can learn in leaps and bounds,
> but at the beginning they learn slowly. It seems likely (to me, anyway)
> that an AI would start off learning slowly and then pick up speed as it
> went along. Certainly, early AIs would learn slowly enough for this to be
> noticeable.
To make human-equivalent AI we will need a larger knowledge base than any one research group is likely to want to code. Sharing knowledge bases between projects is perfectly feasible, and I expect it to become more common as AIs become more complex. The key point is that once you have a body of information encoded in one AI program, you can export it to another one.
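The export idea can be sketched in a few lines. This is a toy illustration under assumed names: the triple schema and the `knowledge_base` structure are hypothetical examples, not any real project's format.

```python
import json

# Hypothetical knowledge base: facts encoded as subject/relation/object
# triples. The schema is an assumption made for this sketch.
knowledge_base = [
    {"subject": "sparrow", "relation": "is_a", "object": "bird"},
    {"subject": "bird", "relation": "can", "object": "fly"},
]

# Program A exports its encoded knowledge as plain structured data.
exported = json.dumps(knowledge_base)

# Program B imports it and builds on it immediately,
# without re-learning the same facts from scratch.
imported = json.loads(exported)
imported.append({"subject": "penguin", "relation": "is_a", "object": "bird"})
print(len(imported))  # 3 facts after the import plus one addition
```

The point is only that encoded knowledge, unlike knowledge locked in a human brain, is data that survives the round trip between programs intact.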
> We also learn as fast as we can process data. Admittedly we don't force
> ourselves to learn as fast as we possibly could (most people aren't that
> enthusiastic), but we still spend months/years building those first layers
> of our neural network into recognizing basic objects and the relationships
> between them, and it's only years later that we become able to spot the
> fine detail of the relationships and can make accurate theories about them.
No, we don't. The human brain deals with a constant data stream of tens (maybe hundreds) of megabytes per second. You remember only a tiny fraction of that - maybe a few bytes per second if you're paying close attention, otherwise even less. An AI that can handle the same sensory data would be able to store a much larger fraction of it, because its memory operates on the same time scale as its CPU.
It also doesn't get tired, and it won't get bored or distracted unless you want it to.
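The gap described above can be put into rough numbers. Every figure here is an illustrative assumption chosen to match the paragraph's orders of magnitude, not a measurement.

```python
# Back-of-the-envelope version of the retention gap. All figures are
# assumed for illustration: a "tens of megabytes per second" sensory
# stream and "a few bytes per second" of human retention.
sensory_stream = 50 * 10**6   # bytes/second reaching the brain (assumed)
human_retained = 5            # bytes/second remembered with close attention (assumed)

human_fraction = human_retained / sensory_stream
print(f"human retention: {human_fraction:.0e} of the stream")

# An AI whose memory runs at CPU speed could keep a far larger share.
# Suppose it stores just 1% of the stream (again, a purely assumed figure):
ai_retained = 0.01 * sensory_stream
print(f"AI stores {ai_retained / human_retained:,.0f}x more per second")
```

Even with these conservative guesses the AI retains on the order of a hundred thousand times more of its experience, which is the whole point of the comparison.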
Billy Brown, MCSE+I