In _Coding a Transhuman AI_, I argue that a useful system of symbols and memory requires context-sensitive reconstruction from some (presumably more compact) coding, not just reloading a precise recording of the last set of data structures. For example, if "cat" just loads a picture of every cat I've ever seen, how do I imagine a "purple cat"? And once that reconstruction infrastructure is in place - in fact, I suspect the infrastructure evolved that way to begin with - the tremendous storage capacity needed to hold precise images doesn't seem worth the evolutionary candle.
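To make the distinction concrete, here's a minimal toy sketch of the two schemes in Python; the feature names, the two stores, and the reconstruct() function are purely illustrative assumptions of mine, not anything drawn from _Coding a Transhuman AI_:

```python
# A deliberately toy contrast between two memory schemes:
#  (1) eidetic storage, which can only replay exactly what was recorded, and
#  (2) a compact feature coding, where recall is reconstruction and novel
#      combinations like "purple cat" fall out of recombination for free.

# Scheme 1: eidetic storage - every remembered cat is a literal recording.
eidetic_store = {
    "cat": [("tabby", "gray"), ("siamese", "cream")],  # stand-ins for raw images
}

def recall_eidetic(concept):
    # Hands back only what was recorded; "purple cat" is impossible unless a
    # purple cat was actually seen and stored at some point.
    return eidetic_store.get(concept, [])

# Scheme 2: compact coding - a concept is a small feature bundle, and recall
# reconstructs a percept by binding the bundle to whatever context supplies.
concept_codes = {
    "cat": {"shape": "cat-silhouette", "texture": "fur", "color": "gray"},
}

def reconstruct(concept, **context):
    code = dict(concept_codes[concept])  # copy the stored compact code
    code.update(context)                 # context overrides or extends it
    return code                          # stands in for the reconstructed image

print(recall_eidetic("cat"))               # only literal replays
print(reconstruct("cat", color="purple"))  # a novel "image" from old parts
```

The point of the sketch is only that the second scheme gets imagination cheaply once reconstruction exists, while the first pays for exact storage and still can't compose anything new.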
The legendary theory that all memories are stored subconsciously seems to me like obvious bull. I recall, for example, the experiment showing that (gaussian) women are better than (gaussian) men at memorizing the locations of objects, whether relative locations or positions in spatial arrays; 70% better if you're testing casual, rather than deliberate, memory. I'm not getting into gender politics, but I do want to point out that a differential selection pressure implies a selection pressure, and one strong enough to produce a 70% performance gap; if we really have perfect memories stored away, why hasn't that selection pressure resulted in perfect performance?
--
sentience@pobox.com    Eliezer S. Yudkowsky
http://pobox.com/~sentience/tmol-faq/meaningoflife.html
Running on BeOS    Typing in Dvorak    Programming with Patterns
Voting for Libertarians    Heading for Singularity    There Is A Better Way