> In order to build useful siliconeural interfaces, you need some idea of
> what the brain's internal coding system is. To build true telepathic
> interfaces, you need to decode high-level cognition and concepts, things
> that live in the prefrontal cortex, not just
> relatively "easy" encodings like the visual cortex or the highly modular
> and localized emotions of the limbic system. I don't know what the
> high-level codes are, but I can tell you where to start looking.
I find the notion that there exists some universal 'language of mind' encoding system underlying cognition extremely implausible. It seems much more likely that each individual's representation system is highly idiosyncratic. If everyone's high-level codes are different, the prospects for wire-to-brain interfaces seem bleak.
> Certain thoughts - certain complex intellectual structures, presumably
> distributed all over the frontal cortex - reliably create certain
> emotions. This is what I call a semantic binding, and it's an amazing
> thing, when you think about it. Imagine translating the thoughts into
> the kind of propositional-logic semantic networks LISP-based AIs use -
> like "Bob broke my ribs" to "hurt(person72, me)" - which, in one of
> those AIs, really translates to "G023(G052, G187)".
> (Yes, Eugene Leitl, I know the mind's not a semantic network; that is,
> in fact, exactly my point.)
> How does the mind know to bind the thoughts to the emotion of anger?
> How can the limbic system reliably single out symbol G023? If you look
> at the simpler, more modular limbic system, and at the codes it uses
> to interface with high-level cognition, you'll find a set of
> regularities that exist reliably, and you can use them as the Rosetta
> Stone of the mind.
> Once again, it all boils down to my own little specialty - the interface
> between emotions and cognition.
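For concreteness, the reduction the quoted passage describes - human-readable predicates collapsing into opaque generated symbols - can be sketched as follows. This is a minimal illustration in Python rather than LISP; the counter start, symbol names, and helper functions are all hypothetical, chosen only so the output resembles the "G023" style of the example:

```python
import itertools

# Gensym-style counter; starting at 23 is arbitrary, just to echo "G023".
_counter = itertools.count(23)

def gensym():
    """Return a fresh opaque symbol like 'G023'."""
    return f"G{next(_counter):03d}"

# Maps human-readable names to the opaque symbols the AI actually uses.
symbol_table = {}

def intern_symbol(name):
    """Intern a readable name, reusing its opaque symbol if seen before."""
    if name not in symbol_table:
        symbol_table[name] = gensym()
    return symbol_table[name]

# "Bob broke my ribs" -> hurt(person72, me) -> opaque predicate triple
proposition = (intern_symbol("hurt"),
               intern_symbol("person72"),
               intern_symbol("me"))

print(proposition)  # -> ('G023', 'G024', 'G025')
```

The point of the sketch is that nothing inside the program attaches any meaning to `G023`; the readable names exist only in an external lookup table. That is precisely the symbol-grounding gap the passage is pointing at: a limbic system (or an external interface) has no obvious way to reliably single out `G023` as "hurt" unless some regular encoding bridges the two.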
If you could determine what emotions a thought evokes in me and then stimulate those same emotions in you, you would have accomplished a great deal, but you would still have missed most of that thought's semantics. How, for instance, could the semantics of a thought like 'He's taller than me' be reduced to feelings?