Crosby_M
Fri, 7 Mar 1997 12:23:06 -0500

On Thu, 6 Mar 1997, I wrote:
<Human minds are not *merely* neural nets, even though neural nets may
be an appropriate model for portions of the human mind.>

Robert Schrader then asked:
<Please elaborate. What kind of part are these non-neural-net parts?
And how would you model them?>

Sorry, I don't really know. The point I was primarily trying to make
was that something that appears to be an inessential artifact at one
level can be appropriated as a driving force at another. When we do
get a grip on the mechanisms involved in the *linking* of different
hierarchical and relational levels in complex systems, I think we'll
have a better model for such things as creativity than we have with
uni-level 'hardware' models like Turing machines or basic neural nets.

I was just reading a January 1997 _Communications of the ACM_ article on
"Artificial Intelligence and Virtual Organizations". I like their
concise definition of ontology: "Ontologies are specifications of
discourse among multiple agents in the form of a shared vocabulary."
(This is sorta what Gregory Houston was recently talking about - on a
semi-synchronous thread - as a precondition for close friendships,
while Kathryn Aegis, on the other hand, was talking about friendships
being based more on shared commitments than on common interests.)

Anyway, this CACM article went on to describe some of the specialized
agents needed in a 'virtual organization': domain experts, wrapper
agents (to translate outputs from the domain experts into the shared
vocabulary), and facilitator or broker agents (to provide a reliable
communication layer between the specialized agents).
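The division of labor among those agent roles can be sketched in a few
lines of code. This is only an illustrative toy, not anything from the
CACM article itself; all the class names, the sample vocabulary, and the
"supplier" agent are invented for the example. A domain expert speaks
its own local dialect, a wrapper agent translates that into the shared
vocabulary (the ontology), and a facilitator routes queries between
registered agents:

```python
# Hypothetical sketch of the agent roles described above. The shared
# vocabulary plays the part of the ontology: the terms all agents have
# agreed to use in discourse with one another.
SHARED_VOCABULARY = {"cost", "delivery_date"}

class DomainExpert:
    """A specialist that reports in its own local terms."""
    def report(self):
        return {"prix": 100, "livraison": "1997-03-10"}

class WrapperAgent:
    """Translates a domain expert's output into the shared vocabulary."""
    def __init__(self, expert, mapping):
        self.expert = expert
        self.mapping = mapping  # local term -> shared term
    def report(self):
        raw = self.expert.report()
        translated = {self.mapping[term]: value for term, value in raw.items()}
        # A wrapper's whole job is to emit only ontology terms.
        assert set(translated) <= SHARED_VOCABULARY
        return translated

class Facilitator:
    """A broker providing a simple communication layer between agents."""
    def __init__(self):
        self.agents = {}
    def register(self, name, agent):
        self.agents[name] = agent
    def query(self, name):
        return self.agents[name].report()

facilitator = Facilitator()
wrapper = WrapperAgent(DomainExpert(),
                       {"prix": "cost", "livraison": "delivery_date"})
facilitator.register("supplier", wrapper)
print(facilitator.query("supplier"))
```

Note that the facilitator never sees the domain expert's dialect at
all: by the time a message reaches the shared communication layer it is
already phrased in the common vocabulary, which is the whole point of
the wrapper role.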

The human brain also has similar specialized and generalized organs.
So, I'm trying to find a way to say that /even if/ the underlying
hardware or wetware is neural nets all the way up and down, at a
certain level of analysis the functional elements and their roles in
the overall ecology (e.g., the perceptual and intentional dynamics)
are able to encode things that you won't see if you look only at the
underlying mechanisms.

Mark Crosby