On Tue, 27 Mar 2001, Damien Broderick wrote:
> Great! Now go and find the sequence of canned bits you need *right now*,
> using as many caching levels or whatever as you like. Quick, the impromptu
> poetry contest is about to be won by that toad Blackadder--
As much as some people accuse me of being a know-it-all, I will
submit that my lookup-tables are indeed incomplete. Sadly, a mere
order of magnitude greater capacity in lookup tables suffices to
drive the average "Wayne" onto his knees praying, "We are not worthy"...
It is, unfortunately, going to get much, much worse.
> Sorry, I'm taking some shortcuts through the lookup table. Nearly 50 years
> ago, Noam Chomsky (it is widely held) showed that finite-state grammars
> were unequal to the task of explaining productive, creative human speech,
Damien, you of course exemplify this to the n'th power. I am
wondering, however, whether this theory would not need reexamination
in light of the parallel processing, and perhaps Darwinian selection,
that we know or suspect occurs in the brain?
Most computer languages that I know of are 'finite'-state grammars
but allow the production of arbitrarily complex 'statements'.
(someone correct me if I'm being imprecise.)
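To make that point concrete, here is a toy sketch of the idea: a
right-linear (finite-state) grammar with only two rules whose output
statements are nevertheless unbounded in length. The rules and names
are invented purely for illustration:

```python
import random

# Toy finite-state (right-linear) grammar, invented for illustration:
#   S -> "x"  |  "x + " S
# Finitely many rules, but the set of producible 'statements' is
# unbounded in length as the recursion depth grows.
def generate(max_depth):
    """Emit one statement, recursing at most max_depth times."""
    if max_depth <= 0 or random.random() < 0.3:
        return "x"
    return "x + " + generate(max_depth - 1)

print(generate(5))  # e.g. "x + x + x"
```

Every output is a well-formed statement of the grammar, which is the
sense in which a finite rule set still yields arbitrarily complex
productions.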
This needs a more rigorous explanation and/or citations if I am
to believe this. (I've heard Chomsky speak (on a totally unrelated
political topic) and respect him, but I also know that his perspective
is under some questioning in academic circles currently.)
> [snip] But that doesn't mean the templates come pre-stocked, far from
> it. Input and output require *generative* processes that Chomskyans would
> say exceed any look-up table that we could realistically anticipate (in
But we are not requiring the zombies to "generate" our reality.
We only require them to 'adhere' to it sufficiently that it causes
the reality to follow along paths that are sufficiently accurate
to satisfy the needs of the simulation. I.e., zombies only have
to be 'perceived' to be real by the individuals for whom the
simulation is constructed. The lookup table for "Honey, I love
you" -- "Well darling, I love you too." *isn't* very big!
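A literal table of that sort fits in a few lines. This is just a toy
sketch -- every entry and the fallback phrase are invented examples,
not drawn from any real corpus or model:

```python
# Toy zombie dialogue lookup table (all entries invented).
responses = {
    "Honey, I love you": "Well darling, I love you too.",
    "Good morning": "Good morning to you!",
    "How was your day?": "Oh, the usual.",
}

def zombie_reply(utterance):
    # A vague stock phrase covers anything the table lacks.
    return responses.get(utterance, "Mmm-hmm.")

print(zombie_reply("Honey, I love you"))  # Well darling, I love you too.
```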
> As I understand it, models of cognitive function these days assume at least
> two classes of storage and access: some lookup tables or something like
> them (where the brain checks the index and goes straight to the answer),
> and some constructive imagery procedures where, eg, one wanders around on
> an imaginary map looking for the answer (the time taken being a linear
> measure of the amount of `mental space' walked through).
I think this may be tied to something concrete with spatial
representations of familiar objects. I think it's been determined
that on average you store 3 representations for an object (e.g. a fork).
You then spatially transform the object (presumably in parallel)
and match it against the database of stored images (also presumably
in parallel) to 'recognize' the object as a fork. The degree
to which the object is in an 'uncommon' orientation relative to
the stored images determines the recognition time.
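Here is a toy sketch of that rotate-and-match process -- my own
simplified model of the idea, not an implementation of any specific
published result. The shape, the stored views, and the step size are
all invented; the number of rotation steps tried stands in for
recognition time:

```python
import math

def rotate(points, degrees):
    """Rotate a list of 2-D points about the origin."""
    r = math.radians(degrees)
    return [(x * math.cos(r) - y * math.sin(r),
             x * math.sin(r) + y * math.cos(r)) for x, y in points]

def mismatch(a, b):
    """Total point-to-point distance between two shapes."""
    return sum(math.dist(p, q) for p, q in zip(a, b))

def recognize(seen, stored_views, step=15.0):
    """Try undoing larger and larger rotations until a stored view
    matches; return (angle_found, steps_tried)."""
    steps, angle = 0, 0.0
    while angle < 360.0:
        steps += 1
        probe = rotate(seen, -angle)  # undo a hypothesized rotation
        if any(mismatch(probe, v) < 1e-6 for v in stored_views):
            return angle, steps
        angle += step
    return None, steps

fork = [(0.0, 0.0), (0.0, 2.0), (0.5, 2.0)]   # invented 'fork' shape
angle, steps = recognize(rotate(fork, 45.0), [fork])
# The more 'uncommon' the orientation, the more steps are needed --
# echoing the linear recognition-time effect described above.
```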
> I'm uncertain how Freeman/Hopfield-style attractor models fit into this:
There could quite easily be a feedback loop in that the lookup
of the data "asserts" that it has a good match, which may
encourage a re-forming of the input to better match the
stored data, causing a cascade up to the level of consciousness
that says "yes, indeed, that is a fork". This quite nicely
explains the 'incorrect' recall of witnesses, who match
their 'so-called' memories to their expectations.
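That feedback loop can be caricatured in a few lines. This is a toy
of my own devising -- not Freeman's or Hopfield's actual models -- in
which a degraded input is nudged one bit at a time toward the
best-matching stored pattern until it settles there:

```python
# Toy feedback loop: the lookup asserts its best match, and the
# "feedback" re-forms the input toward that match until it settles.
def settle(noisy, stored, max_iters=10):
    state = list(noisy)
    for _ in range(max_iters):
        # The lookup "asserts" the best-matching stored pattern...
        best = max(stored, key=lambda p: sum(a == b for a, b in zip(p, state)))
        if state == best:
            return state  # cascaded all the way: "yes, that is a fork"
        # ...and one disagreeing bit of the input is re-formed to fit.
        i = next(i for i, (a, b) in enumerate(zip(state, best)) if a != b)
        state[i] = best[i]
    return state

fork_pattern  = [1, 1, 1, 0, 0]   # invented bit patterns
spoon_pattern = [0, 0, 1, 1, 1]
print(settle([1, 0, 1, 0, 0], [fork_pattern, spoon_pattern]))
# The noisy input is pulled onto the 'fork' memory -- much as a
# witness's recollection is pulled toward an expectation.
```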
So perhaps to deal with the zombie instantiation properly
we do not have 'exact' lookup tables, but instead 'fuzzy'
ones that settle on the closest stored match.
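One way to sketch such a fuzzy table, using Python's standard difflib
module for the approximate matching (the table entries and fallback
are invented examples):

```python
import difflib

# A 'fuzzy' lookup table: rather than demanding an exact key, accept
# the closest stored key above a similarity cutoff.
table = {
    "honey i love you": "Well darling, I love you too.",
    "good morning": "Good morning to you!",
}

def fuzzy_lookup(utterance, cutoff=0.6):
    key = utterance.lower().strip("!?., ")
    hits = difflib.get_close_matches(key, table.keys(), n=1, cutoff=cutoff)
    return table[hits[0]] if hits else "Mmm-hmm."

print(fuzzy_lookup("Honey, I love you!"))  # close enough to match
```

The cutoff controls how 'zombie-like' the matching is: lower it and
ever-vaguer utterances still land on a canned response.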