Re: Jaron Lanier Got Up My Shnoz on AI

From: James Rogers
Date: Fri Jan 18 2002 - 09:58:44 MST

On 1/17/02 8:05 AM, "Steve Nichols" wrote:
> But the brain processes are nothing like von Neumann/ finite state computers!
> For a start the synapses can rewire (in hardware, infinite-state!) in response
> to situations & problems. Do you deny this? And what structures in the brain
> resemble anything like CPU or RAM chips? Not to be found, because brains and
> brain processes are nothing like ordinary computers (doh).

The human brain doesn't have to operate like modern silicon, only be
mappable to it. There are some basic mathematical criteria for what can and
can't be mapped to silicon, and the brain certainly appears to satisfy them.
It is utterly irrelevant that it uses very different hardware, because the
results are all that matter.

>> Input and output results are all you need to duplicate ANY finite state
>> machinery. Universal FSM copying algorithms create structures that look very
>> similar to neural networks, not like the more conventional algorithms that
>> most people who work with software are familiar with. With "weight states"
>> and everything.
> What about hidden units????????

Hidden units are again irrelevant. If they don't modify the output, then
they are computationally null. If they ARE modifying the output, then they
will be modeled automatically just like everything else. What is the value
of a hidden unit if it has no measurable impact on the output?

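A toy sketch of the point (my illustration, not from the post): a hidden
unit whose output never reaches the rest of the network is invisible to
any input/output measurement, so removing it changes nothing.

```python
# Illustrative only: a tiny threshold network with a "hidden unit"
# whose outgoing weight is zero, making it computationally null.

def step(x):
    """Simple threshold activation."""
    return 1 if x > 0 else 0

def net_with_hidden(a, b):
    h = step(a + b)              # hidden unit fires...
    return step(a + b + 0 * h)   # ...but its outgoing weight is zero

def net_without_hidden(a, b):
    return step(a + b)

# Identical input/output behavior on every possible binary input:
for a in (0, 1):
    for b in (0, 1):
        assert net_with_hidden(a, b) == net_without_hidden(a, b)
```

If the hidden unit DID affect the output, that effect would show up in the
input/output mapping and so would be captured by any model built from it.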
> All serial computers can do is model or simulate mpd function ... the
> lock-step mechanism cannot be replicated exactly in serial because in parallel
> all nodes are updated AT ONCE, this is why it is called parallel, whereas
> lock-step simulation in serial just does one process at a time, linearly. And,
> again, what about hidden units?????

There are an infinite number of algorithms that can implement any process.
Inputs mapping identically to outputs is the definition of an equivalent
algorithmic form. Hardware and software architecture are irrelevant.

Incidentally, a serial model of a parallel process checkpoints at exactly
the same states as the truly parallel version. The only difference between
parallel and serial in this case is how the state-transition steps are
computed. Therefore, the results are computationally indistinguishable. It
may go against whatever intuition you have, but if you actually study the
problem you'll find that parallel and serial computation are
computationally equivalent and interchangeable.
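The standard trick is double buffering: read only the OLD state while
writing the NEW state, and a plain serial loop reproduces the "all nodes
updated at once" semantics exactly. A sketch with a made-up update rule
(each node becomes the XOR of its ring neighbors; the rule is arbitrary,
the point is the buffering):

```python
# Serial simulation of a lock-step parallel update via double buffering.
# Every new value is computed from the old state only, so the serial loop
# checkpoints at exactly the state a truly parallel update would reach.

def parallel_step(state):
    n = len(state)
    # Build the new state entirely from the old one; no node ever sees a
    # neighbor's already-updated value.
    return [state[(i - 1) % n] ^ state[(i + 1) % n] for i in range(n)]

state = [1, 0, 0, 1, 0]
for _ in range(10):
    state = parallel_step(state)   # one synchronous "tick", done serially
```

Updating the list in place instead would let later nodes see earlier
nodes' new values, which is the only way a serial version can diverge, and
the double buffer rules that out by construction.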

> Not only just very different, but nothing like! You can't go the other way and
> extract the "algorithm" being used by brains because the brain changes its
> wiring, not just weight states. Also, how can u reproduce graceful degradation
> in serial? If the bit stream is disrupted u r f**c**d!

Again, the computational process is irrelevant. Equality of outcome
demonstrates equivalence.

> All you can get from improved serial is more speed ... but speed of
> calculation has absolutely zilch to do with sentience/ felt experience. My old
> z80 is slower by a magnitude than a P4, but is not a jot more or less
> sentient!

While I'm not necessarily disagreeing with you, can you prove that your P4
isn't more sentient than your Z80? What are your criteria?

Silicon is just the computational machinery; the software is far more
important. The human brain is what would be considered software implemented
in hardware. It doesn't have to be that way, but nature apparently has
difficulty evolving proper von Neumann machines.

> I await your (or anyone's) manifesation of a finite-state sentient computer
> with baited breath (yawn), but meantime will carry on with my own work.

Good for you. As Eugene would say, nothing beats a killer demo.

> And by
> the way, I don't think that standard neural nets as they exist now are
> sufficient for sentience either, though are a necessary part (they are from
> work on reverse engineering of the brain after all).

I'm not a big fan of neural networks. They are inefficient approximations
of generalized forms as apparently implemented in the brain. Worse yet, our
software implementations are bad approximations of the wetware.


-James Rogers

This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 13:37:35 MST