Re: Jaron Lanier Got Up My Shnoz on AI

From: James Rogers (jamesr@best.com)
Date: Wed Jan 16 2002 - 12:46:01 MST


On 1/16/02 11:13 AM, "Steve Nichols" <steve@multisell.com> wrote:
>
> The point is that there ARE NO ALGORITHMS in the brain. This
> "symbolisation" doesn't exist in nature, and in massively parallel
> DISTRIBUTED systems (not transputers or just multiple von Neumann
> processors) no algorithms or programs are fed into the system.

You have a very strange notion of computation. The brain does have
processes, doesn't it? Either you mean something very different from what
you just said, or you aren't making any sense.

 
>> There is literally no practical
>> difference between hardware and software. Obviously, though I don't see it
>> like you do either. Consider the following bits of information:
>
> Not only aren't there any 'programs' in mpd systems (just weight states!)
> but it is also the case that internal representations (of back-propagation
> machines, especially combined wiv vector quantisation & learning
> reinforcement or IAC) cannot be meaningfully analysed. We just have
> input and output results (plus math models of the ARCHITECTURE).

Input and output results are all you need to duplicate ANY finite state
machine. Universal FSM copying algorithms create structures that look
very similar to neural networks, not like the more conventional algorithms
that most people who work with software are familiar with. With "weight
states" and everything. These are programs/algorithms in every sense of the
word; just because you aren't used to seeing algorithms expressed in
something approximating an efficient Kolmogorov form doesn't mean that the
structure isn't an algorithm. I *am* used to looking at algorithms that
have been reverse engineered into these forms, and while they look very
different, proper analysis of them can be learned. (Weighted fuzzy networks
are an efficient approximation when insufficient memory nodes are available
for a lossless copy.)
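To make the "input and output results are all you need" point concrete, here is a minimal sketch (my illustration, not code from the post) of copying a black-box deterministic FSM purely from its observable I/O behaviour. The toy machine `run_box` and the helper names are hypothetical; the copy is a naive prefix-tree of behaviours up to a bounded depth, with no merging of equivalent states, so it is a lossless copy only within that depth.

```python
def run_box(inputs):
    """Black-box FSM: a small 3-state machine we pretend we cannot inspect.
    Returns the output produced for each input symbol in sequence."""
    state, outputs = 0, []
    for sym in inputs:
        outputs.append((state + sym) % 2)   # output function
        state = (state + sym) % 3           # transition function
    return outputs

def copy_fsm(alphabet, max_depth=4):
    """Learn a transition/output table by replaying every short input prefix.
    Each distinct prefix is naively treated as its own state (no merging)."""
    table = {}                              # (state_id, sym) -> (next_id, output)
    access = {(): 0}                        # input prefix -> state id
    frontier = [()]
    while frontier:
        prefix = frontier.pop(0)
        if len(prefix) >= max_depth:
            continue
        for sym in alphabet:
            word = prefix + (sym,)
            out = run_box(word)[-1]         # observe only the last output
            if word not in access:
                access[word] = len(access)
                frontier.append(word)
            table[(access[prefix], sym)] = (access[word], out)
    return table

def replay(table, inputs):
    """Run the copied machine on an input sequence."""
    state, outputs = 0, []
    for sym in inputs:
        state, out = table[(state, sym)]
        outputs.append(out)
    return outputs

table = copy_fsm(alphabet=(0, 1), max_depth=4)
probe = (1, 0, 1, 1)
assert replay(table, probe) == run_box(probe)
```

The resulting `table` is exactly the kind of structure described above: a weighted/lookup network of states rather than anything resembling hand-written source code, yet it is an algorithm in every sense, recovered from nothing but input/output observations.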

You quite apparently misunderstand and underestimate current analytical
capabilities.

> Sorry James, but 'ordinary' von Neumann machines will never be sentient.

I hope you have a rational reason for that assertion.

 
> Forget finite-state hardware ... the distinction between virtual (phantom
> in MVT parlance) and "physical" (either E1-brain or silicon) is crucial
> in this debate since we are discussing 'felt' experience, not computation.
[...snip...]

Occam's razor would be drawing much blood from your theories.

Cheers,

-James Rogers
 jamesr@best.com



This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 13:37:34 MST