Re: Keeping AI at bay (was: How to help create a singularity)

From: Jim Fehlinger (fehlinger@home.com)
Date: Sun May 06 2001 - 17:36:47 MDT


James Rogers wrote:

> I wrote:

> > Don't matter what kind of software you write, if
> > you don't have big enough iron to run it on. And what, precisely,
> > will constitute "reasonable" hardware for this particular job?

> Not relevant. AI is AI is AI, even if we don't have the hardware to run it
> fast enough to be useful.

I **knew** somebody was going to say that. I picked something
off the soc.motss quote page that's apropos:

In theory, there is no difference between theory and
practice. In practice, there is.
 -- Muffy Barkocy

However, the argument's gone a bit off the rails here, in that speed
isn't the **whole** story. Your original question was (paraphrasing)
"what the hell does high-plasticity, or evolvable, hardware have to do with
making an AI?". Here's my take on this (my standard disclaimers apply here --
I don't have Eliezer Yudkowsky's brains, Eugene Leitl's degrees, or
Damien Broderick's seriousness of purpose in keeping up with the field in
order to write books about it [not to imply that Eugene and Damien don't have
brains, or that Eugene and Eliezer don't have seriousness of purpose, or...
well, you get the idea!]).

**If** you buy the selectionist arguments of Edelman, Changeux, et al.
about the basis of intelligence in animal nervous systems (maybe
you do, maybe you don't, or maybe you haven't thought about it; but
that's where I'm coming from, just so you know ;->), then there's something
special going on in biological brains (and bodies; they're really not
separable in this model) that cannot be simplified away (and it's nothing
mystical -- Edelman himself pooh-poohs Penrose's invocation of quantum
mechanics, or what Edelman calls "exotic physics"). This "something
special" -- competition and selection among the members of very large and
stochastically-varying sets of active elements (read the authors for
a better description) is probably accomplished directly in "hardware"
in animal brains. As much **stuff** as there already is in the human
brain, if you're going to replace this squishy hardware substrate with
what Eliezer would (figuratively) call a more "crystalline" hardware substrate,
and **simulate** all that dynamic whoop-de-do in software (utilizing
some source of randomness to provide the variability among
the competing elements -- this is what makes it different from
a Turing machine, sez Edelman), then that more regular and static
hardware substrate is going to have to have even **more** "stuff" in
it (I don't know how many orders of magnitude, but more than one,
I'd guess) than the already-stuffed biological brain does (just like
a CD player has to have more stuff in it to do basically the same
thing that an analog LP record player does).
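To make the mechanism concrete, here's a toy sketch of the selectionist loop described above (my framing, not Edelman's actual model, and the fitness task, population size, and jitter are all made-up illustrative parameters): a large, stochastically-varying population of "active elements" competes, selection amplifies the winners, and an explicit randomness source keeps reinjecting the variability. The point of the example is the *shape* of the computation the hardware would have to support, not any claim about biological realism.

```python
# Toy selectionist loop: competition + selection over a stochastically
# varying population of elements, with an explicit source of randomness.
# (Hypothetical illustration -- the task, parameters, and names are mine.)
import random

random.seed(42)  # stand-in for the "source of randomness"

TARGET = 0.7  # arbitrary made-up task: elements compete to approximate this


def fitness(element):
    # Higher is better; elements closer to TARGET win the competition.
    return -abs(element - TARGET)


def generation(population, keep=0.2, jitter=0.05):
    # Selection: rank by fitness, keep only the top fraction.
    survivors = sorted(population, key=fitness, reverse=True)
    survivors = survivors[: max(1, int(len(population) * keep))]
    # Stochastic variation: refill the population with noisy copies
    # of the survivors, so the next round has fresh variability.
    children = [s + random.gauss(0, jitter)
                for s in survivors
                for _ in range(len(population) // len(survivors))]
    return (survivors + children)[: len(population)]


population = [random.uniform(0, 1) for _ in range(200)]
for _ in range(30):
    population = generation(population)

best = max(population, key=fitness)
print(round(best, 2))
```

Note that every competitor here is simulated serially in software; the selectionist claim is that biological brains run the whole population in parallel, directly in the wet substrate -- which is where the extra orders of magnitude of "stuff" come in.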

> I don't think you understand the problem fully...

See above.
 
> > It ain't gonna be silicon, I can tell you that...
>
> What then?

I don't really know. Eugene Leitl probably has a more informed
opinion about this. Fullerene nanotubes have been getting a
lot of press lately. Diamondoids out of Drexler have been
mentioned. Maybe biological tissue -- if it can be immortalized,
though I know some sort of (literally) crystalline gizmo has more
psychological appeal for a lot of body-hating techno-nerds
(like me!) -- like those crystals that everything on the
planet Krypton is made out of in the Superman movie (I
just bought the DVD ;->). It would be nice to be able to
use something that can survive harsher environments (like
deep space) better than biological tissue can, and also
something that could be made to operate at higher speeds.

Jim F.



This archive was generated by hypermail 2b30 : Mon May 28 2001 - 10:00:03 MDT