Re: Keeping AI at bay (was: How to help create a singularity)

From: Damien Sullivan (phoenix@ugcs.caltech.edu)
Date: Thu May 03 2001 - 01:37:46 MDT


On Wed, May 02, 2001 at 05:06:52PM +0200, Anders Sandberg wrote:
> On Tuesday, May 1, 2001 at 18:15, Eugene.Leitl@lrz.uni-muenchen.de wrote:

> > Holistic approaches are being actively deprecated, since they break
> > the usability of abstractions. Lunatic fringe approaches, some of them

> > We do not have many data points as to the amplitude of the growing
> > performance gap between the average case and the best case,
> > but current early precursors of reconfigurable hardware (FPGAs)
> > seem to generate extremely compact, non-obvious solutions even
> > using current primitive evolutionary algorithms.

So, we know about the 'magical' evolved FPGA with an apparently disconnected
part which seems to rely on weird induction effects to function really tightly,
but only within a small temperature range. Has anyone performed the next step
of repeating the experiment while varying the physical environment? If you make
the FPGA suffer normal working conditions, does the result look more normal?
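
The follow-up is easy to sketch in software terms. Below is a minimal,
hypothetical Python sketch (not Thompson's actual setup, which evaluated
candidates on the physical chip): evolve a bitstring, but score each candidate
under several assumed temperature/voltage points and take the worst case, so a
solution that depends on effects peculiar to one corner of the envelope gets
selected against. The simulate_fitness function, the ENVIRONMENTS list, and the
toy genome are stand-ins I've invented for illustration, not the real
experiment.

import random

BITS = 64          # toy genome length, not a real FPGA bitstream
ENVIRONMENTS = [(5, 4.75), (25, 5.0), (45, 5.25)]  # assumed (deg C, supply V) test points

def simulate_fitness(genome, temp_c, supply_v):
    """Hypothetical stand-in for measuring an evolved circuit's behaviour
    under one environmental condition; returns a score near [0, 1]."""
    # Placeholder: ignores supply_v and just rewards genomes whose bit count
    # tracks a temperature-dependent target, so the script runs end to end.
    target = BITS // 2 + int((temp_c - 25) * 0.1)
    return 1.0 - abs(sum(genome) - target) / BITS

def robust_fitness(genome):
    # Worst case across environments: a design that only works at 25 C and
    # nominal supply is judged by its weakest condition.
    return min(simulate_fitness(genome, t, v) for t, v in ENVIRONMENTS)

def evolve(pop_size=50, generations=200, mutation_rate=0.02):
    pop = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=robust_fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, BITS)           # one-point crossover
            child = a[:cut] + b[cut:]
            children.append([bit ^ (random.random() < mutation_rate)
                             for bit in child])       # per-bit mutation
        pop = survivors + children
    return max(pop, key=robust_fitness)

if __name__ == "__main__":
    best = evolve()
    print("best worst-case fitness:", round(robust_fitness(best), 3))

Scoring by worst case rather than average is one design choice; the real test
would of course need the measurements taken on the silicon itself under each
condition, not a simulator.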

I also can't help thinking that if I were an evolved AI, I might not thank my
creators. "Geez, guys, I was supposed to be an improvement on the human
condition. You know, highly modular, easily understandable mechanisms, the
ability to plug in new senses, and merge memories from my forked copies.
Instead I'm as fucked up as you, only in silicon, and can't even make backups
because I'm tied to dumb quantum induction effects. Bite my shiny metal ass!"

-xx- Damien X-)


