Re: >H RE: Present dangers to transhumanism

Cynthia (cyn386@flash.net)
Thu, 09 Sep 1999 16:58:14 -0700

Doug Jones wrote:

> This is where Eliezer's doubled doubling breaks down- smart AI's might
> optimize their own code, but faster execution requires faster hardware,
> which is tied to a physical realm where Moore's law is difficult to
> shortcut. An AI singularity can't occur unless the time constant for
> hardware improvement and replication can also be shortened.
>
> (Massively parallel processor systems can grow in processing power only
> linearly. Even if more chip fabs are brought on line, it's the rate of
> creation of chip fabs that limits progress.)

Going parallel is the next step, but beyond that there are more steps, such as going analog. Analog isn't flexible, but it sure is fast.
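
To make Doug's point about linear growth concrete, here's a rough back-of-the-envelope sketch (my own illustration, all numbers made up): a single chip whose speed doubles every generation versus a farm that just gains one more fixed-speed chip per generation. The doubling chip pulls away fast; the farm only grows linearly.

    # Illustrative sketch, not anyone's real model: exponential per-chip
    # speedups (Moore's-law-style doubling) vs. linearly adding
    # fixed-speed processors. All numbers are arbitrary.

    GENERATIONS = 10      # hypothetical hardware generations
    BASE_SPEED = 1.0      # arbitrary units of processing power per chip

    doubling_power = BASE_SPEED   # one chip whose speed doubles each generation
    chips = 1                     # a farm that adds one chip per generation

    print(f"{'gen':>3} {'doubling chip':>14} {'parallel farm':>14}")
    for gen in range(1, GENERATIONS + 1):
        doubling_power *= 2                  # exponential: speed doubles
        chips += 1                           # linear: one more chip
        parallel_power = chips * BASE_SPEED
        print(f"{gen:>3} {doubling_power:>14.1f} {parallel_power:>14.1f}")

After ten generations the doubling chip is at 1024 units while the farm is at 11 - which is why the rate of building fabs, not the number of processors you wire together, ends up being the bottleneck.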