I was reading some of the previous debates on the Singularity in the list
archives recently, when it struck me that there is a major factor that does
not seem to have been seriously considered.
Simply put, the more advanced a technology becomes, the more work it takes
to improve it. As technology advances, there is a general tendency for
everything to become more complex, which means more work for the engineers.
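To put rough numbers on the idea, here is a toy calculation in Python. The
doubling ratio and the fixed engineering capacity are invented purely for
illustration; the point is only the shape of the curve:

    # Illustrative only: assume each technology level takes twice the
    # engineering effort of the last (the 2x ratio is invented), while
    # the available engineering capacity stays fixed.
    effort = 1.0        # effort needed to reach the next level
    capacity = 1.0      # effort the engineering base delivers per year
    for level in range(1, 7):
        years = effort / capacity
        print(f"level {level}: ~{years:.0f} year(s) of work")
        effort *= 2.0   # assumed growth in complexity per level

Under those assumptions the time per generation doubles at every level;
only more engineers, better automation, or smarter engineers change the
slope.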
IMO, this principle has several important implications:
First, it means that advanced nanotechnology is not possible without major
breakthroughs in automated engineering and/or intelligence enhancement. Why
not? Well, diamondoid parts might be simple repeating structures, but the
complete machines built from them are not: designing and debugging the
millions of interacting parts in even a basic assembler is more engineering
work than unaided human teams can plausibly deliver.
Second, a self-enhancing AI can't expect to optimize its way into an SI
unless it has SI-level hardware to run on. It might, if it is very lucky,
but it is more likely to proceed in a series of sharp upward jumps separated
by lulls while it waits for faster hardware. You probably need nanotech to
build the computers to run an SI, and you can't design the nanotech unless
you are pretty close to being an SI anyway.
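A toy simulation makes the jumps-and-lulls picture explicit. Everything
here is assumed for illustration: optimization closes half the remaining
gap to a hardware-imposed ceiling each cycle, and each new hardware
generation multiplies that ceiling tenfold:

    # Toy model of hardware-gated self-improvement (all parameters
    # are made up; only the qualitative staircase shape matters).
    capability = 1.0
    ceiling = 10.0      # what the current hardware can support
    step = 0
    for hardware_generation in range(3):
        for _ in range(5):
            # Optimization closes half the remaining gap, so gains
            # stall as the hardware ceiling is approached (the lull).
            capability += 0.5 * (ceiling - capability)
            step += 1
            print(f"step {step}: {capability:9.2f} / ceiling {ceiling:.0f}")
        ceiling *= 10.0  # new hardware arrives: the next sharp jump

Run it and the numbers flatten against each ceiling, then leap when the
ceiling moves: sharp upward jumps separated by lulls.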
Because of these factors, a Singularity is likely to have a slow takeoff.
You may have sudden jumps (when the first sentient AI goes online, or the
first general-purpose assembler goes into operation), but each improvement
simply leads to a new plateau while you wait for the rest of your tech base
to catch up. The open-ended, geometric nature of the critical feedback
loops still points toward acceleration over the long run, but the climb
looks more like a staircase than a single vertical wall.
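The tech-base effect can be sketched the same way: if overall capability is
gated by the least-developed supporting field, a breakthrough in any one
field just produces a plateau until the others catch up. The field names
and growth factors below are invented for illustration:

    # Overall capability gated by the weakest supporting field
    # (fields and the 4x jump size are invented for illustration).
    fields = {"AI": 1.0, "hardware": 1.0, "nanotech": 1.0}
    order = ["AI", "hardware", "nanotech"]
    for year in range(6):
        fields[order[year % 3]] *= 4.0   # one field jumps at a time
        overall = min(fields.values())   # the bottleneck sets the pace
        print(f"year {year}: overall {overall:.0f}  {fields}")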
Billy Brown, MCSE+I
bbrown@conemsco.com