Robin Hanson wrote:
> A simple neo-classical growth model examines the implications
> of machine intelligence, machines which could substitute for,
> rather than complement, human labor. Steady state growth rates
> could easily rise by an order of magnitude or more. But with
> steady growth, wages and per-intelligence consumption could fall
> as fast as computer prices do.
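(For readers who haven't seen the paper: the mechanism Hanson summarizes can be caricatured in a few lines. This is a minimal sketch under my own illustrative assumptions — Cobb-Douglas production, machines as perfect substitutes for human labor, machine prices halving each period — not his actual model, and every parameter value below is made up.)

```python
# Sketch: when machine-workers are perfect substitutes for humans and
# their price falls geometrically, output growth accelerates while the
# competitive wage is dragged down toward the machine price.
# Illustrative assumptions only; NOT Hanson's actual model.

ALPHA = 0.3      # capital's share of output (assumed)
SAVE = 0.3       # savings rate (assumed)
HUMANS = 1.0     # fixed human labor force
BUDGET = 1.0     # fixed per-period spending on machine-workers

capital = 1.0
price = 1.0      # price of one machine-worker
outputs, wages = [], []

for t in range(8):
    machines = BUDGET / price          # machine-workers purchased
    labor = HUMANS + machines          # perfect substitutes for humans
    output = capital**ALPHA * labor**(1 - ALPHA)
    # Competitive wage = marginal product of labor; as cheap machines
    # flood the labor market, the wage falls with the machine price.
    wage = (1 - ALPHA) * output / labor
    outputs.append(output)
    wages.append(wage)
    capital += SAVE * output           # standard accumulation
    price *= 0.5                       # computer prices halve each period
```

Running this, the per-period output growth factor keeps rising while the wage keeps falling — the qualitative result the abstract states.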
I shall leave aside my personal objections, since I think you may find Gubrud's arguments more convincing; they rest on essentially the same set of assumptions as your paper. While Gubrud's main focus is nanotechnology, his paper also deals with the effects of advanced artificial intelligence.
"Nanotechnology and International Security" http://squid.umd.edu/~gubrud/
(From the abstract:)
"Whereas the perfection of nuclear explosives established a strategic stalemate, advanced molecular manufacturing based on self-replicating systems, or any military production system fully automated by advanced artificial intelligence, would lead to instability in a confrontation between rough equals. Rivals would feel pressured to preempt, if possible, in initiating a full-scale military buildup, and certainly not to be caught behind. As the rearmament reached high levels, close contact between forces at sea and in space would give an advantage to the first to strike."
In the event that you've read it already, which seems probable, I would just like to say this: The flip side of rapid progress is incredibly destructive wars. And if you think the currency meltdown is causing global destabilization, the differential equations you blithely toss around would shatter national economies like glass. Since I don't believe a Weak Singularity is probable, I can say dispassionately that the Weak Singularity your paper models would probably end in the violent death of a significant fraction of mankind. The average human might be freed from the "stresses" of the working life and the necessity of making a living, but not to engage in a life of leisure. Judging by the Middle Ages, those who do not work for a living generally occupy themselves with wars of conquest.
--
email@example.com    Eliezer S. Yudkowsky
http://pobox.com/~sentience/AI_design.temp.html
http://pobox.com/~sentience/sing_analysis.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I think I know.