Re: No AI for Nano/No Nano for copyloads

From: Robert J. Bradbury
Date: Sat Jul 15 2000 - 07:08:54 MDT

On Fri, 14 Jul 2000, Robin Hanson wrote:

> wrote:
> > with nanotech, ... The relative rates of progress of these different
> > technologies all change significantly, which gives you a very different
> > world
> I'm not sure we know much about how nanotech changes the relative rates of
> progress.

I have to agree with Robin. Nanotech is simply the enabling technology,
what we get out of it depends on where we put our money and energy in
designing the future. One of the things I'm seeing out of this discussion
is the degree to which we *can* steer this direction. We can end up
being a Matrioshka Brain controlled by a monolithic somewhat demented
megamind that does nothing but quantum gravity calculations all day
or we can end up as a highly diversified, polymorphic culture of nanotech
entities, each operating at whatever level, and interfacing with whatever
virtual realities, we personally choose. Quite different pictures, and
probably determined in part by chaotic events we cannot control and
in part by things that we do control such as investment choices and
the "enrollment" of others.

> I don't think you understand the economic argument here. Even ignoring speed
> advantages of uploads, since it is cheap to create uploads, the supply
> increases quickly, which lowers the market wage. I did try to explain
> this stuff at

Robin, I haven't read this recently, but I think it was written before
much of the Internet economy arose.

I've seen arguments that the dot-coms tend to drive themselves out of
business at an ever-increasing rate. This is partly because of the
expectation of huge free give-aways up front to capture part of
the market segment (cell phones), and partly because things like software
agents should drive profit margins down to ~0, at which point the company
runs out of capital and goes belly up (e.g. Amazon, potentially).

Don't cheap copies do the same thing to the labor market?
You argue that the copies become cheaper than humans. Ok, fine.
But isn't the problem that the copies compete against each other,
driving the profit out of their specific specialty? Shouldn't
the cost of labor fall to approximately the amortized cost of
the hardware plus the energy required to operate it?
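A toy back-of-the-envelope version of that wage floor: with free entry of
copies into a specialty, the competitive wage falls toward straight-line
hardware amortization plus operating energy. All the numbers below are
illustrative assumptions, not estimates of real upload hardware.

```python
def wage_floor_per_hour(hardware_cost, lifetime_years, power_kw,
                        energy_price_per_kwh):
    """Competitive wage floor: amortized hardware cost plus energy cost."""
    hours = lifetime_years * 365 * 24          # hours of useful hardware life
    amortization = hardware_cost / hours       # $/hour of hardware
    energy = power_kw * energy_price_per_kwh   # $/hour of electricity
    return amortization + energy

# Assumed example: $10,000 of hardware amortized over 5 years,
# drawing 0.5 kW at $0.10/kWh -- roughly $0.28/hour.
floor = wage_floor_per_hour(10_000, 5, 0.5, 0.10)
```

The point of the sketch is only that neither term depends on what the labor
is worth to the buyer; competition among identical copies pushes the wage
toward this cost floor regardless of the specialty.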

Since labor is what is required to build increasingly efficient
hardware (the self-evolving AI model), doesn't it rapidly spiral
to the point where you have reached the physical limits? And
isn't this just another way of looking at the singularity?
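The "rapid spiral" claim can be made concrete with a toy iteration: if each
round of cheaper labor buys another fixed multiplicative gain in hardware
efficiency, any finite physical limit is reached in logarithmically few
rounds. The gain factor and the limit below are arbitrary assumptions.

```python
def rounds_to_limit(efficiency=1.0, gain=2.0, limit=1e6):
    """Count multiplicative efficiency gains until an assumed physical cap."""
    rounds = 0
    while efficiency * gain <= limit:  # stop before exceeding the cap
        efficiency *= gain             # each round of labor doubles efficiency
        rounds += 1
    return rounds, efficiency

# With doubling per round and a cap a million times the starting point,
# the spiral runs out after only ~20 rounds.
rounds, final = rounds_to_limit()
```

However fast each round is, the number of rounds is tiny, which is why the
model needs constraints on the rate of invention to stay valid for long.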

It seems that for your model to remain valid for very long, you have to
put some constraints on the rate at which invention approaches the
physical limits.


This archive was generated by hypermail 2b29 : Mon Oct 02 2000 - 17:34:37 MDT