Re: Constraints on the "singularity"

Michael Lorrey
Sun, 12 Oct 1997 19:36:41 -0400

Ramez Naam (Exchange) wrote:
> > From: []
> > I use the term "singularity" in deference to Vinge, who is a professor
> > of mathematics. It's only a qualitative analogy, however: the actual
> > simplest equations that describe the simplest qualitative predictions
> > don't actually have mathematical singularities.
> This is an important point that seems to be often ignored in discussions
> of the "singularity". While a simple interpolation and future
> projection of (for example) available computing power may show a
> vertical asymptote, the underlying equations that govern the rise of
> computing power are subject to constraints that may result in a
> flattening of the curve.
> I'd be very interested in seeing (and would write myself, if time and
> knowledge allowed) an analysis of what a post-"singularity" plateau
> might look like, given the most stringent constraints we know of.
> Obviously these constraints are products of our current understanding of
> the world, and thus may be suppressible given some future physics &
> mathematics, but I think it a bit disingenuous to use a term like
> "singularity" when our best current models of the world show bounded (or
> at least, merely exponential) growth.
> There are much wiser heads out there on this list, so let me just start
> off the conversation with a few constraints that I can see (again, given
> our current physics & mathematics):
> Size Constraints: Our current understanding of QM essentially
> constrains the precision with which we can build physical structures (or
> direct energy) via Planck's constant, thus creating an apparent lower
> bound on our engineering ambitions. This in turn has consequences for
> Moore's Law, as to date progress has been made largely via finer and
> finer lithography techniques.
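The distinction drawn above between a projection with a vertical asymptote and underlying dynamics that flatten out can be made concrete with a toy model. This is only an illustrative sketch: the equations (hyperbolic growth dx/dt = kx^2, which has a true finite-time singularity, versus logistic growth, which merely saturates) and all parameters are my own choices, not figures from the post.

```python
# Toy comparison: hyperbolic growth (finite-time singularity) vs.
# logistic growth (constraint-limited), via simple Euler integration.
# All parameters are illustrative round numbers.

def hyperbolic(x0=1.0, k=1.0, dt=1e-4, t_max=2.0):
    """dx/dt = k*x^2 diverges at t = 1/(k*x0): a true singularity."""
    x, t = x0, 0.0
    while t < t_max and x < 1e12:
        x += k * x * x * dt
        t += dt
    return t, x

def logistic(x0=1.0, k=1.0, cap=1e6, dt=1e-4, t_max=30.0):
    """dx/dt = k*x*(1 - x/cap) looks exponential early, then flattens."""
    x, t = x0, 0.0
    while t < t_max:
        x += k * x * (1.0 - x / cap) * dt
        t += dt
    return t, x

t_blowup, x_h = hyperbolic()   # diverges near t = 1
t_end, x_l = logistic()        # levels off near the cap
print(f"hyperbolic: diverged near t={t_blowup:.3f} (x={x_h:.2e})")
print(f"logistic:   at t={t_end:.1f}, x={x_l:.2e} (capped near 1e6)")
```

Both curves are indistinguishable early on; only the long run reveals whether the governing equation has a singularity or a ceiling, which is exactly why extrapolating the early data is unreliable.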

According to a recent post, quantum cellular designs actually operate better the smaller they are: cells could be as small as a few atoms, with a single layer of atoms between cells and 5-10 layers between circuits. They also minimize heat and power consumption, another problem with current chip circuitry. How does this scale relative to current chip circuit sizes?
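To put rough numbers on that scaling question: a back-of-envelope sketch, where the ~0.3 nm atomic diameter, the "few atoms" cell width, and the ~350 nm feature size typical of 1997-era lithography are my assumed figures, not numbers from the post.

```python
# Rough area comparison between a hypothetical few-atom quantum
# cellular cell and a circa-1997 CMOS feature. Sizes are assumptions.
atom_diameter_nm = 0.3        # order-of-magnitude atomic diameter
cell_atoms_across = 5         # "a few atoms" per cell, assumed
cell_nm = atom_diameter_nm * cell_atoms_across  # ~1.5 nm cell
feature_nm = 350.0            # typical lithography feature, ca. 1997

linear_ratio = feature_nm / cell_nm
area_ratio = linear_ratio ** 2   # density gain scales with area
print(f"cell ~{cell_nm:.1f} nm across; linear shrink ~{linear_ratio:.0f}x")
print(f"area density gain ~{area_ratio:.0f}x")
```

Even with these crude assumptions the shrink is two orders of magnitude linearly, so four to five orders of magnitude in device density, before heat and power savings are counted.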

> Speed Constraints: Barring some revolution to Relativity, c limits the
> rate of expansion of even a Vingean Power. In addition it limits the
> speed of signaling within that entity's "brain". Again suggesting
> possible constraints on the rate of computational growth.

Well, what is the speed of signal propagation across the brain, and from one end of the nervous system to the other? Whatever that time lapse is, scaled to c, will give you an estimate of the size of the beast you are contemplating, at least for a human-capacity intelligence. Then again, if we are to take the word of some marine biologists, we should think more on the scale of dolphin or whale nervous systems as baselines.
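That estimate can be run directly. A hedged sketch: the ~100 m/s conduction velocity for fast myelinated nerve fiber and the ~2 m head-to-toe signal path are my assumed round numbers, not figures from the post.

```python
# Back-of-envelope: if a mind tolerates the same end-to-end signalling
# latency as a human nervous system, how big could it be at light speed?
nerve_velocity_m_s = 100.0    # fast myelinated fiber, order of magnitude
path_length_m = 2.0           # head-to-toe signal path, assumed
c_m_s = 299_792_458.0         # speed of light

latency_s = path_length_m / nerve_velocity_m_s       # ~20 ms
light_limited_size_m = c_m_s * latency_s
print(f"human latency ~{latency_s * 1000:.0f} ms")
print(f"same-latency, light-speed 'brain' could span "
      f"~{light_limited_size_m / 1000:.0f} km")
```

On these assumptions a light-speed intelligence with human-equivalent internal latency could span several thousand kilometers, roughly planetary scale, before c becomes the binding constraint.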
> Chaotic Computability Constraints: The most ambitious nanotech
> scenarios posit universal assemblers that can be programmed or designed
> to build specific structures. IMHO this is a clearly chaotic system.
> The structure of any complex object created by this sort of nanotech
> would seem to display an exquisite sensitivity to tiny variations in
> design of the assembler (or the assembler "software"), and possibly to
> the local environment itself. I question whether we'll be able to
> design assembler protocols for very complex objects through any means
> other than trial and error, which have their own pitfalls (e.g., grey
> goo).

Considering that we are now playing with molecular-sized motors, gears, and circuits in the lab, using our rather ham-handed tools that damp our movements down to the scale needed to move individual atoms around: once simple, functional machines are actually built that can run, and can be built repeatedly and reliably, we will have brought the industrial age to the nano scale, and can expect a progression curve there much like our own from the early 1800s onward. The advantage we have now is that we already know how things work best at the macro scale, so it's not as though we have to reinvent everything; we already know what's possible.
> (This is not to say that I'm skeptical of nanotech. I think the more
> prosaic nanotech scenarios have an enormous potential to revolutionize
> our world. If we can design nanomachines to construct individual parts
> of more complex systems, for example, and then use macro-machines to
> assemble them into the finished product, we will have gone a long way
> towards eliminating scarcity.)
> I'm very interested in thoughts from the list on:
> 1) Where a Spike subject to these constraints will begin to level off.

I don't think that we will see any significant leveling off. There will perhaps be short periods of slowing growth, but as established technologies reach their peaks, newer ones will come to the fore as they become economically feasible, like a multistage rocket dropping off spent boosters.

> 2) Just how severe and insurmountable these constraints are. Will
> superstring-engineering allow us to bypass Planck's constant? Will some
> new, higher-level theory of complex systems provide more computationally
> efficient means of simulating chaotic systems? Will quantum computing
> have any benefit here? Can a sufficiently advanced entity construct a
> pocket universe where Planck's constant and c are different from our
> own? Is there any way to communicate between that universe and this
> one?

I would guess that the closer the physical laws in a pocket universe are
to our own, the easier communication will be.
> cheers,
> mez

			Michael Lorrey
------------------------------------------------------------
Inventor of the Lorrey Drive
MikeySoft: Graphic Design/Animation/Publishing/Engineering
How many fnords did you see before breakfast today?