According to a recent post, quantum cellular designs actually operate better the smaller they are: cells could be as small as a few atoms, with a single layer of atoms between cells and 5-10 layers between circuits. They also minimize heat and power consumption, another problem with current chip circuitry. How does this scale against current chip circuit sizes?
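A quick back-of-envelope answer to that scaling question. All of the numbers here are rough assumptions for illustration (interatomic spacing, a "few atoms" per cell, and a mid-1990s CMOS feature size), not measured figures:

```python
# Back-of-envelope comparison of a quantum cellular automata cell
# with a conventional chip feature.  All numbers are rough
# assumptions for illustration only.

ATOM_SPACING_NM = 0.25              # typical interatomic distance, ~0.25 nm
qca_cell_nm = 4 * ATOM_SPACING_NM   # a cell "a few atoms" across, ~1 nm
cmos_feature_nm = 250.0             # assumed mid-1990s CMOS feature size

linear_ratio = cmos_feature_nm / qca_cell_nm
area_ratio = linear_ratio ** 2      # density gain goes as the square

print(f"QCA cell:      {qca_cell_nm:.2f} nm")
print(f"CMOS feature:  {cmos_feature_nm:.0f} nm")
print(f"Linear shrink: {linear_ratio:.0f}x, area density gain: {area_ratio:.0f}x")
```

On these assumptions the linear shrink is a couple of hundred times, so the density gain per layer is on the order of tens of thousands.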
> Speed Constraints: Barring some revolution to Relativity, c limits the
> rate of expansion of even a Vingean Power. In addition it limits the
> speed of signaling within that entity's "brain". Again suggesting
> possible constraints on the rate of computational growth.
Well, what is the speed of chemical reactions across the brain, and from one end of the nervous system to the other? Whatever that time lapse is, scaled to c, it will give you an estimate of the size of the beast you are contemplating, at least for human-capacity intelligence. Then again, if we are to take the word of some marine biologists, we should think more on the scale of dolphins' or whales' nervous systems as baselines.
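Here is a rough version of that estimate: take the signal latency across a biological brain, then ask how large a structure could be if its internal signals travelled at c with the same latency. The conduction speed and brain size below are order-of-magnitude assumptions:

```python
# Scale neural signal latency up to c, as suggested above.
# Numbers are order-of-magnitude assumptions, not measurements.

C = 3.0e8            # speed of light, m/s
NERVE_SPEED = 100.0  # fast myelinated axon, ~100 m/s
BRAIN_SIZE = 0.15    # human brain diameter, ~15 cm

latency_s = BRAIN_SIZE / NERVE_SPEED  # one-way signal time across the brain
equivalent_size_m = C * latency_s     # c-limited "brain" with the same latency

print(f"Neural latency:            {latency_s * 1e3:.1f} ms")
print(f"Equivalent c-limited size: {equivalent_size_m / 1e3:.0f} km")
```

On these assumptions a light-speed-limited entity could be hundreds of kilometers across and still match the internal signal latency of a human brain.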
> Chaotic Computability Constraints: The most ambitious nanotech
> scenarios posit universal assemblers that can be programmed or designed
> to build specific structures. IMHO this is a clearly chaotic system.
> The structure of any complex object created by this sort of nanotech
> would seem to display an exquisite sensitivity to tiny variations in
> design of the assembler (or the assembler "software"), and possibly to
> the local environment itself. I question whether we'll be able to
> design assembler protocols for very complex objects through any means
> other than trial and error, which have their own pitfalls (e.g., grey
> goo).
Considering that we are now playing with molecular-sized motors, gears, and circuits in the lab, with our rather ham-handed tools that damp our movements down to the scale needed to move actual atoms around: once simple, functional machines are actually built that run, and can be built repetitively and reliably, we will have brought the industrial age to the nano scale, and can expect a progression curve there much like our own from the early 1800s onward. The advantage we now have is that we already know how things work best at the macro scale, so it's not like we have to totally reinvent anything; we already know what's possible.
> (This is not to say that I'm skeptical of nanotech. I think the more
> prosaic nanotech scenarios have an enormous potential to revolutionize
> our world. If we can design nanomachines to construct individual parts
> of more complex systems, for example, and then use macro-machines to
> assemble them into the finished product, we will have gone a long way
> towards eliminating scarcity.)
> I'm very interested in thoughts from the list on:
> 1) Where a Spike subject to these constraints will begin to level off.
I don't think that we will see any significant leveling off. There will perhaps be short periods of slowing growth, but as established technologies reach their peaks, newer ones will come to the fore as they become economically feasible, like a multistage rocket dropping off spent stages.
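That multistage-rocket picture can be sketched as a succession of S-curves: each technology follows its own logistic growth curve, and overall capability is their sum, so when one saturates the next takes over. All parameters below are arbitrary illustrations:

```python
# Toy model: total capability as a sum of successive logistic
# (S-shaped) technology curves.  Parameters are arbitrary.
import math

def logistic(t, midpoint, ceiling, rate=1.0):
    """Capability contributed by one technology at time t."""
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

# three successive technologies, each an order of magnitude bigger
stages = [(10, 1.0), (25, 10.0), (40, 100.0)]

for t in range(0, 55, 5):
    total = sum(logistic(t, mid, cap) for mid, cap in stages)
    print(f"t={t:2d}  capability={total:8.2f}")
```

The printed totals keep rising throughout; each individual curve flattens, but the aggregate shows only brief slowdowns between stages rather than a lasting plateau.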
> 2) Just how severe and insurmountable these constraints are. Will
> superstring-engineering allow us to bypass Planck's constant? Will some
> new, higher-level theory of complex systems provide more computationally
> efficient means of simulating chaotic systems? Will quantum computing
> have any benefit here? Can a sufficiently advanced entity construct a
> pocket universe where Planck's constant and c are different from our
> own? Is there any way to communicate between that universe and this
> one?
I would guess that the closer the physical laws in a pocket universe are
to our own, the easier communication will be.
--
TANSTAAFL!!!
Michael Lorrey
------------------------------------------------------------
mailto:firstname.lastname@example.org
Inventor of the Lorrey Drive
MikeySoft: Graphic Design/Animation/Publishing/Engineering
------------------------------------------------------------
How many fnords did you see before breakfast today?