Re: "analog computer" = useless hypothesis?

From: Ken Clements (Ken@Innovation-On-Demand.com)
Date: Sat Mar 31 2001 - 22:17:10 MST


Jim Fehlinger wrote:

> These are extraordinarily subtle issues, which the most brilliant
> minds in the world have yet to come to grips with completely.
> There's no point in trying to divide this list, or any
> other realm of discourse, into tribes of "analogists" and
> "digitalists" and then picking sides.
>

Well put. This is one of those differences that does *not* make a difference.
Many times over the years I have had to remind my digital design engineers that
the circuits they are creating only look digital within a bounded set of
circumstances, and that without careful controls, the manufacturing guys can
screw up those circumstances. Then I have also had to remind my linear circuit
designers that electrons can cross transistors one at a time, and that very small
signals will be mixed with shot noise, a random quantum effect. All of
circuit theory is a special case of Maxwell's equations, and my physicist friends
keep reminding me that those equations only hold on a scale big enough to save me
from sitting there summing Feynman diagrams.
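
To put a number on that, here is a quick Python sketch (my own toy model,
nothing from a real design) of how Poisson-distributed shot noise blurs a
one/zero threshold decision once each bit is carried by only a handful of
electrons:

# Toy model (illustrative numbers only): shot noise is Poisson, so a
# threshold decision made on a count of N electrons has a signal-to-noise
# ratio growing only as sqrt(N) -- starve the gate of electrons and the
# "digital" decision turns unreliable.
import numpy as np

rng = np.random.default_rng(0)

def error_rate(mean_electrons, trials=100_000):
    # A logical one delivers mean_electrons on average, a logical zero
    # half that; the threshold sits midway between the two means.
    ones = rng.poisson(mean_electrons, trials)
    zeros = rng.poisson(mean_electrons / 2, trials)
    threshold = 0.75 * mean_electrons
    errors = np.sum(ones < threshold) + np.sum(zeros >= threshold)
    return errors / (2 * trials)

for n in (10, 100, 1000):
    print(f"{n:5d} electrons per bit -> error rate {error_rate(n):.4f}")

At ten electrons per bit the gate is wrong nearly a fifth of the time; by a
thousand it is effectively perfect, which is all "digital" ever means.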

Another problem is that most people assume that analog equals continuous. This
is not necessarily true (Eugene pointed out that we have yet to find a way to
make it true). I can have a system that represents, say, LP groove width, with an
analog signal having discrete allowable voltage levels. When the groove gets
wider, the voltage gets higher; just because it moves in steps (quantized) does
not mean it is digital. If it did, we would have to argue about what step size
crosses the line. If I connect two such quantized analog signals into a
multiplying amplifier, it will still give me a quantized analog output, but it
may have to make some internal roundoff decisions that are of a digital nature if
it does not have the dynamic range to preserve all the information.
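
Here is a minimal Python sketch of that quantized-analog multiplier (the step
size and dynamic range are made-up numbers, just for illustration):

# Toy quantized-analog signal: the voltage *is* the quantity it
# represents, merely snapped to a grid of allowable levels.
STEP = 0.1     # spacing between allowable voltage levels (assumed)
V_MAX = 10.0   # amplifier output ceiling, i.e. its dynamic range (assumed)

def quantize(v):
    # Snap a voltage to the nearest allowable level.
    return round(v / STEP) * STEP

def multiply(a, b):
    # Multiplying amplifier: quantized analog in, quantized analog out.
    # Re-snapping the product to the grid (and clipping at V_MAX) is the
    # internal roundoff decision "of a digital nature" described above.
    return quantize(min(a * b, V_MAX))

a, b = quantize(0.7), quantize(0.3)
print(f"{multiply(a, b):.1f}")   # 0.7 * 0.3 = 0.21, rounded back to 0.2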

Contrast the example above with a device that is more clearly on the digital
side. Here I read a stream of ones and zeros that encode a signal level. As the
signal becomes stronger and weaker, the pattern of ones and zeros changes, but
the one and zero voltage levels do not. If I want to multiply two of these values
together, I can send the patterns to a digital device that may make the same
roundoff decisions as above, but in this case it could also expand the length of
the patterns and preserve all the information (presuming I can trade off time or
have a means to increase bandwidth).
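
In the same toy Python, the digital version just grows the pattern instead of
rounding:

def digital_multiply(a_bits: str, b_bits: str) -> str:
    # An n-bit value times an m-bit value always fits in n + m bits,
    # so expanding the output pattern preserves all the information --
    # the cost is more bits, i.e. more time or more bandwidth.
    a, b = int(a_bits, 2), int(b_bits, 2)
    width = len(a_bits) + len(b_bits)
    return format(a * b, f"0{width}b")

print(digital_multiply("11111111", "11111111"))  # 255 * 255 = 65025, all 16 bits kept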

Likewise, digital does not necessarily mean mathematically perfect. If a chip
salesman came to me and tried to sell me gates that gave the mathematically
correct result only 75% of the time, I would show him the way to the door; but if
on the way to the door he told me that they were cheap and 1000 times faster than
any others, I would stop and get out my checkbook. It is well known that reliable
systems can be made out of unreliable parts, which starts to make them look like
systems of analog stuff again.
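
A quick Python sketch (assuming the hypothetical 75%-correct gate above) shows
how fast plain majority voting cleans things up:

import random

random.seed(0)

def flaky_and(a, b, p_correct=0.75):
    # The salesman's gate: correct only 75% of the time, flipped otherwise.
    correct = a & b
    return correct if random.random() < p_correct else correct ^ 1

def voted_and(a, b, copies=15):
    # von Neumann-style redundancy: run many unreliable copies of the
    # same gate and take a majority vote.
    votes = sum(flaky_and(a, b) for _ in range(copies))
    return 1 if votes > copies // 2 else 0

trials = 100_000
errors = sum(voted_and(1, 1) != 1 for _ in range(trials))
print(f"voted error rate: {errors / trials:.4f}")  # ~2% instead of 25%

Note that the vote itself is a threshold on an accumulated quantity, which is
exactly the analog-looking behavior I mean.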

F. Dyson's argument about having plenty of effectively continuous state levels
near the end of the Universe doesn't do much for me, because (1) I do not believe
any projections past the Singularity, and (2) he has to let his system get
arbitrarily large to get the states, which means that speed-of-light restrictions
would cause computation to grind to a virtual halt anyway. As to the
noncomputable numbers argument, if the result of solving some equation is Pi,
Macsyma or Mathematica is going to tell me the answer is Pi, or give me an
infinite series to sum for it. Works for me.
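
For example (using SymPy as a present-day stand-in for Macsyma or Mathematica):

# SymPy standing in for the symbolic systems named above: the system
# hands back the exact symbol pi, not some finite decimal approximation.
from sympy import symbols, integrate, oo, pi

x = symbols('x')
result = integrate(1 / (1 + x**2), (x, -oo, oo))
print(result)            # pi -- exact and symbolic
print(result == pi)      # True
print(result.evalf(30))  # a decimal expansion to any finite precision, on demand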

The main point is that the analog/digital difference only makes a difference in
special cases, and in the big picture does not get you anything useful. So go
ahead, J. R., add it to the list.

-Ken


