Ken Clements wrote:
> I wrote:
> > The analog/digital dichotomy, both at the macroscopic level of
> > actual physical computers... of the present day, and at the
> > hypothesized "quantum foam" level of reality, is a paradoxical one.
> > The "digital" nature of any real machine is an idealization, an
> > abstraction -- if you hook up an oscilloscope to the innards of a
> > computer, you see things happening at the **analog** level -- square
> > waves aren't perfectly square; they have [non-zero] rise and fall
> > times, etc. At the quantum level, you have things taking
> > on "digital", quantized, discrete states, but the closer
> > you get to something the harder it is to be certain which
> > of those states it's in, which brings back the "analog"
> > element in terms of continuous probability distributions.
> > These are extraordinarily sub[t]le issues, which the most brilliant
> > minds in the world have yet to come to grips with completely.
> > There's no point in trying to divide this list, or any
> > other realm of discourse, into tribes of "analogists" and
> > "digitalists" and then picking sides.
> Well put. This is one of those differences that does *not* make a difference.
> Many times over the years I have had to remind my digital design engineers that
> the circuits they are creating just look digital within a bounded set of
> circumstances, and that without careful controls, the manufacturing guys can
> screw up those circumstances. Then I have also had to remind my linear circuit
> designers that electrons can cross transistors one at a time, and that very small
> signals will be mixed with shot noise that is a random quantum effect. All of
> circuit theory is a special case of Maxwell's equations, and my physicist friends
> keep reminding me that those equations only hold on a scale big enough to save me
> from sitting there summing Feynman diagrams.
> The main point is that the analog/digital difference only makes a difference in
> special cases, and in the big picture does not get you anything useful. So go
> ahead J. R., add it to the list.
Well, whether or not reality is "at bottom" analog or digital (if that
distinction even makes any sense), or whether the whole universe is
actually a Turing-machine program running on, in Eliezer's words, "one
honkin' big computer" (with simulated randomness provided either by a very
sophisticated pseudo-random number generator, or by a **very** long list of
numbers recorded from some source of natural randomness in the parent universe),
there is one salient practical difference in the real world between analog
and digital approaches to technological problems.
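(As an aside, the two ways of faking randomness mentioned above -- a seeded
pseudo-random generator versus a long recorded tape of natural noise -- can
be sketched in a few lines of Python. The class names here are purely
illustrative, not anything from the original discussion:)

```python
import random

class PseudoRandomSource:
    """Option 1: a deterministic PRNG, seeded once by the simulator."""
    def __init__(self, seed):
        self._rng = random.Random(seed)  # replaying the same seed replays the universe

    def next_bit(self):
        return self._rng.getrandbits(1)

class RecordedSource:
    """Option 2: replay a (very long) list of numbers captured from some
    natural noise source in the parent universe."""
    def __init__(self, tape):
        self._tape = iter(tape)

    def next_bit(self):
        # Raises StopIteration when the recorded tape runs out.
        return next(self._tape)

# Both look identical from inside the simulation:
prng = PseudoRandomSource(seed=42)
tape = RecordedSource([1, 0, 0, 1, 1, 0])
bits_a = [prng.next_bit() for _ in range(6)]
bits_b = [tape.next_bit() for _ in range(6)]
```

The point of the sketch is that no observer consuming `next_bit()` can tell,
from the outputs alone, which kind of source sits behind it.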
It generally takes a lot more **stuff** to serve any given technological
need by building a digital solution than by building an analog
one. To take another consumer electronics illustration -- an LP-based
stereo system has a signal transducer consisting of a stylus and tiny
magnet at opposite ends of a cantilever, with the magnet wiggling near
some coils of wire (or vice versa, with the cantilever connected to coils
wiggling near a fixed magnet), together with a handful
of transistors (or vacuum tubes!) used as linear devices, providing signal
amplification. By contrast, a CD player employs literally millions upon
millions of transistors, used as switches in a digital system, to
achieve the same functional result. It's only the spectacular advances
in large-scale device integration and the economies of scale achieved
by huge manufacturing plants churning out millions of such devices
that have made the digital solution cheap enough to be practical. Similarly,
the phone system started out analog, and only in recent decades has
moved to digital. In some realms, the cost barriers are still high --
ask a serious photographer if ve thinks sub-$1,000 digital cameras are
ready for prime time, and you'll probably get a demurral; ask again
if a $10,000 professional digital camera might be acceptable, and you
might get an answer along the lines of "yeah, but for that money I'll
stick with my 35-mm film camera, thank you very much."
If, as Edelman and others insist, the capabilities of biological brains
cannot be explained without taking into consideration the staggeringly
rich variability of their biological construction -- the squirting
chemicals of synaptic junctions and the writhing pseudopods of neural
growth cones, or what McCrone calls in _Going Inside_ all that "organic
squalor", then simulating **that** level of biological reality using
crystalline arrays of comparatively simple, regular, and repetitive digital
processing elements really will take a Drexlerian revolution in the
miniaturization of digital computers. If the biological messiness is
to be simulated in software, then the hardware will have to consist of
lots and **lots** of diamondoid processors or 3D nanotubes. Let's
hope we don't actually have to simulate biology at the molecular level --
the level at which life does start to look a little like a digital
computer, the level of DNA transcription and protein synthesis. But
Edelman himself says no, that won't be necessary -- in fact, he would say
that individual neurons would be an unnecessarily low level of
simulation, and that the appropriate level would be the level
of neuronal **groups**.
It seems a likely consequence of economics that if it really does
become possible to manufacture this sort of fine-grained digital
stuff (or "computronium", if you like), it will be necessary to churn
it out in huge quantities, just like chips are today, to make
the investment in such complex manufacturing processes economically
feasible (barring the availability of "bathtub nanotechnology",
which sounds a little far-fetched to me). Then, once the manure has
been spread, the seeds will sprout! On the other hand, if Bill Joy
has his way, maybe the Defense Department will impound the
technology in the interests of national security, its military
applications will be explored using our tax dollars, and Colossus
will be born in (top) secret :-< .
This archive was generated by hypermail 2b30 : Mon May 28 2001 - 09:59:44 MDT