Re: Optical DSPs promise tera-ops performance

From: Dan Clemmensen (
Date: Tue Oct 09 2001 - 17:56:43 MDT

James Rogers wrote:

> On 10/8/01 7:07 PM, "Dan Clemmensen" <> wrote:
>>I can't determine whether it's hype or a genuine
>>breakthrough. If genuine, we have the potential for
>>a major near-term performance improvement for a
>>large and important class of algorithms.
> Current top-end DSPs do tens of Giga-ops, so Tera-ops isn't a huge
> performance stretch, particularly since the technology won't be available
> for many years.
> Unfortunately, DSPs have a limited algorithm space largely because they have
> very limited RAM capabilities, being unable to effectively address even
modest quantities of RAM. Given that DSP manufacturers don't have any
> super-secret process technology, it is pretty obvious that they are giving
> up something to fit the extra processing units on the die. AI-type work
> needs tons of RAM and proper hardware memory management, so at best DSP
> might be useful for offloading certain types of computationally intensive
> operations (e.g. real-time audio analysis).

True but perhaps not relevant. It's clear from the article that the
innovation is not really a DSP: it's more analogous to an ASIC in
one form and an FPGA in the more interesting form. The references to
DSPs in the title and in the article refer more to the devices that
will be replaced than to the new technology. The hot new
bleeding-edge application for the DSP is software-defined radio,
which in its extreme form implies direct digital synthesis of the
RF waveform without an IF stage. The new technology appears to be
able to perform "transforms" in the optical domain: a "transform"
(such as the FFT) can apparently be modeled as a massive combinatorial
net, and these folks appear to believe that an equivalent of this
net can be implemented optically. The only "DSP" part of all this
is the stuff that performs the relevant electrical-optical conversions.
The major claimed advantages are cost/TOP and power/TOP. Presumably, an
AI program would run on a general-purpose processor and control a bunch
of these thingees, each implementing a fancy pattern-recognition
or discrimination transform.
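
To make the "combinational net" point concrete: here's an illustrative
sketch (my own, not from the article) of a radix-2 FFT in Python. The
recursion unrolls into a fixed network of adds and twiddle-factor
multiplies -- the "butterflies" -- with no data-dependent control flow,
which is exactly the kind of static dataflow net one could imagine
mapping onto an optical medium.

```python
import cmath

def fft(x):
    """Radix-2 decimation-in-time FFT (length must be a power of 2).

    The recursion unrolls into a fixed "butterfly" network of adds
    and multiplies with no data-dependent branching -- a static
    combinational net, not a sequential program.
    """
    n = len(x)
    if n == 1:
        return x[:]
    evens = fft(x[0::2])  # sub-net over even-indexed samples
    odds = fft(x[1::2])   # sub-net over odd-indexed samples
    out = [0j] * n
    for k in range(n // 2):
        w = cmath.exp(-2j * cmath.pi * k / n) * odds[k]  # twiddle multiply
        out[k] = evens[k] + w             # butterfly: upper output
        out[k + n // 2] = evens[k] - w    # butterfly: lower output
    return out

# A unit impulse transforms to a flat spectrum of all ones.
print(fft([1 + 0j, 0j, 0j, 0j]))
```

The whole computation is a fixed wiring diagram of O(n log n)
butterflies, so once frozen in hardware (or, per the article's claim,
in optics) it runs at whatever rate samples can be pushed through it.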

This archive was generated by hypermail 2b30 : Sat May 11 2002 - 17:44:12 MDT