This is at EET:
I can't determine whether it's hype or a genuine
breakthrough. If genuine, we have the potential for
a major near-term performance improvement for a
large and important class of algorithms.
I do not know if any AI/SI algorithms can be
mapped into this class, or if we can design a new
approach that maps into this class. I'm almost
certain that pattern-recognition algorithms can benefit.
If so, we could see a short-term factor-of-1000 performance
gain, which is equivalent to roughly 15 years of Moore's law. The article
also implies that the rate of improvement of this technology
will exceed Moore's law.
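For what it's worth, the "15 years" figure checks out as back-of-the-envelope arithmetic, assuming the common reading of Moore's law as a doubling every 18 months (my assumption, not the article's):

```python
import math

# Back-of-the-envelope check: how many years of Moore's law does a
# 1000x speedup represent, assuming a doubling every 18 months?
speedup = 1000.0
doubling_period_years = 1.5  # 18 months

doublings = math.log2(speedup)             # log2(1000), about 9.97 doublings
years = doublings * doubling_period_years  # about 14.9 years

print(f"{doublings:.2f} doublings, roughly {years:.1f} years of Moore's law")
```

So a factor of 1000 is just under ten doublings, or close to 15 years at that pace.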
Reading between the lines, the technology appears to
use the ability of optical components to operate on
a massive amount of data in parallel. Reading the
tea leaves another way, it may be an elegant way to
do high-precision analog computing in the optical
domain. Does anyone on the list have enough background
This archive was generated by hypermail 2b30 : Sat May 11 2002 - 17:44:12 MDT