We've even seen OO, though it will take another decade
to truly hit the mainstream. (If it had already, distributed
OO realtime embedded systems wouldn't be mere lab curiosities.)
What cometh our way next? Can we tell already?
And why should we care?
I believe we should. Though the computer monomaniac's
sensory homunculus habitually shows gross distortions
of projected reality (silicon featuring prominently),
the impact has been very real. More importantly, it will
grow even more pronounced in the near future.
If nothing else, one should know one's enemy.
A trivial fact: technologies don't spring into bright bloom instantly.
Usually, the TNG wonder gadgetry lurks in industry/academia
labs, ogled suspiciously, for years, if not decades, before it
finally flowers fully.
So probably the next solution, remedying problems which do not
plague us yet, is already out there. We are curious, naturally,
so what might its outlines be?
Since hardware and software are tightly coupled, as two spent
swimmers that do cling together, we can't expect too much
from the hardware sector too soon. Clearly, the era of monolithic
monoprocessor machines is drawing to an end. Oh yes, I know
that has been heralded for years. To no effect. But the SMP
business is quite there now. As the infamous i80xxx chip
line is but an artefact of clever marketing (excellent products
only if judged by the "maximization of profits over time"
success metric), it won't last forever. Having tasted the first
fruits of modern telecommunications, entertainment, and industrial
automation, we long for more.
I won't hazard a time-line, but...
With multithreading from the bottom up and PVM/MPI from the
top down, monolithic systems are starting to crumble, dispersing
themselves over threads, over CPUs, over nodes. DSPs with
multiple CPUs on die have appeared. TI's new TMS320C6x family,
with its low-complexity VLIW cores, on-die memory, and links,
is pointing in the right direction.
Since core on die is a must (due to vastly better access time,
and the lack of bonding pads with their associated silicon real
estate and power dissipation losses), yet chip yield must be
excellent to make the maspar system affordable, node complexity
will be low. This is even more stringent for wafer scale
integration (WSI), which will liberate us from the vagaries of
multilayered motherboards.
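For the flavour of that top-down dispersal, a minimal MPI sketch
in C (the work unit, a partial harmonic sum, is purely
illustrative, and the node count is whatever mpirun grants you):
one monolithic job, scattered over the nodes, partial results
reduced back onto node 0.

/* scatter.c -- one job dispersed over N nodes, each crunching
 * its own slice. Build with mpicc, run with e.g. mpirun -np 8. */
#include <stdio.h>
#include <mpi.h>

#define WORK 1000000L

int main(int argc, char **argv)
{
    int rank, size;
    long i, lo, hi;
    double local = 0.0, total = 0.0;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* which node am I?  */
    MPI_Comm_size(MPI_COMM_WORLD, &size);   /* how many of us?   */

    lo = rank * (WORK / size);              /* my slice of the job */
    hi = (rank == size - 1) ? WORK : lo + WORK / size;
    for (i = lo; i < hi; i++)
        local += 1.0 / (i + 1);

    /* gather the partial sums back onto node 0 */
    MPI_Reduce(&local, &total, 1, MPI_DOUBLE, MPI_SUM, 0,
               MPI_COMM_WORLD);
    if (rank == 0)
        printf("harmonic(%ld) = %f, on %d nodes\n", WORK, total, size);
    MPI_Finalize();
    return 0;
}

The same shape scales from threads to CPUs to nodes to wafers;
only the transport underneath changes.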
As integration densities rise, clocks go up, and signal levels
go down, geometrical constraints will start to bite. Ouch.
This means buses (lest they spring all bonds of modesty with
their girth) will have to go the way of all trash. Short
high-speed serial buses, first electrical, then optical,
will shuffle into their place. OS architecture will be forced
into the OOP nanokernel flavour. We will have virtual machines,
and JIT compilers. Since code density, due to small grain
size, will need to go up, we'll see a brief renaissance
of threaded code, which will force fast context switches
and a dedicated on-die hardware return stack. Rejoiceth,
ye Forth followers, for your finest hour is at hand.
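For the uninitiated, a toy sketch of the technique (a C
approximation with function pointers; a real Forth threads
addresses directly, and the literal-in-the-code-stream cast is
the customary portability sin): the program is a flat vector of
word addresses, and the whole inner interpreter is one line.

/* threaded.c -- a toy threaded-code inner interpreter.
 * Each "word" is an address; NEXT just fetches the next
 * address and jumps through it. */
#include <stdio.h>
#include <stdint.h>

typedef void (*word)(void);

static int stack[64], sp = 0;      /* data stack          */
static word const *ip;             /* instruction pointer */

static void lit(void)  { stack[sp++] = (int)(intptr_t)*ip++; }
static void add(void)  { sp--; stack[sp-1] += stack[sp]; }
static void dot(void)  { printf("%d\n", stack[--sp]); }
static void halt(void) { ip = NULL; }

int main(void)
{
    /* the "program": 2 3 + .  as a flat vector of addresses */
    static word const prog[] = {
        lit, (word)(intptr_t)2,
        lit, (word)(intptr_t)3,
        add, dot, halt
    };
    for (ip = prog; ip; )
        (*ip++)();                 /* NEXT, NEXT, NEXT... */
    return 0;
}

Note the code density: one pointer per call, no call/return
boilerplate cluttering the instruction stream.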
So your next desktop might feature hundreds of processors
(on one or several motherwafers), interlinked by wafer-local
serial links, with network routers/switches on die. We will
have a modest RAID, linked by fiber optics, and possibly
even solid-state holographic memory. The thing will listen
and speak. A DLP/DMD-like beamer device could be the output.
An augmented-reality head-up display will accomplish the same
much cheaper, however. People won't turn and stare anymore if
you don your designer wearable every morning. Whoever has
selected "beam" from the Newton communication menu knows
full well why the Borg have their photogenically blinking
semiconductor laser. It is so darn convenient. Besides,
intercepting a tightly focused face2face laser beam is
virtually impossible, while monitoring cellular traffic,
even if it is all encrypted garbage, is a piece of cake. And
a NIR laser is also handy to light your way, assuming a
wearable CCD camera.
The same universal optical links could bind the local web.
Overlapping local web patches will spin internet++, the next
Big Thing. Matrix perversion will become a very serious problem,
whatever hardware protection schemes and virtual machines
form our node-local firewalls. (A certain company, which
I will not name, has now developed a smart card which will
house and run applications written in virtual machine assembly.
To counteract possible perversion, both the specification and
the sources are kept secret. Ha-ha. As if one couldn't read
the bit pattern directly, with finite effort. Or simply run
a crashme. (Uh-oh, my wallet has crashed. I'm broke.) I'd
like to see the virtual machine that will terminate
randomly generated code cleanly.)
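For reference, the gist of crashme in POSIX C (the buffer size
is arbitrary, the object-to-function-pointer cast is nonstandard
but traditional, and on hardware with execute protection the
child dies of SIGSEGV before running a single random byte):

/* crashme.c -- feed the machine random byte salad as code and
 * record how gracefully it dies. A trustworthy VM would have
 * to survive this indefinitely. */
#include <stdio.h>
#include <stdlib.h>
#include <sys/wait.h>
#include <unistd.h>

#define LEN 64

int main(void)
{
    static unsigned char buf[LEN];
    int i, status;
    pid_t pid;

    srand((unsigned)getpid());
    for (i = 0; i < LEN; i++)
        buf[i] = (unsigned char)(rand() & 0xff); /* random "program" */

    if ((pid = fork()) == 0) {
        /* child: leap into the byte salad and hope */
        void (*fn)(void) = (void (*)(void))(void *)buf;
        fn();
        _exit(0);                   /* survived: exceedingly rare */
    }
    waitpid(pid, &status, 0);
    if (WIFSIGNALED(status))
        printf("wallet crashed: signal %d\n", WTERMSIG(status));
    else
        printf("child exited with status %d\n", WEXITSTATUS(status));
    return 0;
}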
Sounds neat, eh? Of course it's all wrong.
For one, I didn't mention reconfigurable hardware. Back in
1987, the need for virtual hardware was already obvious. Now
EHW, evolvable hardware, evolving mindlessly, promises to deliver
fuzzy solutions to our succinct specifications. So reconfigurable
hardware will start claiming a progressively more prominent piece
of silicon real estate. However, EHW is not enough. Darwinian
bricolage does not work well (heck, not at all) on brittle
systems, and logic gate assemblies, sadly, bristle with
brittleness. So we need a different paradigm, as, while our
sensorics is excellent and our motorics adequate, our processing
prowess is simply nonexistent. Our hardware must form a cozy
container for a tasty bitsoup, swirling madly in its GA dance,
mapping the sensoric vector to the motoric one. This soup must
not be brittle. This soup must run on a monster of a hardware,
featuring gigantic integration densities, ridiculous numbers
of identical elements, and an atomic hardware simplicity which
is trivial beyond belief. Virtual hardware will mend.
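A minimal sketch of that soup in C (population size, mutation
rate, and the target mapping, here the identity, are all
placeholders): every creature is a flat lookup table from a
4-bit sensor vector to a 4-bit motor vector, and the GA dance
is just tournament, copy, mutate.

/* bitsoup.c -- a toy GA over bitstring creatures. Any bit
 * pattern whatsoever is a legal creature: nothing is brittle. */
#include <stdio.h>
#include <stdlib.h>

#define POP    64
#define GENES  16   /* 16 sensor states, one 4-bit motor nibble each */
#define ROUNDS 2000

static unsigned char pop[POP][GENES];

/* fitness: how many sensor states hit the target motor response
 * (target here: the identity mapping -- a placeholder) */
static int fitness(const unsigned char *g)
{
    int s, f = 0;
    for (s = 0; s < GENES; s++)
        f += (g[s] & 0x0f) == s;
    return f;
}

int main(void)
{
    int n, i, a, b, w, l;

    srand(42);
    for (i = 0; i < POP; i++)            /* primordial soup */
        for (n = 0; n < GENES; n++)
            pop[i][n] = (unsigned char)(rand() & 0x0f);

    for (n = 0; n < ROUNDS; n++) {
        a = rand() % POP;                /* tournament of two */
        b = rand() % POP;
        w = fitness(pop[a]) >= fitness(pop[b]) ? a : b;
        l = (w == a) ? b : a;
        for (i = 0; i < GENES; i++) {    /* loser replaced by */
            pop[l][i] = pop[w][i];       /* a mutated winner  */
            if (rand() % 20 == 0)
                pop[l][i] ^= (unsigned char)(1 << (rand() % 4));
        }
    }

    w = 0;                               /* report the champion */
    for (i = 1; i < POP; i++)
        if (fitness(pop[i]) > fitness(pop[w]))
            w = i;
    printf("best creature: %d/16 mappings correct\n", fitness(pop[w]));
    return 0;
}

Note what is absent: no syntax, no types, no brittle structure
for a mutation to shatter.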
What for?
We need cleaning robots, for the flat, the street, and
the PV array. We need nannybots, to cater for the sick and
the elderly. We need heavy-duty machines which will prospect and
mine, and delicate agribots which can pluck a fruit fly from flight.
We need realtime neural compression and decompression. We need
systems producing solutions, to our specifications, on time.
Like yesterday. We need surplus neural performance and bandwidth,
skimmed from the web as the daylight load flood passes on and
leaves a nocturnal low-load tide, to be thrown at the problems
at hand. Like building even better mousetraps. To catch your
molecular switch, and the bit of string.
Why are we working so hard at making ourselves obsolete?
I don't know. Probably it's simple self-destruction. Don't
judge the neo-Luddites too harshly. Sooner or later, we will
all face _personal_ white-collar unemployment. Whether ubiquitous
welfare is compatible with free markets is open to doubt. What
is not open to doubt is that a closed autoreplication loop
does not need flesh to operate. By then, you are either an
integral part of the machine, or obsolete.
And we all know what happens to low-fitness species. Rapid extinction.
So root hog, and/or die.
ciao,
'gene