Re: Blue Gene

Mike Hall (
Tue, 07 Dec 1999 01:27:27 -0500

"Robert J. Bradbury" wrote:

> If they are taking an example out of the GRAPE book, they are designing the
> instructions for molecular modeling.

Perhaps, but....
I read the IBM announcement on their web site, and a New York Times article that had a little more detail (though I can't vouch for its accuracy). Neither piece had enough detail to really say what IBM's design plans are, but among the few facts I gleaned were:

        The "instruction
        set" -- the total vocabulary of machine-language
        instructions a computer understands -- will number 57
        for Blue Gene, compared with about 200 for most RISC
        machines.  (NYT)

57 instructions is not a lot for a processor: the Intel 486 has around 170, and the IBM System/390 (extreme CISC) has over 400. Of course, these are general-purpose processors, and a special-purpose architecture like GRAPE probably has far fewer instructions; however, I would expect many of those instructions to be relatively complex to render in the hardware:
(From IBM's announcement):

                       "We call this new approach to computer
                       architecture SMASH, which stands for Simple,
                       Many and Self-Healing."

                       The SMASH architecture differs from existing
                       approaches in three ways:

                            It dramatically simplifies the number of
                            instructions carried out by each processor,
                            allowing them to work faster and with
                            significantly lower power and chip surface
                            requirements (the traditional approach is
                            to add complex features to gain
                            performance);

                            It will facilitate a massively parallel
                            system capable of more than 8 million
                            simultaneous threads of computation
                            (compared to the maximum of 5000
                            threads today);

                            It will make the computer self-stabilizing
                            and self-healing -- automatically able to
                            overcome failures of individual
                            processors and computing threads.

To me, this implies that their approach to the hardware design is unit simplicity for raw speed, and massive redundancy for fault tolerance and throughput. It seems to me they intend to handle the complexity at a higher level, in the software.
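To make that last point concrete, here is a toy sketch (my own illustration, not anything IBM has published) of what "self-healing in software" might look like: the same task is dispatched to many cheap, unreliable workers, and the runtime simply masks any individual failure by accepting a result from whichever worker survives. All names and failure rates here are invented for the example.

```python
import random

def run_on_worker(task, failure_rate=0.2):
    """Simulate one simple, unreliable processor.

    Returns the task's result, or None to represent a hardware fault.
    """
    if random.random() < failure_rate:
        return None  # this worker "died"; the runtime must cope
    return task()

def self_healing_run(task, num_workers=8):
    """Mask individual failures by redundancy: try the task on up to
    num_workers simple processors and accept the first good result.

    With independent 20% failure rates, all 8 workers failing has
    probability 0.2**8, i.e. about 3 in a million.
    """
    for _ in range(num_workers):
        result = run_on_worker(task)
        if result is not None:
            return result
    raise RuntimeError("all redundant workers failed")
```

The design point this illustrates is the trade the announcement hints at: each unit is kept dumb and fast, and reliability is recovered statistically, by throwing enough redundant units at the problem that the system as a whole almost never fails.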

> They may have sat back after
> doing the molecular modeling stuff and said, ok, now how can
> we turn the machine into a good data-mining computer, or a
> good speech recognition processor, or a good image processor
> and added some more instructions to round things out.

Maybe, but I've seen nothing in the published material that says this is anything other than a general-purpose machine. But again, the facts in these pieces are somewhat meager. I'd like to get a peek at the instruction set if they ever deign to publish it.

But even if I'm right, the task of designing software to make full use of the machine's capabilities may be so daunting that no one else will want to take it on, effectively making it a single-purpose machine. And this is likely the only one they will build, like Deep Blue.

Must sleep........

P.S. I apologize for my sloppy editing on my original post (which was truly my first post to this list).