RE: Human minds on Windows(?) (was Re: Web site up! (GUI vs. CLI))

Billy Brown (ewbrownv@mindspring.com)
Wed, 7 Jul 1999 22:24:41 -0500

> > CountZero <count_zero@bigfoot.com> said
>
> > ... Windows (tm) ...,
> > is a bloated mess, likely impossible for any single human to understand,
> > _it works_, it gives me what I want at the moment and I'm more than
> > willing to throw hardware at it as long as the hardware is cheap since
> > the alternative is to wait (possibly for a long time) to get the same
> > capabilities in properly optimized code.
> >
> This is a *very* *very* scary thought. Since we can expect the
> hardware to keep getting cheaper at least through 2012 (when they
> hit the five atom gate thickness limit), then probably transition
> over to nanotech (whence comes 1 cm^3 nanocomputers) -- the implication
> is that we will have an extended period in which to develop increasingly
> sloppy code.

Actually, Microsoft's defects-per-LOC figures (the only comparison of code quality that really tells you anything) are in the upper 30% of the software industry. The reasons why their products often seem poorly written have nothing to do with code quality - their problems lie in other areas (such as user interface design).

However, you have definitely hit on the single biggest challenge facing the software industry today. Simply put, it is not possible for humans to write 100% defect-free code. Faster computers allow you to tackle more complicated problems, but that leads to ever-bigger programs. As the programs get bigger, more and more of your development effort gets diverted into getting the number of defects down to an acceptable level. Eventually you reach the point where every time you fix one bug you create another, and it becomes impossible to add new features to your program without breaking old ones.
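To put a toy model behind that intuition (the numbers here are illustrative assumptions, not measured figures): suppose each attempted fix removes one defect but has some probability p of injecting a new one. As the code base grows and couplings multiply, p climbs toward 1, and the defect count stops falling no matter how much effort you spend. A few lines of Python make the dynamic concrete:

    import random

    def fix_cycle(defects, p_inject, attempts):
        # Toy model: each fix attempt removes one defect but may
        # inject a new one with probability p_inject.
        for _ in range(attempts):
            if defects == 0:
                break
            defects -= 1                    # the fix itself
            if random.random() < p_inject:
                defects += 1                # regression caused by the fix
        return defects

    print(fix_cycle(100, 0.2, 500))     # small program: usually reaches 0
    print(fix_cycle(10000, 0.95, 500))  # near break-even: barely moves

At p_inject = 0.95 the expected net progress is 0.05 defects per attempt, so 500 fixes barely dent a backlog of 10,000 - which is exactly the "fix one, break another" treadmill described above.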

Judging from the data currently available, this effect comes into play when the number of human-generated instructions gets into the 10^7 - 10^8 LOC region. High-level languages should therefore make it possible to write bigger programs (because each human-entered instruction generates far more machine code), but the level of abstraction of these languages is rising only slowly. If we want to actually exploit the processing power we're going to have in 10-20 years, we need to get to work on this problem *now*, instead of sitting around pretending it doesn't exist.
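A rough back-of-envelope check shows why that region is where things break down (the defect density is an assumed, commonly cited industry figure, not one from this thread):

    # Assumption for illustration: ~1 residual defect per KLOC shipped,
    # a typical post-release figure for a competent team.
    for loc in (10**6, 10**7, 10**8):
        print("%.0e LOC -> %d residual defects" % (loc, loc // 1000))

At 10^7 LOC that is on the order of 10,000 defects still in the shipped product, and the maintenance arithmetic from the toy model above starts to dominate the development effort.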

Billy Brown, MCSE+I
ewbrownv@mindspring.com