Robert J. Bradbury wrote:
> > Billy Brown[SMTP:email@example.com] wrote:
> > On a tangential note, you seem to think that large programs are
> > bad in some sense. Why? A high-quality program is one that does the
> > things I want it to do in a simple and cost effective manner. As long as
> > you have that, who cares if the program is 5 KB or 5 MB?
> Oh boy, we can tell what era you learned to program in... :-)
> The first versions of UNIX ran in < 64KB. Compare the Linux kernel
> today with UNIX 25 years ago. It doesn't do that much more but
> it sure is a lot bigger. [And UNIX systems 25 years ago were
> supporting 10-30 users simultaneously!]
Yes, I know. I learned to program on a TRS-80, and my programs were rarely more than 1K. However, times change.
Twenty years ago it made sense to lavish extravagant effort on a quest for small, efficient code. The machines of the day were so slow that if you didn't, you couldn't make a useful program. Unfortunately, most programmers still seem to be stuck in that era.
With modern hardware you don't need to do that anymore. Instead of devoting 50 man-years to building a hand-coded assembly program that runs in 50 KB of memory, you can spend 1 man-year writing a program that does the same thing but needs more like 500 KB. The idea that there is something wrong with that approach is a serious mistake: the scarce resource today is human effort, not memory. The same goes for speed optimizations.
The smart way to write an application today is to concentrate your efforts on good overall design, appropriate feature sets, user interface issues, etc. If the final result is a little slow, you can always go back and re-write the most resource-intensive 1% of the program. Most of the time you won't even need to do that, because your program isn't going to put a serious load on a modern system. Either way, you will spend far less money and end up with a much better (in the user's opinion) program than you would if you obsessed over efficiency issues.
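The "rewrite the hot 1%" step depends on measuring first rather than guessing. As a minimal modern sketch (in Python, using the standard-library profiler; the function names and workload here are invented for illustration, not from the original post), you profile the whole program, find the function with the most self-time, and rewrite only that:

```python
import cProfile
import pstats

def slow_inner(n):
    # Deliberately naive hot spot: sums squares in a Python-level loop.
    total = 0
    for i in range(n):
        total += i * i
    return total

def fast_inner(n):
    # The targeted rewrite of that same 1%: closed-form sum of squares.
    return (n - 1) * n * (2 * n - 1) // 6

def application(inner, n=100_000, calls=50):
    # The other 99% of the "program": cheap bookkeeping around the hot loop.
    return sum(inner(n) for _ in range(calls))

def hottest_function(workload):
    # Profile the workload and return the name of the function with the
    # largest self (internal) time -- the candidate worth rewriting.
    profiler = cProfile.Profile()
    profiler.enable()
    workload()
    profiler.disable()
    stats = pstats.Stats(profiler)
    # Each stats entry maps (file, line, funcname) -> (cc, nc, tt, ct, callers);
    # tt (index 2) is self-time.
    name, _ = max(stats.stats.items(), key=lambda kv: kv[1][2])
    return name[2]

# The optimized version must compute the same answer before it replaces
# the slow one.
assert application(slow_inner) == application(fast_inner)
print(hottest_function(lambda: application(slow_inner)))
```

The point of the sketch is the order of operations: the naive version ships first, the profiler identifies `slow_inner` as the bottleneck, and only then is effort spent on the closed-form rewrite.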
Billy Brown, MCSE+I