Re: How to help create a singularity.

From: Eugene.Leitl@lrz.uni-muenchen.de
Date: Mon Apr 30 2001 - 03:06:12 MDT


James Rogers wrote:

> On one hand, Python and Java are great learning and implementation

I would suggest using Python over Java, especially since there's
Jython (a Python implementation written in Java). Yes, in case you've
been wondering, Guido pays me to plug it at every opportunity.
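
For a taste, Jython lets you drive the Java class library directly
from Python source. A minimal sketch (java.util.HashMap is just a
stand-in for any Java class):

    from java.util import HashMap

    m = HashMap()            # a real java.util.HashMap underneath
    m.put("answer", 42)      # Python ints convert to Java Integers
    print m.get("answer")    # prints 42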

> languages, hiding a lot of nasty details from the programmer and generally
> making clean, tight implementations easy. The problem is that most new

Fewer lines of code, easily readable, cleanly structured, with a small
price to pay at execution time. What more could a hacker's heart want?
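
To make that concrete, a word-frequency counter in a handful of lines
(a minimal sketch, reading from stdin):

    import sys

    counts = {}
    for line in sys.stdin:
        for word in line.split():
            counts[word] = counts.get(word, 0) + 1
    for word in counts.keys():
        print word, counts[word]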

> programmers seem to stop at that level of abstraction. After all, what do
> you need C for when you can program in Java? Many CS programs, particularly

I don't know about Java (can it interface to C easily at all?), but
many Python projects use C modules, since Python has been designed to
interlock well with C, and almost every nontrivial project uses a mix
of embedding and extending Python (CPython, the canonical
implementation, is written in ANSI C). Tools like SWIG automate the
wrapping chores that would be repetitive and tedious to do by hand.
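
The SWIG workflow, roughly (example.c and its fact() function are the
running example from the SWIG documentation, not anything above; exact
compiler flags vary by platform and Python version):

    # swig -python example.i                 (emits example_wrap.c)
    # cc -shared example.c example_wrap.c -o example.so
    #
    # After that, the C function is an ordinary Python callable:
    import example
    print example.fact(5)    # runs the C implementation; prints 120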

> the compressed ones, have completely discarded assembly language and many
> have either dumped C or made it optional.

You mean, C is not an assembly language? Just kidding ;)
 
> My company has been hiring recent graduates of these programs, all of which
> have proven to be bright and productive programmers, but whose "lowest"
> level language experience is Java. While this works perfectly most of the
> time, a problem occasionally emerges in that they all program for an
> idealized hardware platform -- they are completely unaware of how the
> software they write *actually* interacts with the hardware they work on.

I've found a source-level debugger (I use ddd) with a
machine-instruction view of the traced program to be quite
pedagogical. x86 assembly stinks (as opposed to Alpha or StrongARM),
so I don't intend to learn it if I can avoid it, but it's nice to know
that you can, and that the debugger will assist you in it.
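
For reference, the same view from plain gdb (ddd is a front end to
gdb, so these commands work in its console too; a.out stands for
whatever binary you are tracing):

    gdb ./a.out
    (gdb) break main
    (gdb) run
    (gdb) disassemble        # machine-instruction view of the function
    (gdb) stepi              # advance one machine instruction
    (gdb) info registers     # inspect the CPU state as you go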

> Most of the time this isn't an issue, but when we start to push hardware
> limits, the old farts need to be called in to take care of architecture and
> design issues that aren't obvious to people who aren't very familiar with
> the hardware.

Not surprising. This must be pretty bad in defense and aerospace, where
you need tight, reliable, hard realtime code.
 
> I am not saying that C should be a first language, but I think a strong
> argument can be made that it should be your second or maybe third. I never
> use C for general application programming, but there are many things I work
> on that could not be done well without it.

When I say Python, I almost always mean Python/C. The two are quite
complementary.
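
The division of labor is the usual one: prototype in Python, profile,
and push only the proven hot spot down into C. A sketch (dot() is a
made-up example, and _fastdot a hypothetical C extension module with
the same signature):

    def dot(a, b):
        # pure-Python prototype: slow, but correct and readable
        s = 0.0
        for x, y in zip(a, b):
            s = s + x * y
        return s

    # once the profiler blames dot(), swap in the C version:
    # from _fastdot import dot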

One of the nicer points of Forth on Forth hardware is that you can
descend to the barest bits and ascend to very high level code in a
heartbeat. And given that a Forth CPU fits into a screenful of VHDL
and (if manually translated) compiles down to 10-20 k transistors, it
cloaks the bare silicon level surprisingly well for something so small.

One of the things that convinced me that IT is very irrational is
that Lisp machines and Forth machines are either extinct or have
retreated into very deep niches, becoming all but invisible.</lament>


