Re: Homeless, (+ Jobs, Lots of stuff about Software world)

From: Eugene Leitl (eugene.leitl@lrz.uni-muenchen.de)
Date: Sun Sep 17 2000 - 02:38:39 MDT


Samantha Atkins writes:

> I doubt the "we create more jobs one level up" idea or that it
> ameliorates the problem much. I doubt we created more programmer jobs
> than the total number of factory workers, secretaries, typists, human
> number twiddlers and so on displaced by automation. Even if we did

Displaced people find new jobs. In places with a good economy there
are not enough construction workers or people who can use a metal
lathe (what is the unemployment figure in California right now? 5%,
less?). There are postindustrial patches of the world with 15%
unemployment, but you'll notice that no one there is hungry, either
(with the possible exception of unemployed single mothers -- rather
dumb, as raising an uneducated child is a net loss to society. Smart
people are a country's greatest resource).

> these next-level jobs are requiring progressively higher skills and higher
> intelligence. Looking at the bell-shaped curve for human IQ leads me to
> the conclusion that a lot of people are not going to qualify for these
> next-level jobs. What should happen to them?
 
It's less IQ than attitude (and age). People who won't or can't learn
should retire early, so as not to stand in the way of, say, bright
educated immigrants. We're supposed to be rich, right? Why can't we
retire people early, in more or less style?
 
> You are seeing them in Europe more than in the US. In the US people on
> welfare, and the chronically unemployed and a few other categories are
> not counted. So the statistics are a bit misleading. Also, I wonder
> how many jobs in the US are there simply to keep people employed and

A person employed is one less person on the dole. If the state doesn't
do it, private companies should. Many people define their identity by
their employment, and quickly die if fired, even if able to afford a
modestly wealthy retirement.

> really are pretty pointless today. From what I've seen as a consultant
> in large corporations, the number of such is not small.
 
Do you have any q&d (quick and dirty) solution to the problem? I sure as hell don't.

> We are in agreement on the problem. I am not sure what to do about it.
> We are at an interesting place. There is not quite enough wealth to
> both give everyone a good guaranteed income and let them decide what if
> anything to do with their time including working for more money if they
> have the skills needed. We're arguably not quite there yet. But

I would doubt that. Trouble is, a guaranteed minimal income will
reduce productivity. Right now quite a few people in Germany will not
bother to work, because they would not earn noticeably more than they
do on the dole.

> closer. And a pretty good argument can be made that without
> concentrations of wealth in private hands there is no energy/resources
> to enable many types of innovation. Of course there are also

Good point. Robbing the rich to feed the poor makes us collectively
poorer. We'll never make it into space if there's not enough loose
cash in the pockets of wealthy geeks. I think there *is* a window of
opportunity, and we have already let a lot of know-how (there is a
reason it's called "rocket science"), trapped in people's and groups
of people's heads, drift to /dev/null.

> innovations dying in 9-5 jobs taken just to pay the rent that don't
> leave much room or energy for the passions and true interests of many
> workers.

A nominal 9-5 (or part-time) job will pay the rent and leave time for
hobbies. If I can't do truly interesting things in my job, I might as
well let the nominal day job pay my rent and do the interesting things
in my spare time.

> It will be worse than bad if most of the energy goes into endless
> competition for more material wealth when material wealth is no longer
> that difficult or central. It will also likely be bad if

What motivates 95% of people is money, and the material wealth this
money can buy, which will eclipse what other people have. These are
the boundary conditions you have to work with. If this means we have
to paint McDonalds ads on rockets or send rich tourists to LEO, so be
it. Millions of people playing games and running bloatware apps push
supercomputer clusters forward. Don't diss the unwashed, they drive
progress, albeit in their inefficient, roundabout way.

> people/groups/nations are bent on destroying or exploiting one another
> with those kinds of abilities in their kit.
 
Coevolutionary, artefact-mediated competition brought you the goodies,
so you have to tolerate a little rat race. Corporations fight with
lawyers and on the market, not with cluster bombs, fuel-air explosives
and biological weapons. Better to let corporations fight than nations.
 
> Don't bet on it. What changes between then and now is the raw power of
> the hardware. As it becomes more powerful it becomes more tractable to
> automate large segments of the work programmers currently do. Of

Sure, now it's easier to make a GUI, by just painting it on the
screen. Program generators are not exactly new. Apart from wizards
like our very own James Rogers (and scientific tools like ATLAS,
automatically juggling source to optimize for a given architecture),
we won't see automatic programming hitting the streets any time
soon. And even then people would still have to write specs in a formal
language. Even if you don't write a protocol stack explicitly, you
still have to codify its behaviour.
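To make that last point concrete, here is a minimal sketch (in Python,
which I settle on below) of what "codifying behaviour" amounts to: a
toy stop-and-wait protocol written as a declarative transition table
that a generic driver interprets. Every state, event and name here is
invented for illustration; a real spec language would be far richer.

    # Toy illustration: even "generated" code needs a formal spec somewhere.
    # A trivial protocol, codified as a transition table.
    SPEC = {
        # (state, event) -> (next_state, action)
        ("idle",    "send_request"): ("waiting", "transmit"),
        ("waiting", "ack"):          ("idle",    "deliver"),
        ("waiting", "timeout"):      ("waiting", "retransmit"),
    }

    def step(state, event):
        # advance the machine; refuse events the spec does not allow
        try:
            return SPEC[(state, event)]
        except KeyError:
            raise ValueError("spec violation: %r in state %r" % (event, state))

    state = "idle"
    for ev in ["send_request", "timeout", "ack"]:
        state, action = step(state, ev)
        print("%s -> %s (%s)" % (ev, state, action))

The "automatic programming" part is only the generic driver; the table
is still a human writing down behaviour, just more compactly.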

> course, the demand for programming is so huge that it will take a while
> before the automation cuts into the need for programmers too much. But
> the level of programmer needed will continue to rise also, stranding
> many who are now in that profession. Don't believe me? Wait and see.
 
Sure as hell you'll find easier employment if you know Javur and XML;
the whole web monkey business seems to just disintegrate into fractal
bric-a-brac. Know many 40-year-olds who can pick up Linux/Perl/Zope,
or learn Python in a weekend? Even if they do, how many interviewers
will believe them? Of course, a *real* C programmer in a Unix
environment is still a force to be reckoned with, but I haven't seen a
lot of job ads asking for these.

> Actually, having been in the software world myself for 20 years, I
> suspect the truth is a bit dimmer than that. Many of the tools I use
> have hardly evolved at all in all of that time. C led to C++ but
> the tools used to graft C++ have not exactly gone through any major
> revolution. C++ itself is quite primitive in many ways. Java?

Jeez. If you think C++ is an improvement upon C, you really have a
strange mind. I do not see anything new cropping up since
Lisp. Because I can't have a Lisp machine in current technology (and
am too dumb/poor to afford a DSP cluster running Forth), I've settled
on Python/C in an OpenSource *nix environment (currently Linux).

> Interesting things have been done to exploit some of its features but
> the language itself is not that powerful and not sufficient for many
> types of problems. Any interpreted or semi-interpreted language with

What else did you expect of object-oriented C? Ok, it's got garbage
collection, around the turn of the millennium, holy Alonzo.

> equal or more reflection could be used in most of the contexts that Java
> is used. Some of these languages, such as Lisp and Smalltalk, are or
> have been much more powerful and advanced in capability, usage, or
> development environment than Java, C++, VB and so on are today. Most of
> the central abilities in languages were first invented and explored in
> Lisp.
 
Absolutely, but you're not exactly employable if you tout Lisp as your
primary development language. Smalltalk is hardly better, though it is
still being used in niches.
 
> We are beginning to address problems of programming in the large but
> frankly many of the solutions are giant kludges that are severely
> over-hyped and over-sold. I have gotten quite disgruntled with this
> industry. We spend more time trying to lock up "intellectual property"

Amen, verily, etc. etc.

> and out-hype the competition than we do actually designing and building
> good systems. And fixing our development tools themselves takes a
> backseat to even that. I designed and built things in the 80s ( and I
> am not unique in this at all) that are as or more advanced than some
> parts of the current highly-hyped baseline.
 
I've seen some nifty packet-driven realtime stuff on an embedded
system. However, as people keep the nifty stuff closed-source as a
competitive advantage, the field as a whole won't go anywhere.
 
> Sorry. Most of that is an aside and off-topic. I needed to rant. But
> personally I don't think software development will get significantly
> better until something like Open Source (better add Open Design) and
> changes in the basis of software business occur. I don't see how the
> current model has room to get out of its own way.
 
Luckily, you can make a (probably relatively meager by your standards)
living doing OpenSource support and development already, though I have
not checked that in person (but Ars Digita & Co seem to be doing
nicely).

> [anti VB rant, amen]
> Putting various things in the OS is not a very bright idea. The things
> in the OS should be the minimum that can be handled more efficiently and
> cleanly there. A web browser is not an example of such.
 
Unfortunately, even the relatively clueful Linux people do not
understand the importance of having a (10-15 kByte) asynchronous OO
message-passing nanokernel at the core of Linux. Things like embedded
systems and PDAs, and supercomputer clusters, especially those with
embedded memories (forthcoming), are niches, but they'll be closed to
Linux if it goes further towards Bloatland.
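In case "asynchronous OO message passing" sounds abstract: here is a
user-space Python toy (not kernel code, and every class and name here
is made up) showing only the style -- objects never call each other
directly, they post messages into per-object mailboxes and a tiny
scheduler pumps them.

    class Actor:
        def __init__(self):
            self.mailbox = []
        def send(self, msg):
            # asynchronous: just enqueue and return immediately
            self.mailbox.append(msg)
        def handle(self, msg):
            raise NotImplementedError

    class Logger(Actor):
        def handle(self, msg):
            print("log: %s" % msg)

    def run(actors, max_steps=100):
        # minimal round-robin scheduler: one queued message per actor per pass
        for _ in range(max_steps):
            delivered = False
            for a in actors:
                if a.mailbox:
                    a.handle(a.mailbox.pop(0))
                    delivered = True
            if not delivered:
                break

    log = Logger()
    log.send("hello")
    log.send("world")
    run([log])

A real nanokernel would do this for every subsystem, in a few kBytes,
with hardware interrupts feeding the mailboxes.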
  
> VB will not allow you to do most of these things. Back when I (shameful
> to admit) hacked some VB I had to write anything interesting in C++/DCOM
> and then do the GUI in VB and have it call the C++ COM objects. It
> works, but is totally proprietary to Microsloth.
 
I don't touch proprietary systems, period.
  
> You must be in a very different world or you don't see beneath the
> wizards and the push-button IDE. The underlying stuff has largely not

God, we have had Unix for three decades. We've had the Lisp machine,
UCSD-p on CP/M, Forth, Smalltalk boxes, we've had the Apple ][, the
Lisa and the first Mac, and Amiga, and NeXT, and several others, and a
lot of this happened before 1990. Apart from raw performance (granted
by the economies of scale that come with being the accepted industry
standard), Wintel as an architecture is a joke.

> improved at all except for the addition of binary components using CORBA
> and COM. Even those two are not a great improvement over a
> distributed/persistent peer-peer object application environment I
> created in 1986-1987. It was unfortunately ahead of its time and
> created under the auspices of a company that had no idea what to do with
> it.
 
If it's mainstream, it can't be any good.
 
> Our present underlying compiler and linker technology is not much better
> than it was then. Linkers have changed almost not at all. DLLs? Came

You've lost already when you talk about compilers and linkers. I much
prefer interactive incremental "compilers", especially Forth on
dedicated hardware (you can implement a feisty Forth CPU in ~12
kTransistors, and a complete OS with IDE in ~10 kBytes, though much
more advanced systems can take up ~100 kBytes or so, GUI included, of
course).

> out in the mid 80s. The productivity tools for programmers leave a lot
> to be desired. In the Microsloth world to browse the call graphs of
> functions and objects the software entity must first be fully compiled
> with a bunch of special purpose flags set. From one component you can't
> browse into such details of another one. The information is not unified
> into some database you can query about various software artifacts and
> their interaction and inter-dependencies. What data is gathered is
> in a Microsoft proprietary format that you cannot use to develop
> something more intelligent. Yet Lisp and Smalltalk environments have had
> such abilities for the last decade or even two. I wrote such an
> information extractor myself for some <gasp> Cobol legacy stuff I got
> stuck with once in 1984.
 
You see, those people don't even understand the words coming out of
your mouth. We live in a world where Bill Gates has invented
computing, software, and the Internet, singlehandedly. Saying anything
else will only convince them that you're 1) crazy 2) lying, or both.

> The reason it is getting more difficult is the systems needed are more
> sophisticated and the tools and infrastructure have not kept up with
> them. It is difficult to pour energy into better programming tools when
> that kind of product doesn't pay off so well at the bank to make the VC
> happy. It is difficult to build them in house without the sponsoring
> management looking like they don't have an eye on the bottom line. So
> we race faster and faster armed with inadequate tools and ever more
> pressing requirements, busily trying to automate everyone else's work
> except our own.
 
Good programming resembles bonsai care. Not many people are keeping
bonsai today.
 
> How does this tie-in to extropian interests? The future and the tech
> we all so crave is built largely on top of software. Keep most software
> proprietary and don't invest in better software tools and the future
> will be much more stunted than it could/should be.

Not to mention system security.

> It is more cranky and brittle than it needs to be in part because of the
> problems I mentioned. There are also some real bearish problems in some
> of our currently dreamed up systems. Things that will take real R&D

Parallelism, especially debugging massive parallelism in tiny grains,
is certainly a formidable problem. It might even be too formidable for
a human mind.

> projects to solve and then under the current model would come out as a
> bunch of hacked up tools positioned to maximize profit instead of being
> shared across the entire industry that needs them so desperately.
 
OpenSource is moving, albeit slowly. No need to despair yet.

> > People have been worried about the contrary point of view; that our systems
> > are getting so big, so unwieldy, that at some point we cross a failure
> > threshold, beyond which we cannot, as a bunch of humans, reliably maintain
> > the systems any more. Why would this be true?
>
> Without the proper tools and without more and better automation it is
> inevitable.
 
I find it strange that a person in the software industry would find it
strange that there is a limit to the complexity a team of humans can
handle. There is overwhelming evidence that this is the case.

> My greatest expertise is in object persistence. Persistence is far, far
> from "automated". Persistence cross-cuts applications and products but
> is often done as a series of hacks within a particular project
> life-cycle. Or a product is bought that promises to take the worries

OO is far from being the silver bullet, i.e. code reuse by inheritance
from former projects does not seem to scale.

> all away but actually seriously perverts all application building
> thereafter because its needs have to be met for the application to work
> at all and its needs are too perturbing to everything else. And the
> solution ties the product and the organization often to the solution
> provider firmly. At the moment there is not a good persistent
> middleware out there that fully meets what is needed. There are various
> attempts of greater/lesser goodness. I plan to do large parts of that
> problem better and to eventually release a series of Open Source
> persistent middleware tools. I am tired of seeing ugly solutions to
> this set of problems I know well.
 
I'm interested in your opinion of http://www.python.org
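Part of why I keep pointing at Python for this problem space: the
language ships with pickle and shelve, which already give you naive
object persistence for free. A minimal sketch (toy-level, nothing like
the persistence middleware you describe; the class and file names are
just examples):

    import shelve

    class Account:
        def __init__(self, owner, balance):
            self.owner = owner
            self.balance = balance

    # store a live object behind a dictionary-like file
    db = shelve.open("accounts.db")
    db["sample"] = Account("Samantha", 100)
    db.close()

    # later: the object comes back unpickled, ready to use
    db = shelve.open("accounts.db")
    acct = db["sample"]
    print("%s: %d" % (acct.owner, acct.balance))
    db.close()

Of course this buys you nothing about transactions, schema evolution,
queries or cross-language access, which is where the real middleware
pain lives.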

> We have automated certain classes of GUIs but the GUI story is far from
> complete or adequate. Many projects are now being seriously perverted
> to use only a Web Browser as their GUI! It is as if we are stepping

Well, a web browser is a ubiquitous, easy way to control a system,
and the server end can be done within a few hundred bytes of
assembly. A remote GUI is not such a bad thing.
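As a sketch of how little it takes (in Python rather than assembly, so
much fatter; the handler name and port are arbitrary), a
browser-reachable status page is a few lines on top of the standard
library:

    import BaseHTTPServer

    class ControlHandler(BaseHTTPServer.BaseHTTPRequestHandler):
        def do_GET(self):
            # serve a trivial status page to any browser that asks
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write("<h1>Device status: OK</h1>")

    BaseHTTPServer.HTTPServer(("", 8000), ControlHandler).serve_forever()

Point any browser at port 8000 and you have a remote GUI, no client
software to install anywhere.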

> firmly into the past and putting more and more energy into centralized
> servers even though we have more power on our desktops than we dared
> dream of in the very recent past. We need good massively distributed
> peer-peer systems. Not a sleazy 21st century rework of time-sharing
> systems.
 
Interest in peer-to-peer information sharing (though not yet
collaboration) has been rising recently. Parallel applications (based
on PVM/MPI message passing libraries), including cluster file systems,
are fairly widespread in science, and soon in commerce.
High-availability and high-performance clusters for commerce are the
hottest topics right now. Things are not hopeless.
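For anyone who hasn't seen the style: MPI programs are just ranked
processes exchanging explicit messages. A hedged sketch using mpi4py,
one of the Python bindings to MPI (assuming it is installed; the C API
reads much the same):

    # run with something like: mpirun -np 2 python thisfile.py
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    if rank == 0:
        # rank 0 sends a pickled Python object to rank 1
        comm.send({"payload": "hello from rank 0"}, dest=1, tag=0)
    elif rank == 1:
        msg = comm.recv(source=0, tag=0)
        print("rank 1 got: %s" % msg["payload"])

Crude, but it scales from two processes on a desktop to a few thousand
nodes on a cluster without changing the programming model.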
 
> Sure. Although of a quite different sort. What disturbs me is how
> often I am still doing the same tired old tasks in much the same tired
> way. There is not often enough time to both meet the current
> over-inflated deadline for an underdesigned product and automate as much
> of my own process as would satisfy me.

Thanks for reminding me why I decided not to become an IT whore.
 
> > Sure the techniques will change. Sure the skill set required will change.
> > But the basic programming job will remain, and grow wider in my opinion.
> > Lots of things will begin to look like programming in the future, which do
> > not now. Biotech might get to a point of "automation" where it starts using
> > programmers. Through nanotech, even the bricks & mortar world will start to

Molecular nanotechnology and molecular biology do not require
programmers, or at least they require a very special brand of
programmer. You need to cover 2-3 fields and not be a narrow
specialist, or else collaborate tightly with other specialists, which
will necessarily lack certain synergies.

> > become a programming concern.
>
> We don't yet do enough with well-defined and trusted components and with

You sure as hell can't trust proprietary components, that's for sure.

> good tools for finding the right components and simulating their
> interaction. Much of our code base is still language and OS dependent
> and not componentized at all. Most of code is still application
> stovepipes with little/no reuse or reusability. In short, almost no
> automation or next-level engineering applied to our own work. It had
> better not continue like this.
 
I don't see much happening in the evolutionary vein; we will probably
only see progress when evolutionary algorithms become productive, and
when that will happen is right now utterly unpredictable.



This archive was generated by hypermail 2b29 : Mon Oct 02 2000 - 17:38:17 MDT