FYI:LONG:on reg day, spring 98, I left the Extropian movement to begin my life (fwd)

Eugene Leitl (eugene@liposome.genebee.msu.su)
Sun, 1 Mar 1998 18:14:40 +0300 (MSK)


[ forwards gone fishing ]
----- Forwarded message from Han -----

From hhuang@MIT.EDU Fri Feb 27 02:58:20 1998
Date: Thu, 26 Feb 1998 11:44:08 -0500
Message-Id: <9802261644.AA10217@lockpick.MIT.EDU>
From: Han <hhuang@MIT.EDU>
To: extropians-announce@MIT.EDU, extropians-discussion@MIT.EDU,
extropians-moderated@MIT.EDU
Subject: on reg day, spring 98, I left the Extropian movement to begin my life

[ Please direct replies to extropians-moderated@mit.edu. ]

I may as well announce it now. 3 weeks ago, I stepped down
from the Extropians club that I cofounded with Jason Davis to finally,
at last, give science my all.

Why don't I just forward the two emails I sent to the
extropians@mit.edu list? They say everything I want to say, and more.

And with that, this club, the still-derecognized MIT
Extropians, passes to whatever form it chooses to evolve into.
I'll still forward useful information. But whoever wishes to lead --
our lists, next year's pamphlet, our group, its guiding philosophy --
the door is wide open for you to do so.

And so, I wish you all the best -- that you may achieve what
you set out to attain.

-Han Huang
Co-Founder and Alumnus, MIT Extropians
hhuang@mit.edu
http://mit.edu/hhuang/www
http://extropians.mit.edu

*--Forwarded message follows-----------------------------------------*

There's a reason for our 2-list breakup. It was a necessary
bifurcation. I mean this literally, as in a second-order phase
transition with two distinct phases. If time permitted, I could make
the difference evident and sharp, but I've been too busy to even post
my "stepping down" email.

As for -discussion, I happen to think the topics are
legitimate -- but still...well...sigh...off the mark in many
different ways.

Keep your eyes out for what happens in "large" intellectual
discussions. Right after our first Extropian gathering, I told Jason
that it's a deep fact that large forums left to themselves inevitably
default to talking about:

-politics
-economics
-religion

And lo and behold, left to itself, that is exactly what goes
through -discussion these days. These are *fine* topics, but I think
there are ultimately more rewarding topics. But maybe I'm putting the
cart before the horse. After all, I spent my years on politics,
economics, religion, was a Counterpoint Editor-in-Chief, cofounder of
SAFE (MIT's free speech group), the libertarians email list, and
Extropians. So who am I to talk?

These issues are the "contentious" issues involving large,
often highly organized power structures, paradigms, morality. I
was always known for asking people the same 3 questions:

1) are you religious? if not, what's your epistemology?
2) do you believe in immortality for humanity? are you personally
interested in it? why or why not?
3) what are your political views?

I file away all these answers. Ally or non-ally, that's one
of my first classifications. Folks, like it or not, life is
contentious and will remain so.

Well, I guess there needs to be a forum for politics,
religion, and economics, sprinkled with some immortality and science
talk. I only wish people would spontaneously discuss *these* issues:

1) what drives you, what do you want above all else
2) what could help you, what's holding you back from it
3) what are you doing about it, are you happy about what you're doing

I had hoped our Extropians pamphlet would point the way to
what is important. By its rather unusual nature, and the fact
that people *obviously* felt it was important and worth responding to
-- I hoped people would learn. But our list is dominated by a lot of
intellectual followers.

I want to understand the universe and change the world. After
much thought, I decided both are best served if I pursue science and
put activism on the backburner. It will always be there, and I will
always have the opportunity. But later. And I would leave changing
the world to others if I could.

In the meantime, I am taking: 2.152 (nonlinear control), 7.05
(biochem), 8.059 (quantum 3), 8.08 (stat mech 2), 8.334 (grad stat
mech 2), 9.371 (differential geometry), 18.702 (algebra 2). Even with
that, I spend 1/3 of my time during the week learning and thinking
about material outside my classes. I work 75-85 hour weeks, about 2/3
of which I consider useful work. Of that, maybe 1/4 is really
insightful learning. But you know, even if you plod, you're still
soaking up information.

There is so much I want to understand! I wish I could do
everything, learn everything. I can't, dammit, I'm strapped by my
body, mind, mortality. Damn it, curse it, but restrained I am, most
of all by my willpower.

So how well have I been doing? Fairly well. My problem sets
have been decent. It takes getting used to, but you can learn to pump
5-7 out per week. What I really regret is every hour I waste not
reflecting more deeply about the science, not making the insights or
connections that really bring clarity to what I'm doing. I resent the
hours when I'm taken away from one subject to work on another. Right
now, I'm stretched thin, and need to scale down my projects. This
means I need to focus on *only* classes for the next week, rather than
try to understand measure theory, etc. :)

My life began 3 weeks ago. Until then, I wasn't living. And
I knew it.

But now it's for real, and god, it's so exciting, so scary.
What if it's all for nothing? What if knowledge is just a lot of
pretty theory? What if the greatest science needs breakthroughs
(especially mind science) that won't happen in my life? What if I'm
just a nothing? I already know I can be a great pedagogue, a great
teacher -- but what about research? Scientists aren't remembered for
pedagogy. What if I'm 40 and I know the reason I failed was because
of a lack of will -- how could I live with myself then? Or if I'm in
the wrong field, and wasted 20 years on an illusion or lesser field?
What if I should have done something else? Writing? Activism?

Every person must go through this and come to grips with it
alone. Ultimately nobody can possibly help you.

So take heart, yon Extropians, the struggles are real - but
you wouldn't want to live any other way.

-Han

*--Forwarded message follows-----------------------------------------*

Date: Thu, 26 Feb 1998 11:01:48 -0500
Message-Id: <9802261601.AA09925@lockpick.MIT.EDU>
From: Han <hhuang@MIT.EDU>
To: extropians@MIT.EDU
Subject: inspiration, motivation, major achievements in science and engineering

[ This email is a mouthful, but hey, I put some good stuff in it, so
I hope you get something out of it. ]

Thanks for the responses, guys. Two people emailed me already
on the main list asking what happened 3 weeks ago. :)

Actually, Ed, I did mean a second-order phase transition. :)
-moderated and -discussion would slowly differentiate, and the system
would undergo "spontaneous symmetry breaking" because I'd only send
good stuff to -moderated. But hey, first-order (discontinuous) works,
too.

Funny, "phase transition" is not found in the index of
Strogatz's nonlinear dynamics text. And not once in 8.334 (which Ed
and I are taking) did I hear anyone say a phase transition is a
bifurcation in the Landau free energy Psi(m) function.
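
For anyone who hasn't seen it written down, the textbook
Landau form makes the point concrete (a sketch of the standard
expansion, not the notation of any particular class):

    \Psi(m) \approx \Psi_0 + a\,(T - T_c)\, m^2 + b\, m^4,
        \qquad a, b > 0

    \frac{\partial \Psi}{\partial m} = 0 \;\Rightarrow\;
    m = 0 \quad \text{or} \quad m = \pm\sqrt{\frac{a\,(T_c - T)}{2b}}

Above T_c the only minimum is m = 0; below T_c it splits into two
symmetric minima. That splitting is a pitchfork bifurcation -- which
is exactly the "spontaneous symmetry breaking" above.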

Jason asks: Who's Landau? Geez, Jason. :) All I've been
learning in the first 3 weeks of 8.059 is Landau levels in the
presence of a magnetic field, and in 8.334 the Landau free energy
function. :) I'm not exaggerating. Landau probably contributed more
to mean field theory, and thereby to modern statistical mechanics'
critical phenomena and condensed matter physics, than any other
physicist. But Ed was referring to Landau's pedagogy. Here, I'll
quote from Gasiorowicz, p. 503: "The [Quantum Mechanics] book by
Landau and Lifschitz is one of a series of superb books covering all
of theoretical physics." It's the only series of its kind. Granted,
other physicists also left a solid written legacy: Feynman's justly
famous _Feynman Lectures_ and several other textbooks (among them
stat mech and QED), and even lectures on computation and gravitation.
Steven Weinberg wrote textbooks on quantum field theory and general
relativity, the two as yet irreconcilable theories in physics, texts
used in MIT's grad classes.

I've been going off on this to illustrate a point about how
fame is distributed in science, and to ask how important it really
is. You can't ignore these issues. In my physics classes of 1998,
probability says some person sitting in them is eventually going to
win the Nobel Prize. Who is he? How does he learn or do research
differently from anybody else? And if I were to take enough biology
classes, I could say the same, more by the sheer number of people
taking those classes, and the fact that MIT is #1 in molecular
biology. These things matter. We work for idealism, learning,
knowledge, but don't tell me you chose your scientific or engineering
applications without a thought of personal ambition.

Landau was an excellent case in point. "The great Russian
physicist Lev Landau," as my excellent 8.059 lecture notes say, but
how many people outside physics know this? Similarly, Murray
Gell-Mann. We didn't even mention names like Murray Gell-Mann in our
Extropian pamphlet. Yet, even though particle physics is one of those
highly competitive endeavors where a lot of people are working at any
given time, Gell-Mann is still the man generally given the most
credit for quarks, symmetry, and renormalization theory. This is
huge!

And quite unlike what a lot of people say about genius in
their twenties, both of these men did a lot of their great work past
their 30's. Feynman: QED by age 31 (path integrals, Feynman
diagrams), liquid helium and superfluidity in his mid-30's, weak
interaction/beta decay (which he thought was his greatest work) by
39. In his mid-40's (early 1960's), he did the Feynman Lectures.
Einstein's general relativity: mid 30's.

What does this mean? It means these men were generally doing
their greatest work a full decade or longer after they first started
taking physics classes for real. This is far from the standard myth
of young genius. Folks, it only happens in math, where you have
prodigies, and even then, many a mathematician's greatest work is done
in the years of his full maturity. Norbert Wiener was a child
prodigy, yes, but his greatest opus (which made its way into
Cybernetics) was done when he was sporting a great beard.

(I did an overview this January of famous composers. Brahms
is the only composer who finished a powerful, Promethean work when he
was 25: his First Piano Concerto (my theme piece). All other
composers did their great work in their 30's or later. Mozart wrote
nothing of major consequence until he was 29 (the 20th piano
concerto). All his great operas came in his early 30's, until he died
at 35 without completing his Requiem (arguably his greatest non-opera
piece). Quite a contrast from the Mozart child-prodigy myth spawned
by _Amadeus_, eh? Beethoven finished his 3rd, 5th, and 9th at 34, 38,
and 53 respectively. Mahler wrote his greatest symphonies, the 5th
and 6th, in his early 40's.)

Again, the lesson is clear. Ramanujan and other child-prodigy
stories are rare. People often do their greatest creative work years,
often 10 or more, after their first training. It is true in
composition, and it is often very true in writing (but not always;
Asimov's and Orson Scott Card's most famous novels were finished for
the most part by their late 20's -- but hey, that's almost 10 years,
isn't it?)

Back to people who aren't given their proper due...

Arthur Kornberg of biology. DNA Polymerase 1 and 3, RNA
transcriptase. Watson and Crick discovered the double helix
structure, fine, but Kornberg isolated the major enzymes that
replicate DNA and transcribe RNA, surely a greater overall
contribution than Watson and Crick's. From what I can tell in 7.05,
he strikes me as the greatest biochemist/molecular biologist of his
time. Double helix structure, fine...so we know the shape of DNA.
Important, yes. But the runaway success of molecular biology is
entirely due to the amazing enzymes. Much if not most of molecular
biology, with its gels and whatnot, is basically throwing enzymes at
DNA and RNA. DNA is just the code; it's the enzymes
that are the machinery that make life work, and those machines are so
sophisticated it makes me want to hurl every time I see the word
"nanotechnology." All the "telomerase" and other stuff you're hearing
about: enzymes. How many people even recognize the name? Arthur
Kornberg. Did his Nobel-winning work in his late 30's. (There are at
least a half dozen other molecular biologists I'm not giving credit
to.)

Hodgkin-Huxley: mathematical model for action potentials in
the nerve. It's the *only* successful neurocomputational model so far
after all these decades, and the *only* successful mathematical model
in biology (for which the two won their Nobel Prize). Theirs
was a lifetime of dedicated, painstaking research, and the model for
what neurocomputation should aspire to be.
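
For concreteness, here is the core of the model in its
standard textbook form (a sketch; the fitted conductances and rate
functions are omitted):

    C_m \frac{dV}{dt} = I_{ext}
        - \bar{g}_{Na}\, m^3 h\, (V - E_{Na})
        - \bar{g}_{K}\, n^4 (V - E_K)
        - \bar{g}_L\, (V - E_L)

    \frac{dx}{dt} = \alpha_x(V)(1 - x) - \beta_x(V)\, x,
        \qquad x \in \{m, h, n\}

Four coupled nonlinear ODEs, with every term fit to painstaking
voltage-clamp measurements on the squid giant axon -- which is a big
part of why it has held up.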

John Bardeen, the only man to have won two Nobel Prizes in
physics, one for the transistor effect, the other for
superconductivity. Damn! That's all I need to say about it.

Steve Smale and other less well-known mathematicians of chaos
theory and differential equations (which is what the field used to be
called before "nonlinear dynamics and chaos" became a big deal).
There are at least two "schools" of dynamical systems, and Smale
started one of them by tying topology to dynamics. It is a very strange fact
that MIT has, to the best of my knowledge, never offered a course in
topological dynamics (!). But then, the story of dynamics is still
very wide open and much remains to be written...

I'll leave with one final topic: estimation, information,
systems, dynamics, control, optimization, operations research. These
fields are so broad and diffuse and interrelated it's hard to
appreciate them in their entirety. Gauss in the 1800's invented
deterministic least squares. Fisher in the 1910's invented maximum
likelihood estimation. Wiener in the 1940's and for the rest of his
life: Wiener filters, and other stuff that made its way into
stochastics. Kalman in the 1960's with the Kalman filter. Kolmogorov
in the '60's with a host of contributions to probability. Shannon, of
course, with information theory. Lyapunov in the 19th century, and
other work later in the '60s (LaSalle, Barbalat, others). A lot of
what made MIT's Course 6 "Area One" famous: the LQG (linear quadratic
Gaussian) formulation that Athans and others are famous for, the
canonical Oppenheim & Willsky textbook on signals and systems, and
Oppenheim and Schafer's reference textbook on DSP, both of which,
needless to say, are enormously important industrially.

It's difficult to estimate just how important these fields
are, and therefore how to judge the contributions each of these men
made to humanity. I'll settle on only one example and quote from the
at-times annoying Bart Kosko's _Fuzzy Thinking_ (fuzzy logic) book:

"Pound for pound and bit for bit the Kalman filter belongs
with the light bulb and the radio and the microprocessor in its
engineering benefits for mankind. In many ways it goes beyond these
since it is just a set of equations. Like all math the Kalman filter
is eternal... Yet, most people have never heard of the Kalman filter.
It guides airplanes and space probes and cruise missiles and tracks
satellites and economic trends and changes in your bloodstream. ...
Did you ever wonder how a missile finds its target or how the
astronauts find their way home? The Kalman filter shows them
how. ... Thousands of engineers have published small tweaks and
variants of the Kalman filter."

No kidding! I spent the summer before last running Matlab
simulations on, yes, you guessed it, a Kalman filter. Jordan (MIT's
neural networks prof) has said similar things about the Kalman filter
and the system/information/control variants upon it: that it flies
our airplanes, including commercial jets, controls our robots and
spacecraft, fine-tunes nuclear power plants and chemical plants,
steers our ships, and filters our satellite signals.

Most of Wall Street and almost all of the world's financial
traffic is handled by computer algorithms, most of them in the spirit
of the Kalman filter or recursive estimation in general. If/when I
become a machine intelligence, all my sensors will be run by Kalman
filter-like control mechanisms.
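
To make "recursive estimation" concrete, here's a minimal
scalar Kalman filter in Python -- a sketch only; the random-walk
model, the noise variances, and the toy "true value" are my own
illustrative assumptions, not the Matlab work I mentioned above:

    # One predict/update cycle for the model x_k = x_{k-1} + w (variance q),
    # observed through z_k = x_k + v (variance r).
    import random

    def kalman_step(x_est, p_est, z, q=1e-3, r=0.09):
        # Predict: the state is modeled as (nearly) constant, so only
        # the uncertainty grows.
        x_pred = x_est
        p_pred = p_est + q
        # Update: the gain k says how much to trust the new measurement z.
        k = p_pred / (p_pred + r)
        x_new = x_pred + k * (z - x_pred)   # correct the prediction
        p_new = (1.0 - k) * p_pred          # uncertainty shrinks after update
        return x_new, p_new

    # Track a constant signal (true value 5.0) through noisy readings.
    x_est, p_est = 0.0, 1.0
    for _ in range(100):
        z = 5.0 + random.gauss(0.0, 0.3)    # noisy sensor, std dev 0.3
        x_est, p_est = kalman_step(x_est, p_est, z)
    print(round(x_est, 2))                  # settles near 5.0

Every real application adds a state-transition model, vectors, and
matrices, but the predict/correct loop above is the whole idea.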

The field of optimization and control makes me wonder at times
if machine learning is what I should do instead of neurocomputation.
I'm learning in my control class that in the last 10 years or so
nonlinear adaptive control has really come into its own.

For example, robots can now do parameter estimation of their
environment (say, when one wants to swing a bat optimally),
estimating 15 or more parameters in under 0.1 seconds! A robot picks
up an object, and before it has moved it more than a centimeter, its
task-driven learning has managed to figure out all it needs to. But
this isn't that surprising: a military jet fighter has to meet such
performance criteria as well. Some of those jets would spontaneously
explode if it weren't for sophisticated control mechanisms.
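
The workhorse behind that kind of on-line identification is
recursive least squares. Here's a toy sketch in Python; the
three-parameter linear system, noise level, and forgetting factor are
made-up values for illustration, not the robot example from class:

    # Recursive least squares: estimate unknown parameters theta from a
    # stream of (regressor, output) pairs, one cheap update per sample.
    import numpy as np

    def rls_update(theta, P, phi, y, lam=0.99):
        # theta: (n,1) estimate, P: (n,n) covariance, phi: (n,1) regressor,
        # y: measured output, lam: forgetting factor (discounts old data).
        k = P @ phi / (lam + (phi.T @ P @ phi).item())   # gain vector
        err = y - (phi.T @ theta).item()                 # prediction error
        theta = theta + k * err                          # correct the estimate
        P = (P - k @ phi.T @ P) / lam                    # shrink the covariance
        return theta, P

    rng = np.random.default_rng(0)
    true_theta = np.array([[2.0], [-1.0], [0.5]])   # what must be learned
    theta, P = np.zeros((3, 1)), np.eye(3) * 100.0  # start very uncertain
    for _ in range(200):
        phi = rng.normal(size=(3, 1))               # this sample's regressor
        y = (phi.T @ true_theta).item() + rng.normal(scale=0.01)
        theta, P = rls_update(theta, P, phi, y)
    print(theta.ravel())                            # close to [2.0, -1.0, 0.5]

A few hundred updates like this per second is exactly how an adaptive
controller can pin down a dozen-odd parameters in a fraction of a
second.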

Keep in mind much of the linear or linearization work was done
in the '60s, much of it at MIT. This is not the sexiest of fields,
but it's important work. In case you didn't know, most of MIT's
prestige is based upon the prominent position its electrical
engineering, signal processing, and control department held after
WWII. Radar first, and then the entire electrical engineering (and,
to a lesser extent, aerospace) collaboration with the giant
military-industrial complex. Wiener's _Cybernetics_ was the most
famous statement of MIT's theoretical contributions to these fields.
_Cybernetics_: "Seminal books...comparable in ultimate importance
to...Galileo or Malthus or Rousseau or Mill" (_New York Times_).

To show how unappreciated these fields are, Francis Crick
recently published a journal article arguing that the amplifier model
might be really important for neurons, but that the gain of our
neural amplifiers must be low, or otherwise we'd go haywire. The man
has obviously not taken 6.003 -- he doesn't know the first thing
about feedback control!
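
(The one-line version of the 6.003 point, i.e. the standard
negative-feedback result: wrap an amplifier with open-loop gain A in
feedback \beta, and the closed-loop gain is

    G = \frac{A}{1 + A\beta} \;\approx\; \frac{1}{\beta}
        \qquad \text{for } A\beta \gg 1,

so a huge open-loop gain doesn't make the system run away -- the
feedback sets the behavior.)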

CONCLUSION:

Anyhow, I kind of got sidetracked by my examples. But back to
motivations, ultimate goals. I basically want to understand the
universe, whatever is most fascinating, and I'm blessed/cursed enough
to find a great many things really nifty.

But I am going to die, and that fact really forced me (and
Jason) to think hard about this. The Extropians pamphlet and whatever
followed were an outcome of this angst. It's taken me through a hell
of a half year, but now I'm back to doing science. And why? For a lot
of reasons.

As for how ambitious I am, and as for the issue of pedagogy
and how much it contributes to humanity, well, at the least, I will be
a good professor. If I'm no good at research, or if for some reason
I'm disenchanted by science, I will write textbooks and maybe activist
screeds as well. It takes, what, 3 years to write a really good
textbook? Maybe up to 5 for the hardest fields. Well, I've got a lot
of years left in me before I kick the bucket. :) I hope the
biologists keep making breakthroughs in life extension.

But wait on that promise...textbooks are usually written by
people in their 40's or later, and there's a reason for that.

Dammit, a third of my waking hours, I'm cursing myself out for
being fucking slow or lazy. But others went through the same thing...
And some of them lived significant, excellent lives...so this shall
content me, too. :)

-Han

----- End of forwarded message from Han -----