Re: stuff that shouldn't exist into the next century

Eliezer S. Yudkowsky (sentience@pobox.com)
Tue, 14 Sep 1999 08:30:25 -0500

tim cavanaugh wrote:
>
> I'm writing a story for Wired magazine on
> items that should be discontinued in the
> 21st century - a "Leave It Behind" list.
> I'd be interested to know what people in
> this group believe should be abandoned
> for the next millennium. Can be ideas,
> institutions, countries, people,
> companies, religious beliefs, assumptions,
> or anything else you think is so bad,
> obsolete, irrelevant, dangerous or just
> annoying that the human race would be
> better off kissing it goodbye. Please
> take this opportunity to vent your
> rage against the past, the present, or
> the future. Thanks.

I'm considered something of an extremist, but I would have to say I'd favor designing out large parts of the human cognitive architecture and virtually all of human society. So to give a short list of things I'd like to see debugged:

Social bugs:
Corporations - replace with fluid contracts
Marketing - replace with collaborative filtering
Money - replace with automated barter
Countries
Private property - replace with a two-tiered transferable/birthright system
Authority and coercion
War

Biological bugs:
Old age
Pain - needs volume control
Disease
Eating - run off electrical power
Death
200 Hz neurons, 100 m/s axons
Genes
Carbon-based substrate - too flimsy, move to silicon

Cognitive bugs:
Hatred
Racism
Patriotism
Self-preservation instinct
Respect for authority
Existential ennui
Political/social intuitions - make conscious
Bipolar sexuality
Limited willpower
Self-awareness - needs augmentation and finer-grained architecture
Individuality - needs extensibility; telepathy and group minds
Stupidity

Children should be born with, and retain, around twice as many neurons. When possible, we need to link the human brain to a computational matrix that can support indefinite growth in cognitive capacity. The brains we have now are WAY too limited.

In fact, I'm leaning towards the proposition that we should design de novo AI cognitive architectures and then transfer the contents of our minds into them. It's sort of like the way transhumanists want to "upload" themselves into computers by transferring an active model of their neurology (and in fact you'd have to do that first), except working at a higher level of substrate.

What should humanity leave behind? Everything. Our genes, our bodies, our societies, our individuality, our cognitive architectures, and maybe even our personalities. The human era is over. Fifty thousand years of pain and death and stupidity; I say we bid it a DAMN GOOD RIDDANCE and make off for greener pastures.

-- 
           sentience@pobox.com          Eliezer S. Yudkowsky
        http://pobox.com/~sentience/tmol-faq/meaningoflife.html
Running on BeOS           Typing in Dvorak          Programming with Patterns
Voting for Libertarians   Heading for Singularity   There Is A Better Way