Ethical Groundings (was: Anders Sandberg's Value System)

Eliezer Yudkowsky (sentience@pobox.com)
Tue, 04 Feb 1997 20:34:33 -0600


Anders Sandberg views complex systems as a basic good.
This reminds me deeply of one ethical system I tried on for size, long
ago: "The basic good is new ideas."

The concept actually originated when I tried to figure out where mental
energy came from. I guessed "new ideas" on the basis that they would
stir up new thought-currents and create mental energy. In retrospect, I
was completely wrong; new ideas may create mental energy, but only by
originating new goals that haven't been worn down by repeated,
viewed-as-futile efforts, or perhaps simply because gaining new
knowledge is a basic goal and even an evolved warning signal for greater
awareness.

In any case, the more I thought about the "new ideas" goal, the more it
seemed appropriate as an actual Meaning-Of-Life candidate - in
retrospect, not an Interim, but an actual Meaning. Consider that
duplicated information is likely to be no more valuable than a single
copy, and that whatever the Meaning of Life is, it is likely to be viewable
on some level as a complex system or informational object. That is,
you'd imagine the Meaning was *something*, and duplicated *somethings*
would be no better than one *something*.

Unlike Anders Sandberg, I view an entire ecology as being essentially
valueless - because the complexity of an ecology is so much *less* than
the complexity of a single human brain that I'd happily destroy a
biosphere (one not supporting any sentient life, of course!) to save a
human life. Besides, ecologies aren't conscious, and whatever the
Meaning of Life is, I strongly suspect that it requires consciousness
(or some similar ontologically engineered substance) as substrate.
Therefore, increasing the number of new ideas held by sentients is the
likeliest course toward maximizing the Meaning. This was later replaced
by the Singularity on the grounds that knowing what the Meaning was
would be the way to maximize it, but anyway...

Consider the consequences of this ethical system. It urges
individualism, critical thinking, the advancement of science and
technology (and keeping up with its forefront), writing and reading
science fiction, the exploration of new frontiers, and above all the
invention of methods of intelligence enhancement. Very Extropian,
really - especially considering that this system was invented long
before I'd heard of Extropy, although not before reading "Great Mambo
Chicken" and being raised as a Libertarian.

So that's my proposal for a definition of "extropy" as a substance:
New ideas.

-- 
         sentience@pobox.com      Eliezer S. Yudkowsky
          http://tezcat.com/~eliezer/singularity.html
           http://tezcat.com/~eliezer/algernon.html
Disclaimer:  Unless otherwise specified, I'm not telling you
everything I think I know.