Economics, posthumanity, and self-replication (was: MORALITY)

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Fri Nov 09 2001 - 06:07:48 MST


Anders Sandberg wrote:
>
> I'm not discussing the posthuman world here, but the world of tomorrow.
> While people do worry about scenarios where superintelligent grey goo
> from the stars eats humanity, the most powerful impact comes from the
> idea that within a few years you *must* get yourself a better brain in
> order to avoid becoming poor and powerless. That slots neatly into a
> lot of political pre-programming and gives us a ready-made, organized
> political resistance to our projects. To make things worse, most
> people are used to ideologies that prescribe the same behavior for all
> humans, and do not accept the idea that other humans may have
> different goals.

The real answer here is simply "Are these technologies self-replicating?"
Non-self-replicating technologies will likely be expensive and limited to
the First World, at least at first. Self-replicating technologies are not
expensive unless an enforceable patent exists. Genetic engineering is
self-replicating but patented. Software is self-replicating but
unenforceably copyrighted. Nanotechnology is self-replicating at
sufficiently advanced levels. Intelligence enhancement technology is
"self-replicating" if there's even one philanthropist among the
enhanced, since that one person can turn enhanced ability toward
enhancing others, although it may take a while for that process to
complete. And the nanotech-plus-Friendly-SI scenario is not only
self-replicating; it bypasses the existing economic system completely.
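
As a back-of-the-envelope illustration of that last point about
philanthropists, here's a toy diffusion model; every number in it (one
initial enhancee, ten percent of the enhanced acting as
philanthropists, each enhancing two people per year) is an assumption
invented for illustration, not anything claimed in this thread:

# Toy sketch: how fast does enhancement spread if some fraction of
# the enhanced spend their time enhancing others? All parameters
# below are invented assumptions.

def years_to_saturate(population, initial_enhanced=1,
                      philanthropist_fraction=0.1, enhanced_per_year=2):
    """Years until the whole population is enhanced, given that a
    fraction of the enhanced each enhance a few more people per year."""
    enhanced = initial_enhanced
    years = 0
    while enhanced < population:
        givers = enhanced * philanthropist_fraction
        enhanced = min(population, enhanced + givers * enhanced_per_year)
        years += 1
    return years

# One enhanced philanthropist in a world of six billion:
print(years_to_saturate(6_000_000_000))  # -> 124 (20% effective annual growth)

The point survives a wide range of parameters: exponential diffusion
from a single philanthropist does eventually reach everyone, but under
modest assumptions "a while" means on the order of a century, not a
few years.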

I think that's the counterslogan: "Posthumanity will bypass the existing
economic system and offer everyone the same opportunities." Then you have
to argue it. But it makes a good opening line.

-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence


