Re: What is to be done?

The Low Golden Willow
Mon, 24 Mar 1997 21:22:27 -0800 (PST)

On Mar 24, 3:54pm, Robin Hanson wrote:

} philosophers). Not that the conclusions don't follow from the axioms,
} but that most people don't feel very compelled to accept the axioms.
} Thinking that these axioms will persuade very many people also seems a
} sophomoric cognitive blunder, and does not encourage people to take the
} other extreme views of these folks seriously.

"Extraordinary claims require extraordinary proof." The skeptical
mantra. To most people, many of our claims are extraordinary.
Sometimes we can offer deductive arguments for our positions; other
times we have less support than that. But even then we are opposing one
deductive chain of reasoning to several inductive chains of experience.
It is reasonable (i.e., it generally works) to assume that our deductive
proof is either wrong somewhere or, if valid, is ignoring some
assumption. That skeptics can't easily identify the flaw isn't
important; the fact that they do have inductive experience countering
our argument strongly suggests that there is a missing factor.

Therefore to get anywhere, particularly in libertarian and
anarcho-capitalist debate, we must not only show how our envisioned
society could work, we must explain why current society is the way it is
and why any exceptions our opponents will come up with don't apply.
Preferably before they come up with the objections, to forestall claims
of "special pleading" -- claims which will be made whether or not they
are fair.

This is why David Friedman excites me: he actually tries to take the
current world (and the people in it) seriously, and is thus more
convincing than J. Random Net-libertarian. Robin does too; it seems to
be an application of economist training to libertarian interests.

(Was it this list or the transhumans -- Rich Artym? -- where the utility
and quality of the social sciences were challenged?)

} Here I also have to agree with the critics, at least when they refer
} to people who expect remarkably rapid progress toward transhuman
} technologies in the next few decades. This stuff will take a while to

If you mean that a Singularity by 2035 (or 2015!) seems extreme, I'd
agree with you. Specifying much else on a multi-decadal scale is more
than I'd want to do.

} I have heard a related criticism that anyone who would especially want
} to become transhuman is excessively selfish. And I think such people
} *are* more selfish than average, but "excessive" is harder to judge.

More selfish, or more openly selfish?

} This highlights another core issue: to what degree does social pressure
} function to stabilize our social order? Economists like me, who tend
} to ignore social pressure in our models, tend to think that the effect
} is minor. But others see social pressure which keeps people acting

This still bothers me. Religion is bloody real; how do economists
account for it? That it's a veneer under which people pursue their
rational goals anyway? I'm willing to grant that this may be true to a
larger extent than I'd easily imagine on my own. Still, when I read in
_Hidden Order_ the note that economics assumes rational behavior coupled
with relatively simple goals, I wondered whether some 'social' goal like
"seeking status" or "being 'with' other humans" or something shouldn't
be factored into more speculative models.

Good post, though.

Merry part,
-xx- Damien R. Sullivan X-) <*>

Life: an arms race to entropy.