I get the impression there's some wonderment as to why I'm raising the issue.
While I understand that the ordinary way of doing things is to have
arguments on every possible side of an issue, and multiple organizations
with conflicting goals - a situation that some may even associate with
things like "the free exchange of ideas" - I thought I should take at
least a shot at doing what a transhuman would do in this situation:
getting a complete consensus from everyone on the correct course of
action. I may
not be a transhuman, but we're all supposedly rational people here, and
it seems to me that the choice before us is not a matter of decision
but of perception. There is one unambiguous set of correct actions
under all three major supergoals, and I want to at least take a shot at
getting a consensus on that from all major Extropians and organizations.
In particular, my argument is that the probability of a Sysop Mind
working is at least 30%, while the probability of uploading working for
any given person in the absence of a Sysop, even given the technology,
is no more than 0.1%. den Otter and sayke have had a lot to say about
this, about the glory of individuality and the evil of Big Brother and
their reluctance to give up control, but none of it seems to address the
basic fact that 30% is more than 0.1%. If you open up box A, you have a
30% chance of survival. If you open up box B, you have a 0.1% chance
of survival. Those are the facts. Everything else is just a side effect,
a context-dependent heuristic. Forget the instincts. Forget the
emotions. Forget the social algorithms. Think like an AI. Which is
more: 0.3 or 0.001? That's it. That's all. Everything else is
irrelevant. You don't need to look inside those black boxes to see the
answer. You just need to know the numbers.
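If it helps to see the comparison stated mechanically, here is a minimal
sketch in Python. The two probability estimates are the ones asserted
above; the variable names are mine and purely illustrative:

    # Minimal sketch of the decision above. The estimates (0.30 for the
    # Sysop path, 0.001 for uploading without a Sysop) are the figures
    # asserted in this post; the variable names are illustrative only.
    p_sysop = 0.30         # chance of survival if the Sysop Mind works
    p_solo_upload = 0.001  # chance of survival uploading with no Sysop

    # The entire argument reduces to a single comparison.
    choice = "Sysop" if p_sysop > p_solo_upload else "solo upload"
    print(choice, p_sysop, p_solo_upload)  # -> Sysop 0.3 0.001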
--
sentience@pobox.com    Eliezer S. Yudkowsky
http://pobox.com/~sentience/beyond.html
Member, Extropy Institute
Senior Associate, Foresight Institute