Eliezer S. Yudkowsky writes:
>
> As far as I can see, this totally fails to address my original question. I
That's quite true, because I haven't yet had time to address the
original question. I already spend too many hours on mail.
> just want to know what you would consider a 'win' scenario. When
> nanotechnology, AI, and uploading are all outlawed, how exactly do you 'pass
> the tight spot' and where are you after you've passed it?
I'm not interested in outlawing nanotechnology, AI or uploading. I'm
sorry you misinterpreted what I've said.
I'm interested in *temporarily* imposing enforceable regulations on:

1) engineering of biological pathogens suitable for warfare and
terrorism;

2) potentially forthcoming devices based on potentially forthcoming
new physics which could potentially destroy the Earth or this
spacetime (fat chance; this is more to keep the list complete than
anything);

3) free-environment-capable self-replicating machine-phase
mechanosynthetic systems (this also includes space, since you want
the solar system pristine and not full of nasty grey goo which can
eat through a hull or suit in minutes);

4) AI of about and beyond human capacity. Insect-, rodent- and even
chimp-grade AI is okay, and indeed welcome, but it must be restricted
in the computational resources available to it, not allowed to fool
with its own works, and kept from tampering by people via
tamper-proof packaging. Maybe not an electromagnetic gun, but a
thermite shell.

This might result in restrictions on the amount of computational power
available to individuals outside of restricted research installations.
Once again, this is temporary, and adaptively negotiable, since a
threat needs to be reevaluated in the face of new information.
Potentially, we're getting our panties in a bunch over nothing, but
there's no way to tell a priori.
I do not want to impose restrictions on the upload *technology*, quite
the opposite. (In fact, I would make it, along with developing the
computronium substrate, the top research priority if I had any
political power). The only restriction which should hold is which
*people* to upload *first*. We need to make sure that these people
make self-amplification and self-enhancement a very, very low
priority. This means that, similar to astronauts, these people must
be carefully selected, since they're going to help us with uploading
the rest. The strategy should be to haul over as many people as we
can, as quickly as we can, with the very best fidelity we can.
Preferably for free, which means that the costs must fall several
orders of magnitude, because the first uploads will be *expensive*.
Obviously, during the conversion process the rights of the person
being uploaded need to be restricted, until the conversion is
complete, to prevent a runaway resulting in instant Singularity. (If
this is not safely implementable, we must drop this requirement.
These restriction things tend to backfire nastily, so we must trim
them to the barest essentials).
After the exodus is complete but for those recalcitrant curmudgeons
who absolutely, positively don't want to, and who deal with all
negotiators using firearms as arguments, all restrictions are lifted,
and off we go. No one knows what will happen next, but I definitely
do not want to be in the shoes of those who are left behind, because
through the decision to stay they've rendered themselves extremely
powerless and vulnerable. For instance, if the (already foreseeable)
need arises to disassemble the Earth, they're then left absolutely at
the mercy of the disassemblers. Depending on the disassemblers'
ethics and state of development, and on whether others will intervene
(my current self certainly would want to), they may or may not upload
them by force, or may consider them just an unremarkable part of the
planetary surface resources. (Well, okay, maybe I am evil).
My current self would want to team up with a group of competent,
like-minded people, leave the Earth's surface as soon as possible
(unless the means of travel are unavailable, or impose a critical
delay), and head for the Moon or any suitable nearby space rock. I
would not be interested in exponential self-amplification, but
certainly in self-improvement, anticipating the impending need to
compete with other streamlined people and beings. Clearly, my current
self can't speak for my future self any more than my one-year-old
self could speak for the adult me. Your mileage may vary.