J. R. Molloy writes:
> The question posed by this thread is whether the BCT *should* be
> done. The implied supposition is that if the BCT (AI/SI) might become
> a menace and/or destroy all human life, it would be better to devise
> methods of containing it (or forestalling it) before it is actually
> constructed or evolved through genetic programming.
Notice that Eliezer does not want to use evolutionary algorithms,
probably because he (rightly) suspects that the result will not
necessarily be friendly to mehums; in fact, it is extremely unlikely
to be friendly.
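To make the worry concrete, here is a minimal sketch (in Python; the
fitness function and every parameter are invented for illustration,
not anything Eliezer proposed) of why an evolved optimizer is
indifferent to whatever the fitness function does not measure:
selection rewards only the scored objective, so an unscored trait like
friendliness is free to drift anywhere.

    # Toy genetic algorithm: selection sees only "capability";
    # "friendliness" is an unscored trait and simply drifts.
    import random

    random.seed(0)
    POP, GENS, MUT = 50, 200, 0.1

    # Each genome is (capability, friendliness); both start at 0.
    pop = [(0.0, 0.0) for _ in range(POP)]

    for gen in range(GENS):
        # Fitness = capability alone; friendliness never enters the score.
        pop.sort(key=lambda g: g[0], reverse=True)
        parents = pop[: POP // 2]                  # truncation selection
        pop = [(cap + random.gauss(0, MUT),        # capability is pushed up
                fr + random.gauss(0, MUT))         # friendliness random-walks
               for cap, fr in parents * 2]         # two offspring per parent

    best_cap = max(g[0] for g in pop)
    mean_fr = sum(g[1] for g in pop) / POP
    print(f"capability ~{best_cap:.1f}, friendliness ~{mean_fr:.2f}")

The toy numbers do not matter; the structure does. Any trait that is
not in the fitness function is invisible to selection, and there are
vastly more ways to be capable than to be capable *and* friendly.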
His variant is supposed to work (if I get him right) by engineering
the boundary conditions so that the thing transcends in a very
controlled way and falls into a metastable regime. It is metastable by
virtue of getting there first (which the extremely rapid dynamics of
positive autofeedback all but guarantee), and it keeps everybody else
from transcending by occupying that niche, until some external or
internal force dislodges it. It becomes, in effect, a pet god.
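The "getting there first" part is just the arithmetic of runaway
feedback, if one reads "positive autofeedback" as superlinear (my
reading, not necessarily Eliezer's). A minimal sketch (Python; the
growth law, head start, and threshold are all invented numbers): each
system's rate of self-improvement grows with its own capability, so a
modest head start becomes a decisive one.

    # Toy hard-takeoff race: improvement rate grows with capability
    # itself (positive autofeedback), so the leader pulls away.
    def improve(level, k=0.01):
        return level + k * level * level   # superlinear self-improvement

    a, b = 1.05, 1.00                      # A has a 5% head start
    for step in range(1, 200):
        a, b = improve(a), improve(b)
        if a > 1000:                       # arbitrary "transcendence" mark
            print(f"A takes off at step {step}; B is still at {b:.1f}")
            break

Under merely linear feedback the ratio between the two would stay
constant forever; the winner-take-all character, and hence the
metastability of the niche, comes from the superlinearity.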
Of course, if you keep rewriting pieces of yourself in a strictly
Lamarckian fashion, Darwin really, really wants to come in through
every back door and hole you might have overlooked: any variation in
how well the rewritten copies persist or replicate is raw material for
selection, whatever the rewriter intends.
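A minimal sketch of that leak (Python again; every parameter is made
up): each agent applies only deliberate, goal-directed edits to
itself, but copying is imperfect, and variants that happen to
replicate better take over no matter what anyone intended.

    # Lamarckian edits vs. Darwinian leakage: agents deliberately
    # improve "goal" each step, but copy errors jitter "repl", and the
    # population composition ends up tracking repl, not goal.
    import random

    random.seed(1)
    POP = 100
    agents = [[1.0, 1.0] for _ in range(POP)]   # [goal, repl], all identical

    for step in range(300):
        nxt = []
        for goal, repl in agents:
            goal += 0.01                        # the intended, Lamarckian edit
            # higher repl means a chance of an extra copy (the back door)
            copies = 2 if random.random() < 0.05 * repl else 1
            for _ in range(copies):
                nxt.append([goal, max(0.0, repl + random.gauss(0, 0.05))])
        # the niche holds only POP slots, filled at random: nobody
        # selects for repl on purpose, it wins by sheer numbers
        agents = random.sample(nxt, POP)

    mean_repl = sum(a[1] for a in agents) / POP
    print(f"mean replication rate drifted from 1.0 to {mean_repl:.2f}")

The deliberate edits march forward exactly as planned; what nobody
chose, differential copying, quietly decides what the population is
made of.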