From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Fri Apr 25 2003 - 14:50:11 MDT
Robert J. Bradbury wrote:
>
> There aren't any "threats" to a distributed replicated intelligence
> (like a Matrioshka Brain) except perhaps a black hole. The only
> weapon they would care about would be inertial weapons -- someone
> throwing a *lot* of matter at them. But it requires one *huge*
> amount of matter to deal with something that masses 10^25 to 10^27 kg
> and is distributed over a volume of space from 30 AU to 3 light years
> in diameter.
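
(A quick back-of-envelope sketch, in Python, of the dispersion Bradbury is pointing at. The specific mass, shell radius, and projectile cross-section below are illustrative assumptions drawn from the ranges he quotes, not figures from the thread; the point is only to show how little of the structure any single kinetic projectile could intersect.)

import math

AU = 1.496e11                     # metres per astronomical unit
mass = 1e26                       # kg, mid-range of Bradbury's 10^25..10^27 (assumed)
radius = 15 * AU                  # 30 AU diameter -> 15 AU radius (assumed inner shell)

shell_area = 4 * math.pi * radius**2        # m^2, sphere at 15 AU
column_density = mass / shell_area          # kg of structure per m^2 of sky

projectile_cross_section = 1e6              # m^2, a 1 km^2 impactor (assumed)
mass_intersected = column_density * projectile_cross_section

print(f"column density           : {column_density:.2f} kg/m^2")
print(f"mass in one 1 km^2 column: {mass_intersected:.2e} kg")
print(f"fraction of total struck : {mass_intersected / mass:.1e}")

With these numbers the column density comes out to roughly 1.6 kg/m^2, so even a kilometre-scale projectile sweeps through only ~10^-20 of the total mass -- which is the scale argument behind "one *huge* amount of matter."
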
What about a thunderbolt singularity?
--
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence