From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sun May 25 2003 - 03:34:24 MDT
Spike wrote:
>
> The more I think about it however, the more doubts I have
> about my own argument on that. Eliezer's notion has
> considerable power: any logical growing AI will naturally
> calculate its theoretical upper bounds, then scramble like hell
> to achieve that level. This would mean stopping the
> absurd waste of energy pouring out into cold dead space
> from the stars.
Not right away, I think. When everyone's still too young to need an
MBrain, I suppose you could leave the Sun on a few thousand years longer,
for the romance of it. But, yeah... it's a hell of a waste. Maybe if
there are unlimited resources Elsewhere, the stars would stay on.
--
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence