On Thu, 25 Nov 1999 email@example.com wrote:
> I'm not sure if his argument works if there is no such thing as Absolute
> Morality. In that case it seems that there is a risk that SIs will
> develop their own goals (just as we do) and that their actions will not
> be beneficial to the human race.
> The worst outcome would be if the SIs are programmed by Eliezer to have
> as their only goal the search for the Holy Grail, that is, Absolute
> Morality. However, smart as they are, they still haven't found it.
> They have to be smarter. And to do that they have to turn all available
> mass into SI computational elements, which means, regrettably, wiping out
> the human race.
We can use some assumptions and observations to put some constraints on this.
If we assume that current theories of physics (e.g. the big bang, nucleosynthesis, stellar evolution, etc.) are relatively correct, then there is "some" observational evidence bearing on whether SIs exist.
If SIs exist, and have existed long enough to have utilized our resources (probable), yet have not done so, then either: (a) they have no interest in doing so (benefit of omission); or (b) some moral principle guides them not to do so (benefit of commission).
So, at least currently, things seem to be going in our favor. If we can reach their level within 50 years, then at that point it is more probable that we become part of the community, or at least have a fighting chance of defending ourselves should they suddenly take an interest in our resource base.
If instead we assume that astrophysics needs a lot of adjustment, that there are no SIs, and that we are first, then we need only worry about our own failure to chart a proper course of development.
> ... Then, at the end of a millennia-long development effort
> that consumes half the galaxy and reaches realms of abstraction we can't
> begin to imagine, they finally decide that there is no Absolute Morality.
Consuming half the galaxy is going to take a minimum of 100,000 years, and probably an order of magnitude more, unless it is a concerted effort by a preexisting population of galaxy-enveloping SIs.
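A quick back-of-envelope check of that timescale (a sketch: the ~100,000 light-year Milky Way diameter is the standard figure, and the 0.1 c colonization speed is an assumed value for illustration, not anything claimed above):

```python
# Lower bounds on the time to sweep through half the galaxy.
# Assumptions: Milky Way diameter ~100,000 ly (standard figure);
# 0.1 c is a hypothetical "slow" colonization wavefront speed.

GALAXY_DIAMETER_LY = 100_000          # light-years, approximate
radius_ly = GALAXY_DIAMETER_LY / 2    # half the galaxy = ~50,000 ly

# Even at light speed, an expansion front needs the light-travel
# time just to cross that distance:
t_lightspeed_yr = radius_ly / 1.0     # years, at v = c
# At a slower (and arguably more plausible) 0.1 c:
t_slow_yr = radius_ly / 0.1           # years, at v = 0.1 c

print(t_lightspeed_yr)  # 50000.0
print(t_slow_yr)        # 500000.0
```

So the hard physical floor is ~50,000 years, and any sub-light expansion speed pushes the figure toward the "order of magnitude more" end, consistent with the estimate above.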
> So they all commit suicide. Oops.
That's a bit pessimistic, and unless it were explicitly programmed in it seems doubtful. How about -- if no-theory-of-absolute-morality-exists, then go play for a trillion years?
> From the point of view of those of us who don't believe in Absolute
> Morality, Eliezer's program amounts to building an unachievable goal
> into the SIs, a highly dangerous proposition and one we might well oppose.
Before we go rushing off building nanobots to solve this problem (gotta get a DNA sample from Eliezer first -- anyone want to take this on?), perhaps we should wait and see whether or not SIs are present in the universe. Either way, we would need to do some serious thinking about the present/not-present implications. Present could mean the A.M. is a benevolent one; not present could mean there is an escape hatch out of our universe. Both deserve due consideration.