From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Wed Feb 12 2003 - 23:04:34 MST
Wei, I don't have as much time to talk about this as I should, but...
You have a very nice definition of decision-making as a policy which
estimates the desirability of the multiverse given that the outcome of a
decision process is X, then chooses the option which results in the
greatest estimated moral value.
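
In rough Python, that policy might be sketched like so (the names here
are illustrative, not anything you actually specified):

    def decide(options, desirability_given_output):
        # desirability_given_output(X) estimates the moral value of
        # the entire multiverse on the supposition that the outcome of
        # this decision process is X. Note that nothing in this loop
        # requires a separate notion of probability.
        return max(options, key=desirability_given_output)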
Given this perfectly good definition, which suffices for all conceivable
activities of an intelligence, why do you feel the need to talk about this
additional entity called "probabilities"?
--
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence