ExI = Truth First ?

Ian Goddard (igoddard@netkonnect.net)
Sat, 16 May 1998 07:35:51 -0400

At 05:43 PM 5/15/98 -0400, Daniel Fabulich wrote:

> ...your critique of utilitarianism in favor of "truth" USES
>utility as its criterion of moral value. If using a one-to-one
>truth function brings about optimal utility, as you argue, then
>by all means we should adopt it; but this is BECAUSE of
>utilitarianism, not in spite of it.

IAN: Your point is that function = utility.
But we could have two functions, (1) a truth
function that maps physical reality, and (2) a
Big Lie function that masks physical reality.
Both are useful to some, but only one is true.
If "utility first" is my directive, I can choose
either; if "truth first" is my principle I can
choose only one. This limits my options, but
as a rule it maximizes social outcomes, which
is utilitarian, but utility comes second.

What is more, since there is only one reality,
yet myriad claims about it, the Big Lie function
will be useful to more people than the truth
function, so the demand for the BLF will be
greater than for the TF. Thus if our directive is
"whatever works to promote your idea," Big Lies
will win the day.

When I apply a "truth first" principle based
upon a scientific definition of truth, I submit
to a "higher authority" that will judge my
ideas (about any topic) accordingly. If I
apply "utility first" then it's whatever
I can get away with to promote my ideas.

><thought experiment> Suppose you are studying an important effect in
>quantum mechanics, but one which can be put off until later without
>significant losses in utility. Then, you look out the window and you
>see: A Drowning Child (tm) [the philosopher's favorite play toy!].
>You could save that child now and study quantum effects later,
>OR you could just ignore the child and continue the pursuit of the
>one-to-one truth function. Despite the fact that utilitarianism demands
>that you save that child, truth-first demands that truth comes first, and
>utility second. You ignore the child, and finish observing your quantum
>effect before even considering saving him/her. </thought experiment>

IAN: An interesting dilemma, but I think it
may be a false dilemma. Maybe I cannot swim,
I have no rope, or I just don't care, so
saving the drowning child is not useful to
me; after all, that kid was a real nuisance.

So it doesn't follow that utility dictates
that I stop my work and save the child.

But if I have an axiom, "All human life is
sacred," and I see a life in imminent peril,
I say "I believe that axiom is true, thus
I must stop my work and save that child."
It seems to me that the act of stopping my
work to save another must rest on a truth.

>> The Nazi Party found the extermination of all who
>> disagreed with their agenda very useful and adhered
>> faithfully to the utility-first principle that many
>> on the ExI list adhere to. Does ExI = Nazism? I say
>> no! I believe it is not the right ExI principle. I
>> believe that to the extent we are directed to that
>> principle we are infected with totalitarian memes.
>To this I answer: Do you honestly believe that WWII was the best thing
>that could have happened to Germany? Wouldn't the German government be
>leaps and bounds better off if that whole little fiasco had never
>happened? Besides, think about all the "Einsteins" which Germany COULD
>have kept, but lost thanks to the Nazi holocaust!

IAN: It is TRUE that Germany would be better off
had it not started WWII and committed mass murder.
It was useful for the Nazis to mask that truth,
for their goals were not making Germany better
but enemy extermination and global conquest.

Visit Ian W Goddard ---> http://www.erols.com/igoddard

  Statements     T r u t h      A  defines  -A
                 a              -A  defines  A
  A:  x is A     b    A  -A
                 l    T   F     A set is defined
 -A:  x is -A    e    F   T     by its members, thus
                      ?   ?     A & -A contain each other.

H O L I S M ---> http://www.erols.com/igoddard/meta.htm