From: J. R. Molloy (email@example.com)
Date: Fri Jan 11 2002 - 08:15:09 MST
From: "Charles D Hixson" <firstname.lastname@example.org>
> You might want to consider the death rate among the civilian
> population of Afghanistan before you lean too heavily on that. The
> death rate among US citizens has been, I admit, quite low. (Two?
> I haven't been following this.)
You're confusing citizens with combatants.
Several thousand US citizens were murdered in the 9-11 attack, and only a few
US soldiers have been killed in the action in Afghanistan. The death rate
among the civilian population of Afghanistan caused by US actions in
Afghanistan is considerably less than that caused by the Taliban, al-Qaeda,
and warring Afghan militant groups.
> A large part of the reason for the scarcity of US casualties is
> that the war is being conducted via robots (well, telefactors right
> now). Does anyone need to be told just how insanely dangerous
> development in this direction is?
Well, apparently you need to be told how insanely dangerous religious
fanaticism is, because that's what started the war. Sane people with robots
are not dangerous at all, but religious fanatics can be very dangerous with
no more than rocks as weapons.
> This is almost the exact
> opposite of "Friendly AI".
"Friendly AI" is a figment of the imagination, fortunately, and the idea of
making AI anything other than purely intelligent is more dangerous than
completely
--- --- --- --- ---
Useless hypotheses, etc.:
consciousness, phlogiston, philosophy, vitalism, mind, free will, qualia,
analog computing, cultural relativism, GAC, Cyc, Eliza, cryonics, individual
uniqueness, ego, human values, scientific relinquishment, malevolent AI,
non-sensory experience, SETI
We move into a better future in proportion as the scientific method
accurately identifies incorrect thinking.
This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 13:37:34 MST