From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Fri Jul 25 2003 - 04:49:24 MDT
Lee Corbin wrote:
>
>>when he raised nuclear genocide as a possible rational and
>>*morally preferable* option was the revolted reaction of a
>>friend who, out of politeness, did not wish to say publicly
>>what Eliezer said: "Are you autistic?"
>
> That's a pretty dumb question! Like your probable remarks,
> it does far more to connote a frame of mind (a rather
> deplorable frame of mind IMO) than it does to advance
> rational discussion.
Perhaps it does not advance rational discussion. It was a serious
question, though. The disconnect between the words being verbally
manipulated into sentence streams and any accompanying affect, emotion,
or intuitive visualization was large enough to make me start wondering
about Robert's sanity, or at least such small sanity as humans are
supposed to have.
The Hiroshima bomber crew did not, contrary to legend, have nightmares
about it. They flew back, Truman told them that he took responsibility,
and that was that. Can you imagine if they'd had to walk along a line of
seventy thousand people and, one by one, slit their throats - let alone
been forced to burn them alive? One by one, letting each soldier or
man or woman or schoolchild finish dying before moving on to the next?
They would not have gotten past the first ten; they would have had
nightmares for the rest of their lives; and if Truman had told them he
"accepted full responsibility," it would have been a pathetic, useless
token. Dave Grossman is exactly right: technological distance is
emotional distance. Bomber crews don't have nightmares because they wield
buttons instead of knives, though the suffering they cause is vastly greater.
Robert Bradbury has no goddamn idea what the words coming out of his mouth
actually mean. That much is clear. And similarly the people who dropped
the bomb on Hiroshima had no goddamn idea what that button actually did,
regardless of what verbal ideas were floating around in their heads. Hugo
de Garis has no goddamn idea of what a 'gigadeath' 'artilect war' would
involve, and Ben Goertzel has no goddamn idea that AI is an existential
risk, regardless of what words they put down on paper. Being unable to
relate abstract verbal thought to negative affect is not a charming
personal quirk. Robert is just emotionally disconnected from the results
of what he's saying, not evil; he has no idea that he might be doing
something wrong, though he realizes in a distant way that he ought to
acknowledge the possibility. But y'know, by golly, I bet that when all
the bodies are counted up across the galaxies, it's the Roberts and not
the Hitlers who account for most of the planetary kills. There comes a
point where stupidity stops being innocent; past that point, it no
longer matters whether being a genuinely bad person requires emotional
evil on top of verbal evil. I think that line is crossed when you start
talking about the use of nuclear weapons to commit genocide against those
darn foreigners because paying attention would be too inconvenient.
--
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence