Re: META: Dishonest debate (was "cluster bombs")

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sat Jun 14 2003 - 19:00:28 MDT

    Harvey Newstrom wrote:
    > Eliezer S. Yudkowsky wrote,
    >
    >>Harvey Newstrom wrote:
    >>
    >>>This tactic can never add information to a debate, while it can
    >>>frequently lead to misunderstandings and accusations of
    >>>misrepresentation. I maintain that this source of information is
    >>>unsupportable and should not be used in a rational debate where the
    >>>person in question is available for comment if they so choose.
    >>
    >>I agree; but this is because negative evidence is weak and divided among
    >>many possible explanations, not because taking the negative indicator as
    >>evidence is a logic error.
    >
    > Agreed about the negative evidence.
    >
    > But what do you call it when somebody uses this evidence that is "weak and
    > divided among many possible explanations" and claims a conclusion as being
    > "strong and pointing to only one explanation"? That is where the logic
    > error comes in. The proven "conclusion" is not as strong as is claimed, and
    > does not prove the single solution as claimed. Am I invalid in calling this
    > "conclusion" a logic error? (I.E. the logical proof fails to support the
    > conclusion claimed as proven.)

    Aside from quibbles of terminology, I agree. The indicator is evidence,
    as virtually everything is, but that evidence does not support the
    conclusion being claimed from it.

    I'm quibbling because of statements like "that source of information is
    unsupportable and should not be used", as opposed to "that conclusion does
    not follow". You can *always* use information. If you are a Bayesian you
    *must* use information; you cannot refuse to revise your posterior
    probabilities just because you don't feel like it. And virtually
    *everything* is information. The world is full of information. Some of it
    is just much weaker than other information.
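
    (A minimal sketch, not part of the original post: the prior and the
    likelihoods below are made-up numbers, and the "posterior" helper is just
    Bayes' theorem written out in Python. It illustrates the point above:
    weak evidence still revises the posterior, only slightly, while strong
    evidence moves it a lot.)

        def posterior(prior, p_e_given_h, p_e_given_not_h):
            """Bayes: P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|~H)P(~H)]."""
            return (p_e_given_h * prior) / (
                p_e_given_h * prior + p_e_given_not_h * (1 - prior)
            )

        prior = 0.5

        # Weak indicator: the observation is only slightly likelier under H.
        weak = posterior(prior, p_e_given_h=0.55, p_e_given_not_h=0.50)

        # Strong indicator: the observation is far likelier under H.
        strong = posterior(prior, p_e_given_h=0.90, p_e_given_not_h=0.10)

        print(f"posterior after weak evidence:   {weak:.3f}")    # ~0.524
        print(f"posterior after strong evidence: {strong:.3f}")  # 0.900

    Either way the probabilities get revised; the weak indicator simply cannot
    carry a strong conclusion on its own.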

    -- 
    Eliezer S. Yudkowsky                          http://singinst.org/
    Research Fellow, Singularity Institute for Artificial Intelligence
    

