The Logical Detection Kit

I have kept a sort of electronic scrap-book over the years, now full of dead hyperlinks, unsourced screengrabs and ‘big’ files of 720 KB that used to tax my computers something horrible.  One of the things I filed away there was a very useful list of principles for testing the soundness of arguments.  It’s called the Logical Detection Kit, and looking at it now with a few PhD years on secret visitors under my belt, it seems more useful than ever.

I can’t remember exactly where I got it, but a slightly wordier version appears on an archaeology group hosted at Yahoo; I may have taken it from there or from a repost on another site.  It was posted by ‘Phil’ in August 2000 and, according to the preamble, was based on an earlier Baloney Detection Kit proposed by Carl Sagan.  It looks pretty much the same, and has been repeated in a lot of places.  Michael Shermer, the well-known skeptical writer, has even produced a YouTube video which gives a plain-language introduction to scepticism and why it is different from denialism.

Both are worth looking at.  If nothing else, they begin to give you a vocabulary for what may otherwise be only a gut feeling that something is not right, that a claim with no substance is being made.

Logical Detection Kit

The following are suggested as tools for testing arguments and for detecting fallacious or fraudulent ones.

1.    Wherever possible, there must be independent confirmation of the facts.

2.    Encourage substantive debate on the evidence by knowledgeable proponents of all points of view.

3.    Arguments from authority carry little weight (in science there are no “authorities”).

4.    Spin more than one hypothesis – don’t simply run with the first idea that catches your fancy.

5.    Try not to get overly attached to a hypothesis just because it’s yours.

6.    Quantify, wherever possible.

7.    If there is a chain of argument, every link in the chain must work.

8.    “Occam’s razor” – if there are two hypotheses that explain the data equally well, choose the simpler.

9.    Ask whether the hypothesis can, at least in principle, be falsified (shown to be false by some unambiguous test). In other words, is it testable? Can others duplicate the experiment and get the same result?

10.    Conduct control experiments – especially “double blind” experiments, where the person taking the measurements does not know which are the test subjects and which are the controls.

11.    Check for confounding factors – separate the variables (a small simulation after this list shows why).
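
Points 10 and 11 are easier to feel with numbers in hand, so here is a minimal sketch of my own – it is not part of the kit, and every name and figure in it is invented for illustration. It simulates a remedy that does nothing at all but which healthier people happen to take more often. The naive comparison makes the remedy look effective; separating out the confounding variable makes the ‘effect’ vanish.

```python
import random

random.seed(1)

# Hypothetical scenario: an ineffective remedy that healthy people
# happen to favour. 'healthy' is the confounding variable.
people = []
for _ in range(10_000):
    healthy = random.random() < 0.5
    # Healthy people are far more likely to take the remedy...
    takes_remedy = random.random() < (0.8 if healthy else 0.2)
    # ...and recover more often regardless, since the remedy does nothing.
    recovers = random.random() < (0.9 if healthy else 0.4)
    people.append((healthy, takes_remedy, recovers))

def recovery_rate(group):
    return sum(recovered for _, _, recovered in group) / len(group)

treated = [p for p in people if p[1]]
untreated = [p for p in people if not p[1]]

# Naive comparison: the remedy looks as though it works.
print(f"treated:   {recovery_rate(treated):.2f}")    # ~0.80
print(f"untreated: {recovery_rate(untreated):.2f}")  # ~0.50

# Separate the variables: within each health group the gap disappears.
for healthy in (True, False):
    t = [p for p in treated if p[0] == healthy]
    u = [p for p in untreated if p[0] == healthy]
    print(f"healthy={healthy}: treated {recovery_rate(t):.2f}, "
          f"untreated {recovery_rate(u):.2f}")
```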

Common fallacies of logic and rhetoric

12.    Ad hominem – attacking the arguer and not the argument.

13.    Argument from “authority”.

14.    Argument from adverse consequences (putting pressure on the decision maker by pointing out dire consequences of an “unfavourable” decision).

15.    Appeal to ignorance (absence of evidence is not evidence of absence).

16.    Special pleading (typically referring to god’s will).

17.    Begging the question (assuming an answer in the way the question is phrased).

18.    Observational selection (counting the hits and forgetting the misses).

19.    Statistics of small numbers (such as drawing conclusions from inadequate sample sizes – a short simulation at the end of this list shows how easily chance misleads here).

20.    Misunderstanding the nature of statistics (President Eisenhower expressing astonishment and alarm on discovering that fully half of all Americans have below average intelligence!).

21.    Inconsistency (e.g. military expenditures based on worst case scenarios but scientific projections on environmental dangers thriftily ignored because they are not ‘proved’).

22.    Non sequitur – ‘it does not follow’ – the logic falls down.

23.    Post hoc, ergo propter hoc – ‘it happened after so it was caused by’ – confusion of cause and effect.

24.    Meaningless question (‘what happens when an irresistible force meets an immovable object?’).

25.    Excluded middle – considering only the two extremes in a range of possibilities (making the “other side” look worse than it really is).

26.    Short-term v. long-term – a subset of excluded middle (‘why pursue fundamental science when we have so huge a budget deficit?’).

27.    Slippery slope – a subset of excluded middle – unwarranted extrapolation of the effects (‘give an inch and they will take a mile’).

28.    Stifling debate by raising the prospect of fear.

29.    Confusion of correlation and causation.

30.    Straw man – caricaturing (or stereotyping) a position to make it easier to attack.

31.    Suppressed evidence or half-truths.

32.    Weasel words – for example, use of euphemisms for war such as “police action” to get around limitations on Presidential powers. “An important art of politicians is to find new names for institutions which under old names have become odious to the public”.

33.    Answer not responsive. The person changes the subject, calls names or makes accusations.

34.    Inconsistency. The argument contains ideas that contradict each other or are logically incompatible.

35.    Unintelligible or incoherent.

36.    Poorly organised. Ideas not arranged in a logical progression.
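
Fallacies 19 and 29 are easy to sense but hard to argue without numbers, so here is one more sketch of my own – again, not part of the kit. It repeatedly draws small samples of two completely unrelated variables and counts how often chance alone produces a ‘strong’ correlation. With five data points, impressive-looking coincidences turn up routinely; with a hundred, they all but disappear.

```python
import random
import statistics

def pearson(xs, ys):
    # Pearson correlation coefficient, computed by hand to keep the
    # example dependency-free.
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(42)
trials = 2_000

for n in (5, 20, 100):
    strong = 0
    for _ in range(trials):
        # Two variables with no connection whatsoever.
        xs = [random.random() for _ in range(n)]
        ys = [random.random() for _ in range(n)]
        if abs(pearson(xs, ys)) > 0.7:
            strong += 1
    print(f"n={n:3d}: 'strong' correlations by chance alone: "
          f"{strong / trials:.1%}")
```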
