A short list of fallacies

[Image: a broken table, from https://www.sportsfreak.co.nz/super-bung-bung/broken-table/]

Arguments are always a series of claims; a valid argument is one in which the claims are connected. Think of it like a table: if the legs aren’t connected to the tabletop, then the table will fall over. Fallacious arguments are ones that lack legs entirely, or in which the legs aren’t connected to the tabletop. In most disagreements, we are in the realm of “informal” argumentation; that is, formal logic doesn’t necessarily help us. Often, what determines whether an argument is fallacious isn’t simply the “form” of the argument, but how it works in context.

Productive disagreements need the people disagreeing (the “interlocutors”) to argue about the same issue, use compatible definitions, fairly represent one another’s positions, hold one another to the same standards, and allow each other to make arguments.

There are lists of fallacies that make very fine distinctions, and are therefore very long and detailed—this is a list that seems to work reasonably well for most circumstances.

Fallacies of relevance

A lot of fallacies break that first condition: they make claims that aren’t relevant to the disagreement but are inflammatory. They either distract people into arguing about irrelevant topics or else shut down the argument altogether.

Red herring. Some people use this term for all the fallacies of irrelevance. Red herrings are claims that distract the interlocutors (or observers) from the trail we should be following. The phrase probably comes from a story in which someone drags a red herring across the trail of a rabbit to fool the pursuers (“red herring”): the claim someone has made is so stinky that people get distracted.

Argumentum ad hominem/ad personam/motivism. Contrary to what many people think, an attack on an interlocutor is not necessarily ad hominem. It’s only ad hominem (or fallacious) if the attack is irrelevant. Attacking someone’s credibility on the grounds that they don’t have relevant authority, accusing someone of committing a fallacy, or pointing out moral failings is not necessarily fallacious, if those factors are relevant. If I say that you shouldn’t be believed because you’re a woman, and your gender is irrelevant to the argument, then it’s ad hominem. Ad hominem often takes the form of accusing someone of being part of a stigmatized group, such as calling all critics of slavery “abolitionists” or any conservative a “fascist.” Sometimes that derails the disagreement, so that we’re now arguing about how to define “fascist,” and sometimes it is so inflammatory that we stop having a disagreement at all and are just accusing one another of being Hitler. A somewhat subtle form of ad hominem is what’s often called motivism: a refusal to engage an interlocutor’s argument on the grounds that you know they’re really making the argument for bad motives. Sometimes people really do have bad motives, but they might still have a good argument. The problem with motivism is that it’s often impossible to prove or disprove someone’s motives.

Argumentum ad misericordiam/appeal to emotions. As with ad hominem, appeal to emotions is not always a fallacy; it’s a fallacious move when it’s an attempt to distract, when the appeal is irrelevant. All political arguments (perhaps all arguments) have an emotional component; otherwise, we wouldn’t bother arguing. If I argue that something is a bad policy because it will cost one million dollars, I’m appealing to the feelings we have about saving or spending money. If you say it’s a bad policy because it will kill ten children, you’re appealing to feelings just as much as I am. Those appeals to emotion are fallacious if they’re irrelevant (if, for example, our current policy already costs a million dollars and kills ten children, then the new policy isn’t a change in either factor, so those arguments are probably irrelevant), or if they’re being used to distract from other issues or end the disagreement. If, for instance, I refuse to discuss any aspect of the policy other than cost, or I engage in hyperbole about what will happen if we spend a million dollars, then my argument is a fallacious appeal to emotions. It’s also fallacious if I say that you should vote for me because I have a really cute dog, I’ve had a hard life, or I’ll cry if you don’t vote for me: those are all fallacious appeals to emotion. Crying to get out of a traffic ticket is a fallacious appeal to emotions. (And that example brings up the problem that fallacies are often effective.)

Tu quoque/whataboutism. This fallacy is the response that, “You did it too!” It’s fallacious when whether the interlocutor did it is irrelevant. The problem with tu quoque is that, if I’ve lied, pointing out that you lied doesn’t mean that what I said was true. We’re now both liars. Sometimes the fallacy involves false equivalency. For instance, if you and I are running for Treasurer, and I say that you’re a bad candidate because you embezzled, and you say that I embezzled too, your response might be fallacious. If you’ve been Treasurer of multiple organizations and embezzled substantial amounts every time, and I once took a pen home for personal use, it’s fallacious (it’s also the fallacy of false equivalency; one argument can commit multiple fallacies at once). If I say that honesty is the most important thing to me, and I condemn someone else for lying, and I’m lying in that very speech, then the fact that I’m lying while condemning liars might be a relevant point. At that point, you can talk about my motives without being involved in motivism: you can point out that I don’t appear to be motivated to engage in rational argument.

Appeals to personal certainty/argumentum ad verecundiam/bandwagon appeal. When we’re arguing, appealing to an authority is inevitable. Appeals to authority are fallacious when they’re irrelevant: the site, source, or person being appealed to is not an authority, is not a relevant authority, or has not made a claim relevant to the argument. For instance, if I say that squirrels are evil, and my proof is that I’m certain of that (appeal to personal certainty), then, unless I’m a zoologist who specializes in squirrels, my opinion is irrelevant. Appealing to a quote from Einstein would also be irrelevant; while he’s an expert, he was never an expert about squirrels. Quoting Einstein’s “God does not play dice with the universe” does not help in an argument about theism, since he wasn’t a theologian, he was disputing quantum mechanics, and he later changed his mind about quantum mechanics; it isn’t a relevant claim, nor one made by someone with relevant expertise. Saying that something is true because many people believe it (bandwagon appeal) is another form of appeal to irrelevant authority; many people have been wrong about things before. That many people believe something is relevant for showing it’s a popular perception, but probably not for showing that it’s true.

Fallacies of process

In formal logic (if p, then q), a process is valid or not regardless of context, but in informal logic it’s more complicated, and we often end up having to talk about whether something is a fallacy because the claims are related but only weakly, or appear related but aren’t, or one doesn’t necessarily follow from the other. The notion of whether something necessarily follows is important. The claim that “A caused B” might be true (“Being hungry caused me to eat cookies”), but the two terms aren’t necessarily related; I might have eaten something else. When things are necessarily related, A always causes B. Fallacies of process involve claiming that B follows from A when it doesn’t.
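The contrast between a form that guarantees its conclusion and one that merely appears to can be sketched in standard propositional notation (this illustration is mine, added for clarity, not part of the original list):

```latex
% Requires amsmath and amssymb.
% Left: modus ponens, a valid form -- if both premises are true,
% the conclusion must be true.
% Right: affirming the consequent, an invalid form -- the premises
% can be true while the conclusion is false.
\[
\underbrace{\frac{p \to q \qquad p}{\therefore\; q}}_{\text{valid (modus ponens)}}
\qquad\qquad
\underbrace{\frac{p \to q \qquad q}{\therefore\; p}}_{\text{invalid (affirming the consequent)}}
\]
```

Using the cookie example: if being hungry implies that I eat (p → q), and I am eating (q), it doesn’t follow that I’m hungry (p), since I might be eating out of boredom. That is the sense in which claims can be related without one necessarily following from the other.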

Binary reasoning. Some people argue that this fallacious way of thinking is behind a lot of fallacies of argument. Binary reasoning is the tendency to put everything into all-or-nothing categories (black-or-white thinking). So, a person is either a Christian or a Satanist, a Republican or a Democrat. Since situations are rarely a choice between two and only two options, putting things into binaries is frequently fallacious.

Genus-species fallacy/fallacy of composition/fallacy of division/cherrypicking. Drawing a conclusion about an entire category (genus) from a single example (species), or even from a small set of examples, is a fallacy. We tend to fall for it because of confirmation bias, the bias that makes us notice (and value) data that confirms what we already believe. We’re also prone to let striking examples mean more than they should, simply because they come to mind easily (the “availability heuristic”). An example is useful for illustrating a point, but it rarely proves one. Coming to a conclusion about a large category on the basis of one example is moving from species to genus (fallacy of composition), such as assuming that, because the one French person you knew liked tap-dancing, all French people like tap-dancing. The more common fallacy is to move from genus to species (fallacy of division): assuming that, since something is part of a large category, it has the characteristics we attribute to that big category. For instance, it’s fallacious to assume that, since a person is French (genus), they love croissants (species). Even if the characteristic is statistically true of the majority in that category (most Americans are Christian), it’s fallacious to assume that the individual in front of you necessarily fits the generalization. Picking only those examples (studies, quotes, historical incidents) that fit your claim is generally called “cherrypicking.”

False dilemma/poisoning the wells. If there are a variety of options, and one of the interlocutors insists there are only two, or insists that we really have only one (because they have unfairly dismissed all the others), then that person has fallaciously misrepresented the situation. “You’re either with me or against me” is a classic example of the false dilemma, especially since “with me” usually means “agree with everything I say.” You might disagree with something I say precisely because you’re “for” me: you care about me, and think I’m making a bad decision.

Straw man/nutpicking. We engage in straw man when we attribute to the opposition an argument much weaker than the one they’ve actually made. We generally do this in one of three ways. First, people who are drawn to binary thinking are likely to assume that you’re either with us or against us. For instance, if they think a person is either completely loyal to a political party or a member of the “other” party, then they’ll assume that anyone who disagrees with them is a member of the “other” party. (So, if I’m a binary thinker, and a Republican, and you criticize a Republican policy, I might assume that you’re a Democrat and then attribute to you “the” argument I think Democrats make.) Second, we will often unconsciously make an opposition argument (or even criticism) more extreme than it is: you’ve said something “often” happens, but I represent your argument as claiming it “always” happens. Third, we will often take the most extreme member of an opposition group and treat them as representative of the group (or position) as a whole; that’s often called “nutpicking” (a term about which I’m not wild).

Post hoc ergo propter hoc/confusing causation and correlation. This fallacy argues that A preceded B, so it must have caused B. Of course, it isn’t always a fallacy—if A always precedes B, and/or B always follows from A, they must have some kind of relationship. The relationship might be complicated, though. While a fever might always precede illness, reducing the fever won’t necessarily reduce illness. Lightning doesn’t cause thunder—they’re part of the same event.

Circular reasoning. This is a very common fallacy, but surprisingly difficult for people to recognize. It looks like an argument, but it is really just an assertion of the conclusion over and over in different language. For instance, if I argue, “Squirrels are evil because they are villainous,” that’s a circular argument—I’ve just used a synonym. Motivism sometimes comes into play here. For instance, I might say, “Squirrels are evil because they never do anything good. Even when they seem to do something good, like pet puppies, they’re doing so for evil motives.” That’s a circular argument.

Non sequitur. This is a general category for arguments in which the claims don’t follow from one another. It’s often the consequence of a gerfucked syllogism. Sometimes people are engaged in associational reasoning, in which claims are linked by loose association rather than by any logical connection.


A few other comments.

An argument might be fallacious in multiple ways at the same time. For instance, arguing that anyone who disagrees with me is a fascist who wants to commit genocide is binary thinking, ad misericordiam, motivism, and almost certainly straw man. And, once again, identifying a claim as a fallacy almost always requires explaining how it is fallacious.

Another way of thinking about fallacies is that they are moves in a conversation that obstruct productive disagreement. If you think about them that way, you get a list with a lot of overlap, but some differences.

Citations.
“red herring, n.” OED Online, Oxford University Press, June 2020, www.oed.com/view/Entry/160314. Accessed 15 July 2020.