You’re the one with the epistemic crisis

circular reasoning works because circular reasoning works

For many years, I had a narrative about what makes a good relationship, and I had a lot of relationships that ended in exactly the same kind of car crash. I decided, each time, not that my narrative about relationships was wrong, but that I was wrong to think this guy was the protagonist in that narrative. I kept telling myself that I wasn’t wrong about the narrative; I was just wrong about the guy.

In fact, I was wrong about the narrative. When I changed the narrative, I found the guy.

We all have narratives: explanations of how things happen, how to get what we want, how political figures operate, how dogs make decisions. And, as it was with me, it’s really easy to operate within a narrative without question, perhaps without even knowing that we have one. I didn’t see my narrative about relationships as one of several possible ones, but as The True Narrative.

Relationship counselors often talk about how narratives constrain problem solving. Some people believe that people come to a relationship with stable identities—you get into a box (the marriage), and perhaps it works and perhaps it doesn’t. Some people have a narrative of a relationship being at its height when you marry, and it goes downhill from there. Some people believe that a relationship is a series of concessions you make with each other. Some people think that marriages are an authoritarian system in which the patriarch needs to control the family. Some people see a relationship as an invitation to go on a journey that neither of you can predict but during which each of you will try to honor one another. There are others; there are lots of others. But I think it’s clear that people in each of those narratives would handle conflict in wildly different ways. People with the “people in a box” narrative would believe that you either put up with the other person, or you leave. People with the “concessions” narrative would believe that you try to negotiate conditions, like lawyers writing a contract. Patriarchs would believe that the solution is more control. Our narratives limit what we imagine to be our possible responses to problems.

If you ask people committed to any of those narratives if those narratives are true, they’ll say yes, and they’ll provide lots of evidence that the narrative is true. That evidence might be cultural (how it plays out in movies and TV), or it might be arguments from authority (advice columnists, pundits, movie and TV plots). Or they might, as I would have, simply insist they were right by reasoning deductively from various premises—all relationships have a lot of conflict, for instance. That this relationship has a lot of conflict is not, therefore, a problem—in fact, it’s a good thing! (Think about the number of movies, plays, or novels that are the story of a couple with a lot of conflict who “really” belong together, from Oklahoma! to When Harry Met Sally.)

If I thought of myself as someone who had relationships that ended badly because I got involved with the wrong person, I didn’t have to face the really difficult work of rethinking my narrative. And I was kind of free of blame, or only to blame for things that aren’t really flaws—being naïve, trusting, loyal. I could blame them for misleading me, or take the high road, and say that we were mismatched.

If, however, I looked back and saw that I kept getting involved with someone with whom it could not possibly work because I kept trying to make an impossible narrative work, then the blame is on me. And it was. And it is.

I don’t really want to say what my personal narrative was, although I’ll admit that Jane Eyre might have been involved, but it was the moment that I stopped reasoning from within the world of that narrative and started to question the narrative itself that I was able to move to a better place. I had to question the narrative—otherwise I was going to keep getting “duped.” (That is, I would keep making only slightly varied iterations of the same mistakes which I would blame on having been misled by a person I thought would save me.)

Our current cultural narrative about politics is just as vexed as my Jane Eyre based narrative about relationships. We are in a world in which, paradoxically, far too many people all over the political spectrum share the same—destructive—narrative about what’s wrong with our current political situation. That narrative is that there is an obviously correct set of policies (or actions), and it is not being enacted because there are too many people who are beholden to special interests (or dupes of those special interests). If we just cut the bullshit, and enacted those obviously right policies, everything would be fine. Therefore, we need to elect people who will refuse to compromise, who will cut through the bullshit, and who will simply get shit done.

This way of thinking about politics—there is a clear course of action, and people who want to enact it are hampered by stupid rules and regulations—is thoroughly supported in cultural narratives (most action movies, especially any that involve the government setting rules; every episode of Law and Order; political commentary all over the political spectrum; comment threads; Twitter). It’s also supported deductively (if you close your eyes to the fallacies): This policy is obviously good to me; I have good judgment; therefore, this policy is obviously good.

It’s more complicated than that, of course. Staying within our narrative doesn’t look like it’s limiting options. It feels rational. The narrative gives us premises about behavior—if you think someone is a good man, then you can make a relationship work; the way to stop people from violating norms is to punish them; high taxes make people not really want to succeed—and we can reason deductively from those premises to a policy. If the narrative is false, or even inaccurately narrow, then we’ll deliberate badly about our policy options.

But what if that narrative—there is a correct course of action, and it’s obvious to good people—is wrong?

And it is. It obviously is. There is no group on any place on the political spectrum that has always been right. Democrats supported segregation; Republicans fought the notion that employers should be responsible if people died on the job because the working conditions were so unsafe. Libertarians don’t like to acknowledge that libertarianism would never have ended slavery, and there is that whole massive famine in Ireland thing. Theocrats have trouble pointing to reliable sources saying that theocracy has ever resulted in anything other than religicide and the suppression of science (Stalinists have the same problem). The narrative that there is a single right choice in regard to our political situation, and every reasonable person can see it is a really comfortable narrative, but it’s either false (there never has been a perfect policy, let alone a perfect group) or non-falsifiable (through no true Scotsman reasoning).

This narrative—the correct course of action is obvious to all good people—is, as I said, comfortable, at least in part because it means that we don’t have to listen to anyone who disagrees. In fact, we can create a kind of informational circle: because our point of view is obviously right, we can dismiss as “biased” anyone who disagrees with us, and we thereby never hear or read anything that might point out to us that we’re wrong.

If we’re in that informational circle, we’re in a world in which “everyone” agrees on some point, and we can find lots of evidence to support our claims. We can then say, and many people I know who live in such self-constructed bubbles do say, “I’m right because no one disagrees with me because I’ve never seen anyone who disagrees with me.” And they really haven’t—because they refused to look. When we’re in that informational circle, we’re in a world of in-group reasoning. We don’t think we are; we think we’re reasoning from the position of truth.

But, since we’re only listening to information that confirms our sense that we’re right, we’re in an in-group enclave.

It’s become conventional in some circles to say that we’re in an epistemic crisis, and we are. But it’s often represented as: we’re in an epistemic crisis because they refuse to listen to reason—meaning they refuse to agree with us. We aren’t in an epistemic crisis because they are ignoring data. We are in an epistemic crisis because people—all over the political spectrum—reason from in-group loyalty, and no one is teaching them to do otherwise. We live in different informational worlds, and taking some time to inhabit some other worlds would be useful.

More useful is the simple set of questions:
• What evidence would cause me to change my mind?
• Are my arguments internally consistent?
• Am I holding myself and out-groups to the same standards?

Our epistemic crisis is not caused by how they reason, but how we do.

People teaching argument need to stop teaching the rational/irrational split

If you had said to a theologian in the era when Aristotle was considered the authority that, perhaps, the substance v. essence distinction was not useful, you might have found yourself with burning wood at your feet. You certainly would not have been popular. Yet we now think it was a thoroughly useless distinction—meaning we now think they never needed to make it, and that they only did so because they thought it was important to Aristotle, and he was The Authority, and working within that odd binary was what you did.

We now consider the substance/essence binary kind of a joke since it really only made sense within Aristotelian physics, which was wrong.

Scholars and teachers of writing can sit smugly in our chairs and smirk at those dumb people who worked so hard to make things work within what we now see as the false binary of substance v. essence, while we work, write, teach, and assign textbooks that work just as hard to promote the equally false binary of rational v. irrational.

You can tell it’s a false binary by asking someone to define what it means to be “rational.” They will describe five wildly incompatible ways of determining rationality:
1) the emotional state of the person making the argument (whether they seem emotional);
2) linguistic cues, such as what linguists call boosters (words like “absolutely,” “never”)—generally, whether the tone of the argument seems to the reader more extreme than the argument merits;
3) whether the argument “appeals to” data or “logic” (a notion generally bungled);
4) whether what they say is obviously true to reasonable people;
5) whether the argument appeals to expert opinion (or the author is an expert).

These five criteria for determining rationality are, loosely: whether the person making the argument strikes us as a rational kind of person, whether they’re emotional about the issue, whether they have data, whether what they say seems true to the reader, and whether experts support the claims.

Those are all useless ways of trying to figure out whether an argument usefully contributes to deliberation about any issue.

Granted, those are the characteristics common usage dictionaries identify, although in a different order from mine. Dictionary.com provides this definition of rational:

1. agreeable to reason; reasonable; sensible: a rational plan for economic development.
2. having or exercising reason, sound judgment, or good sense: a calm and rational negotiator.
3. being in or characterized by full possession of one’s reason; sane; lucid: The patient appeared perfectly rational.
4. endowed with the faculty of reason: rational beings.
5. of, relating to, or constituting reasoning powers: the rational faculty.

And every side (there aren’t just two) says that the problem is that our public discourse is irrational, by which they mean the other side is irrational. That’s irrational twice over—they reduce the complicated world to us v. them, which is irrational, and in that irrational argument, they accuse the other side of being irrational, based on a definition that is irrational. We are in a culture of demagoguery because we believe that there is a binary of rational/irrational, and we think that people who are irrational don’t really need to be taken into consideration when we’re arguing about policies. In fact, they shouldn’t be allowed to participate. We believe that democratic deliberation requires that only people on the rational side of the rational/irrational split really count.

The rational/irrational split is not only a false dilemma, but a thoroughly incoherent and profoundly demagogic way to approach any decision. We are in a culture of demagoguery not because they are irrational (from within that false rational/irrational split) but at least partially because we (all over the political spectrum) accept that false and demagogic binary of rational v. irrational.

Far too often, we assess arguments as rational or not on the basis of whether the person making the argument seems like a rational kind of person, whether they’re making the argument with an unemotional tone, whether they have evidence, whether what they say seems true to us, and whether the person speaking can cite authorities.

And we don’t always require that last one. We often treat argument from personal experience as rational evidence, especially if it’s our experience.

For instance, since I have the bad habit of reading comment threads (I know, I really should stop), I ran across a comment on a thread about why you should be hesitant to call the police if you have POC neighbors who get on your nerves, and one commenter said something along the lines of, “I’m a 60-year-old white woman who has never had any issues with the police.”

I noticed that comment in particular because I’m a 60-year-old white woman who has never been badly treated by the police, and I know so many POC who have, and therefore the experience of someone like me is proof that there is disparate treatment of white women and POC. So, I thought her comment would go in that direction. But it didn’t. Instead, she went on to something like, “So, you just have to treat them with respect.”

It’s important to note that she was using her personal experience to discount the personal experiences of POC who report problems with the police. So, her one argument from personal experience—that they treated her well—was, she thought, proof that they treat everyone well. She was treating herself as an expert on all experiences with the police.

That’s irrational. But it isn’t irrational because she’s an untrustworthy person, she was emotional in the moment, she failed to provide evidence, or what she was saying would come across as obviously untrue to everyone. Her argument would look rational to someone like her, and to someone who thought as she did.

But it’s a really bad argument. Her experience as a white woman doesn’t refute the claim that POC are treated differently by police than are white women.

Her argument is irrational, but not by the dominant way of thinking about what makes a rational argument. The rational/irrational split is just another instance of confirmation bias—if you agree with the argument she’s making, then her argument will seem rational. If you don’t, it won’t.

I agree that democratic deliberation requires that people take on the responsibilities of rational argumentation, but rational argumentation isn’t about false binaries regarding identity, affect, evidence, truth, or expertise. It’s never about reason v. emotion, so it isn’t about calm or angry, nor is it about data or not.

People teaching argumentation need to run screaming from the rational/irrational split, and from textbooks and teaching methods that reinforce it.

There are scholars who set high standards for rational argumentation, and others who set low standards. I’m on the low standards side: we are engaged in rational argumentation when we
1) can be very specific about the conditions under which we would change our minds—in other words, what we believe is open to falsification;
2) have internally consistent arguments (that is, basically, we have the same major premises for all our arguments);
3) hold the opposition(s) to the same standards in regard to kinds of proof and logic as we hold ourselves. Thus, if cherry-picking from Scripture proves we’re right, then cherry-picking from Scripture proves we’re wrong. If a single argument from personal experience proves we’re right, then a single argument from personal experience proves we’re wrong. Arguments from Scripture or personal experience aren’t necessarily rational or irrational—but how we handle them in an argument is.

This way of thinking about what makes a rational argument means we can’t assess the rationality of an argument without understanding the argumentation of which it is a part.

An argument—a single text—can’t rationally be assessed as rational or not on the basis of just looking at that single text.

Or a single personal experience. If you think about rational argumentation this way, then things like arguments from personal experience are part of the deliberation, and they are datapoints we have to assess just as we would a study. If there is a study that contradicts a lot of other studies, we don’t immediately assume it’s right, nor do we immediately assume it’s wrong. We look at its methodology, relevance, quality relative to the other studies; we look at whether it’s logically relevant to the case at hand.

We treat personal experience the same way. A 60-year-old white woman who has always had good experiences with the police is a datapoint: one that shows that white women are treated well by the police. It shows nothing about POC experiences with police.

I think it is useful to characterize arguments as rational or irrational, or, more accurately, to talk about the ways in which they are rational and irrational (since many arguments are both). But dismissing an argument as irrational simply on the grounds of surface features of a text (the argument is vehement, contradicts what we believe) or purely on in/out-group grounds (the source is irrational because out-group, it contradicts beliefs I think are true), or categorizing the argument as rational because of surface features (it has data, it seems calm, it makes gestures of fairness, it cites experts) or purely on in/out-group grounds (it confirms what I believe, the person seems in-group)—that’s irrational.

[The image is from Modern Dogma and the Rhetoric of Assent, as this is all thoroughly grounded in Booth’s argument.]