Persuasion happens

Recently, I heard a really good discussion by a couple of people who do and promote a lot of good research on how people think. And one of them said, “We used to believe that you could change people’s minds by presenting them with research, but research shows that isn’t the case, so I don’t believe that anymore.”

He didn’t appear to notice the irony.

A lot of research on persuasion isn’t very good, in that it shows something that Augustine talks about—people are not likely to come to believe completely different things after listening to one speech. And people in a study who are presented with new information don’t change their minds because they shouldn’t—for all they know, since they know it’s a study, the information is deliberately false. Even the better research on persuasion shows that a lot of people don’t change their minds on issues associated with in-group loyalty on the basis of one argument. (The one exception is if they are presented with information that the in-group supports a particular position—then they are likely to bring their position into line with the in-group.)

But, as in the case above, people do change their minds. Philip Tetlock shows that even authoritarians change their minds—they just deny they did. It’s hard for authoritarians and naïve realists to admit they’ve changed their minds because admitting that they were wrong means admitting that their whole model of judgment is wrong.

And research does have an impact on that process of changing our minds. The short essays in How I Changed My Mind About Evolution have a common theme: people realized that the anti-evolution rhetoric they’d been taught depended on a misrepresentation of evolution. The inoculation technique—presenting people with a weak version of an argument they will later hear or read—backfired because the authors in that book realized that what they’d been given was a weaker version.

Inoculation is, it seems to me, a particularly unethical strategy when it comes to religious issues, since it’s a violation of Christ’s requirement that we do unto others as we would have them do unto us. And, unhappily, it often results in people rejecting religion rather than rejecting the narrow and bigoted religious ideology that can only survive by misrepresenting its opposition.

For inoculation to be effective, it has to be coupled with either demonization/pathologizing of out-groups (out-group views are so spiritually dangerous or intellectually infectious that you can’t even let yourself listen) or insistence on pure in-group loyalty. If inoculation is promoted in a culture that also emphasizes victimization—the in-group is in danger of being exterminated, and so listening to the out-group is treasonous—then people might not realize they’re being presented with a weak version of out-group arguments.

Inoculation (coupled with demonization/pathologizing of the out-group) isn’t specific to reactionary politics. Because of “conservatives’” privileging of in-group loyalty, it tends to work better with people who vote conservative, but one can see it everywhere on the spectrum of political arguments.

Non-conservatives unintentionally enhance the effectiveness of inoculation through various practices: 1) repeating misrepresentations of out-group belief systems (no, conservative Christians are not hypocritical because they cite Hebrew Bible rules about sex and yet reject the rules about shellfish—just stop that); 2) not knowing the best arguments for the positions we oppose (for instance, not only are there instances of people stopping crimes by having a gun, but gun bans have complicated consequences); 3) treating all out-group members as identical; 4) relying on sources that misrepresent their own sources (Blue State, dailykos, and Mother Jones—I’m looking at you).

Projection is also important in persuasion, and one aspect of projection that works well for various in-group enclaves is to condemn others for being in an enclave. Really effective propaganda machines appear to offer both sides, by presenting the audience with the desired political outcome and then a more extreme version (so a segregationist like Boutwell could claim to be reasonable because he didn’t support violence—keep in mind that that stance worked, so that people presented Boutwell’s implacable opposition to integration as reasonable, and King’s position as unwise). All factional media insists that its audience is getting its information from objective sources, even when that audience consumes only factional media. And, because the media we consume engages in inoculation, we don’t think we are in a bubble. We think we are listening to the other side.

People are persuaded by research. They are persuaded by research they consider valid and that they are persuaded represents the consensus of responsible experts on the subject.

All of those terms (research, validity, consensus, responsible experts) are vexed, and heavily influenced by in-group favoritism, but persuasion happens.

We are all persuaded. The worst thing about our current political situation is that there is so much discourse that says “I have become persuaded that persuasion is impossible, and so we must stop trying to persuade others.”

No. When people are persuaded that persuasion is impossible, they are preparing themselves for violence.

[The image is of Nazis enjoying humiliating Jews after Austria abandoned democracy and joined Germany.]
