Conditions that make persuasion difficult

A lot of people cite studies showing that people can’t be persuaded, as though that should persuade people not to try to persuade others.

That isn’t even the biggest problem with those studies. The studies are often badly designed (no one should be persuaded to change an important belief just because one person in a psych experiment tells them they’re wrong). And they generally fail to account for what the research on persuasion does show: that some conditions make it more difficult to persuade people.

I was going to put together a short handout for students about why the paper they’re writing is so hard (an ethical intervention in one of several possible situations, ranging from arguing against the Sicilian Expedition to arguing for retreating from Stalingrad), and ended up writing up a list of the biggest obstacles.

An oppositional audience (i.e., one that has already come to a decision) that:

    • Has taken the stance in public (especially if s/he has taken credit for it being a good idea or otherwise explicitly attached her/his ego/worth to the position);
    • Has suffered for the position, had a loved one suffer, or caused others to suffer (e.g., voted for a policy that caused anyone to be injured);
    • Has equated the idea/position with core beliefs of his/her culture, religion, political party, or ideology (since disagreement necessarily becomes disloyalty);
    • Was persuaded to adopt the position out of fear (especially for the existence of the ingroup) or hatred for an outgroup;
    • Is committed to authoritarianism and/or naïve realism (equates changing one’s mind with weakness, illness, sin, or impaired masculinity; is actively frightened/angered by assertions of uncertainty or situations that require complex cognitive processes);
    • Does not value argumentative “fairness” (insists upon a rhetorical “state of exception” or “entitlement,” aka a “double standard,” for his/her ingroup);
    • Has a logically closed system (cannot articulate the conditions under which s/he would change her/his mind).

A culture that:

    • Demonizes or pathologizes disagreement (an “irenic” culture);
    • Is an honor culture (what matters is what people say about you, not what is actually true, so you aren’t “wrong” till you admit it);
    • Equates refusing to change your mind with privileged values (being “strong,” “knowing your mind,” masculinity) and “changing your mind” with marginalized values (being “weak,” “indecisive,” or impaired masculinity);
    • Enhances some group’s claim to rhetorical entitlement (doesn’t insist that the rules of argumentation be applied the same across groups or individuals);
    • Has standards of “expertise” that are themselves not up for argument;
    • Promotes a fear of change;
    • Equates anger with a privileged epistemological stance.

A topic:

    • That results from disagreement over deep premises;
    • About which there is not agreement over standards of evidence;
    • That makes people frightened (especially about threats from an outgroup);
    • That is complicated and ambiguous;
    • That is polarized or controversial, such that people will assume (or incorrectly infer) your affirmative position purely on the basis of any negative case you make (e.g., if you disagree with the proposition that “big dogs make great pets because they require no training” on the grounds that they do require training, your interlocutor will incorrectly assume that you think [and are arguing] that big dogs do not make great pets);
    • That is easily framed as a binary choice between option A (short-term rewards [even if higher long-term costs] or delayed costs [even if much higher]) and option B (delayed rewards [even if much higher] or short-term costs [even if much lower than the long-term costs of option A]).
