Stop calling information you don’t like “fake news.” You’re giving TMI about how you think, and it isn’t good


I lived in Berkeley from the mid-seventies till the mid-eighties, and in that era it had four different communist student groups. One group I thought of as Stalinist (I think they called themselves Leninists)—whatever the USSR did or had done was entirely right. If you pointed out to them that the USSR was doing (or had done) something less than perfect (and, let’s be honest, that was pretty easy to do), they responded by saying, “That’s just capitalist propaganda!” Or, equally often, “Come the Revolution, motherfucker, you’re the first up against the wall.”

They often had facts to support their argument that US foreign policy was not as liberatory and high-minded as many Americans liked to claim (also pretty easy to have), but they wouldn't engage any information critical of the USSR. They refused to look at it, listen to it, or even consider it. They dismissed it as propaganda simply because it was information they didn't want to hear.

And what’s interesting about that response—that refusal to look at (or listen to) evidence that might trouble their beliefs—is that it showed weakness. It showed that, at some level, they knew their faith in the USSR couldn’t be defended through rational-critical argumentation.

In other words, they were making an interesting admission about the fragility of their beliefs: they couldn't argue for their position. That isn't how they framed it to themselves, of course; they told themselves, "I'm so right that I don't need to listen to anyone who disagrees."

But what does it mean to be so right that you can't even look at evidence that you might be wrong? If you're really right, then there's no harm in looking at that evidence. If you refuse to look at any evidence that you might be wrong, then you're admitting that your beliefs are rationally indefensible.

All of us have some beliefs that we hold that way—times when we're both anxious and out of control and decide that knowing bad news wouldn't make any difference. But some people approach all political issues (or all issues) with that "I believe what I believe, and I'm afraid to look at information that might prove me wrong" attitude.

This way of thinking about an issue is often called blind loyalty, but I think that metaphor is wrong for a lot of reasons. One of them is that people like the Stalinists are perfectly willing to look at information that says they're right or that the other group is wrong. They just won't look at information that is problematic for their position. I think it's better to call it blinkered loyalty, because it's as though people have put on the kind of blinkers that are put on horses to keep them from seeing anything other than what the rider wants them to see.

In this case, people put the blinkers on themselves, and calling something "capitalist propaganda" made the Stalinists feel better about wearing them.

I’m not saying we’re obligated to engage every person who disagrees with us, or to look at every piece of evidence they present—there are people and sources that aren’t worth engaging, such as “fake news” sites. “Fake news” was initially the term used for sites that openly identified themselves as providing fabricated information (if you looked at the whole page, you’d find something saying the site was “satire”).

The research suggested that Trump supporters (I'll say again, they were far from alone in this) didn't read the fine print; they didn't know they were passing along information that was obviously false, although they could have known had they looked at the sources carefully. But they didn't.

The more you think about politics as a question of loyalty, the more likely you are to accept as true anything that supports your in-group and to reject as untrue whatever is problematic for your in-group, simply on those grounds. You're likely to have blinkered loyalty.

I began this post with the example of a Stalinist because I wanted to emphasize that I don't think this way of thinking about politics is limited to "conservatives" (and I think the tendency to divide politics into a binary or a continuum is false and destructive, but that's a different post). Nor is it equally true of "both sides," because the range of political ideologies and commitments in the US is no more usefully divided into "left" and "right" than the world of animals is divided into "up" and "down."

I think it's more useful to think of politics in terms of a color spectrum, since that enables us to think about there being more than two political positions, as well as different degrees of commitment (how "saturated" a color is—the difference between a deep and a pale purple, for example). Some (such as the "political compass" site) use the continuum of "authoritarian" v. "libertarian" to describe a similar concept (but I've found that people get confused by those terms). People who are more on the authoritarian side of the continuum (the deeply "saturated") see politics as a question of in-group loyalty, consume only in-group media, change their beliefs not because of new information but because the in-group position has changed, reject the notion that any political position other than theirs might be legitimate, and are comfortable with the government silencing anyone who disagrees.

Because people who are heavily saturated get their information only from in-group sources, they are engaged in blinkered loyalty. But they don't think they are, because their media claim to give them "both sides"—that media spends a lot of time saying what "the other side" thinks. It's all straw men, of course, except when it's outright misrepresentation. It's inoculation. The surprising paradox is that heavily saturated people are not only the most politically engaged but also the most uninformed. They're very informed (armed, even) with data or talking points that support the in-group, but they're actively misinformed about out-group beliefs, and completely uninformed about weaknesses or flaws in in-group arguments, policies, or political figures.

So here is what I'm saying. Calling everything that disagrees with you "fake news" is no more rational than the Stalinist shouting "capitalist propaganda!" Rejecting any source that has disconfirming information, refusing to look at non-in-group sources, dismissing anything we don't want to hear—that's openly announcing one's place at the heavily saturated end of the color spectrum, announcing that the position is just blinkered loyalty.

