How not to make a Hitler analogy

Americans love the Hitler analogy, the claim that their political leader is just like Hitler. And it’s almost always very badly done: their leader (let’s call him Chester) is just like Hitler because… and then you get trivial characteristics, ones that don’t distinguish either Hitler or Chester from most political leaders (they were both charismatic, they used executive orders), or that flatten the characteristics that made Hitler extraordinary (Hitler was conservative). That process starts with deciding that Chester is evil, and Hitler is evil, and then looking for any ways that Chester is like Hitler. So, for instance, in the Obama-is-Hitler analogy, the argument was that Obama was charismatic, he had followers who loved him, he was clearly evil (to the person making the comparison; I’ll come back to that), and he maneuvered to get his way.

Bush was Hitler because he was charismatic, he had followers who loved him, he was clearly evil (to the people making the comparison), and he used his political powers to get his way. And, in fact, every effective political figure fits those criteria in that someone thought they were clearly evil: Lincoln, Washington, Jefferson, FDR, Reagan, Bush, and Trump, for instance.

“He was clearly evil.” In the case of Hitler, that means he killed six million Jews; in the case of Obama, it means he tried to reduce abortions in a way that some people didn’t like (he didn’t support simply outlawing them); in the case of Bush, it was that he invaded Iraq; for Lincoln, it was that he tried to end slavery; and so on. In other words, in the case of Hitler, every reasonable person agrees that the policies he adopted six or seven years into his time as Chancellor were evil. But not everyone who wants to reduce abortions to the medically necessary agrees that Obama’s policies were evil, and not everyone who wants peace in the Middle East agrees that Bush was evil.

So, what does it mean to decide a political leader is evil?

For instance, people who condemned Obama as evil often did so on grounds that would make Eisenhower and Nixon evil (support for the EPA, heavy funding for infrastructure, high corporate taxes, a social safety net that included some version of Medicare, secular public education), and many on grounds that would make Eisenhower, Nixon, Reagan, and the first Bush evil (faith in social mobility, protection of public lands, promoting accurate science education, support for the arts, an independent judiciary, funding for infrastructure, good relations with other countries, the virtues of compromise). So, were the people condemning Obama as evil doing so on grounds that would cause them to condemn GOP figures as evil? No; their standards didn’t apply to figures they liked. It was just a way of saying he wasn’t GOP.

Every political figure has some group of people who sincerely believe that leader is obviously evil. And every political figure who gets to be President has mastered the arts of being charismatic (not everyone gets power from charismatic leadership, but that’s a different post), compromising, manipulating, and engaging followers. So, is every political leader just like Hitler?

Unhappily, we’re in a situation in which people make the Hitler analogy to everyone else in their informational cave, and the people in that cave think it’s obviously a great analogy. Since we’re in a culture of demagoguery in which every disagreement is a question of good (our political party) or evil (their political party), any effective political figure of theirs is Hitler.

We’re in a culture in which a lot of media says, relentlessly, that all political choices are between a policy agenda that is obviously good and a policy agenda that is obviously evil, and, therefore, nothing other than the complete triumph of our political agenda is good. That’s demagoguery.

The claim that “he was clearly evil” is important because it raises the question of how we decide whether something is true or not. And that is the question in a democracy. The basic principle of a democracy is that there is a kind of common sense, that most people make decisions about politics in a reasonable manner, and that we all benefit because we get policies that are the result of the input of different points of view. Democracy is a politics of disagreement. But, if some people are supporting a profoundly anti-democratic leader, who will use the power of government to silence and oppress, then we need to be very worried. So the question of whether we are democratically electing someone who will, in fact, make our government an authoritarian one-party state is important. But how do you know that your perception that this leader is just like Hitler is reasonable? What is your “truth test” for that claim?

1. Truth tests, certainty, and knowledge as a binary

Talking about better and worse Hitler analogies requires a long digression into truth tests and certainty, for two reasons. First, the tendency to perceive their effective political leaders as evil because their policies are completely evil is based on, and reinforces, the tendency to think of political questions as choices between obvious good and obvious evil; that perception is reinforced by, and reinforces, what I’ll explain as the two-part simple truth test (does this fit with what I already believe, and do reliable authorities say this claim is true). Second, believing that all beliefs and claims can be divided into obvious binaries (you are certain or clueless, something is right or wrong, a claim is true or false, there is order or chaos) correlates strongly with authoritarianism, and one of the most important qualities of Hitler was that he was authoritarian (and that’s where a lot of these analogies fail: neither Obama nor Bush was an authoritarian).

And so, ultimately, as the ancient Greeks realized, any discussion about democracy quickly gets to the question of how common people make decisions as to whether various claims are true or false. Democracies fail or thrive on the single point of how people assess truth. If people believe that only their political faction has the truth and every other political faction is evil, then democracies collapse and we have an authoritarian leader. Hitlers arise when people abandon democratic deliberation.

That’s the most important point about Hitler: leaders like Hitler come about because we decide that diversity of opinion weakens our country and is unnecessary.

The notion that authoritarian governments arise from assumptions about how people argue might seem counterintuitive, since that seems like some kind of pedantic question only interesting to eggheads (not what you believe, but how you believe beliefs work) and therefore off the point. But, actually, it is the point: democracies turn into authoritarian systems under some circumstances and thrive under others, and which happens depends on what is seen as the most sensible way to assess whether a claim is true or not. The difference between democracy and authoritarianism lies in that practice of testing claims: truth tests.

For instance, some sources say that Chester is just like Hitler, and other sources say that Hubert is just like Hitler. How do you decide which claim is true?

One truth test is simple, and it has two parts: Does perceiving Chester as just like Hitler fit with what you already believe? Do sources you think are authorities tell you that Chester is just like Hitler? Let’s call this the simple two-part truth test, and the people who use it simple truth-testers.

Sometimes it looks as though there is a third part (but it’s really just the first reworded): Can I find evidence to show that Chester is just like Hitler?

For many people, if they can confirm a claim through those three tests (does it fit what I believe, do authorities I trust say that, can I find confirming evidence), then they believe the claim is rational.

(Spoiler alert: it isn’t.)

That third question is really just the same as the first two. If you believe something—anything, in fact—then you can always find evidence to support it. If you are really interested in knowing whether your beliefs are valid, then you shouldn’t look to see whether there is evidence to support what you believe; you should look to see whether there is evidence that you’re wrong. If you believe that someone is mad at you, you can find a lot of evidence to support that belief—if they’re being nice, they’re being too nice; if they’re quiet, they’re thinking about how angry they are with you. You need to think about what evidence you would believe to persuade you they aren’t mad. (If there is none, then it isn’t a rational belief.) So, those three questions are two: does a claim (or political figure) confirm what I believe; do the authorities I trust confirm this claim (or political figure)?

Behind those two questions is a background issue of what decisions look like. Imagine that you’re getting your hair cut, and the stylist says you have to choose between shaving your head or not cutting your hair at all—how do you decide whether that person is giving you good advice?

And behind that is the question of whether it’s a binary decision: how many choices do you have? Is the stylist open to other options? Do you have other options? Once the stylist has persuaded you that you either do nothing to your hair or shave it, then all he has to do is explain what’s wrong with doing nothing. And you’re trapped by a logical fallacy, because leaving your hair alone might be a mistake, but that doesn’t actually mean that shaving your head is a good choice. People who can’t argue for their policy like the fallacy of the false division (the either/or fallacy) because it hides the fact that they can’t persuade you of the virtues of their policy.

The more that you believe every choice is between two absolutely different extremes, the more likely it is that you’ll be drawn to political leaders, parties, and media outlets that divide everything into absolutely good and absolutely bad.

It’s no coincidence that people who believe that the simple truth test is all you need also insist (sometimes in all caps) that anyone who says otherwise is a hippy-dippy postmodernist. For many people, there is an absolute binary in everything, including how to look at the world: either you can look and make a judgment easily and clearly, or else you’re saying that any kind of knowledge at all is impossible. And what you see is true, obviously, so anyone who says that judgment is vexed, flawed, and complicated is a dithering weeny. They say that, for a person of clear judgment, the right course of action in all cases is obvious and clear. It’s always black (bad) or white (good, and what they see). Truth tests are simple, they say.

In fact, even the people who insist that the truth is always obvious and it’s all black or white go through their day in shades of grey. Imagine that you’re a simple truth tester. You’re sitting at your computer and you want an ‘e’ to appear on your screen, so you hit the ‘e’ key. And the ‘e’ doesn’t appear. Since you believe in certainty, and you did not get the certain answer you predicted, are you now a hippy-dippy relativist postmodernist (had I worlds enough and time I’d explain why that term is incredibly sloppy and just plain wrong) who is clueless? Are you paralyzed by indecision? Do you now believe that all keys can do whatever they want and there is no right or wrong when it comes to keys?

No, you decide you didn’t really hit the ‘e’ or your key is gummed up or autocorrect did something weird. When you hit the ‘e’ key, you can’t be absolutely and perfectly certain that the ‘e’ will appear, but that’s probably what will happen, and if it doesn’t you aren’t in some swamp of postmodern relativism and lack of judgment.

Your experience typing shows that the binary promoted by a lot of media between absolute certainty and hippy-dippy relativism is a sloppy social construct. They want you to believe it, but your experience of typing, or of making any other decision, shows it’s a false binary. You hit the ‘e’ key, and you’re pretty near certain that an ‘e’ will appear. But you also know it might not, and you won’t collapse into a cold sweat of clueless relativism if it doesn’t. You’ll clean your keyboard.

It’s the same situation with voting for someone, marrying someone, buying a new car, making dinner, painting a room. You can feel certain in the moment that you’re making the right decision, but any honest person has to admit that there are lots of times we felt totally and absolutely certain and turned out to have been mistaken. Feeling certain and being right aren’t the same thing.

That isn’t to say that the hippy-dippy relativists are right and all views are equally valid and there is no right or wrong—it’s to say that the binary between “the right answer is always obviously clear” and hippy-dippy relativism is wrong. For instance, in terms of the assertion that many people make that the distinction between right and wrong is absolutely obvious: is killing someone else right or wrong? Everyone answers that it depends. So, does that mean we’re all people with no moral compass? No, it means the moral compass is complicated, and takes thought, but it isn’t hopeless.

Our world is not divided into being absolutely certain and being lost in clueless hippy-dippy relativism. But, and this is important, that is the black and white world described by a lot of media: if you don’t accept their truth, then you’re advocating clueless postmodern relativism. What those media say is that what you already believe is absolutely true, and, they say, if it turns out to be false, you never believed it, and they never said it. (The number of pundits who advocated the Iraq invasion and then claimed they were opposed to it all along is stunning. Trump’s claiming he never supported the invasion fits perfectly with what Philip Tetlock says about people who believe in their own expertise.)

And the message that you have been and always will be right is a lovely, comforting, pleasurable message to consume. It is the delicate whipped cream of citizenship: that you, and people like you, are always right, and never wrong, and you can just rely on your gut judgment. Of course, the same media that says it’s all clear has insisted that something is absolutely true that turned out not to be (Saddam Hussein has weapons of mass destruction, voting for Reagan will lead to the people’s revolution, Trump will jail Clinton, Brad Pitt is getting back together with Angelina Jolie, studies show that vaccines cause autism, the world will end in 1987). The paradox is that people continue to consume and believe media who have been wrong over and over, and yet are accepted as trusted authorities because they have sometimes been right, or, more often, because, even if wrong, what they say is comforting and reassuring.

But, what happens when media say that Trump has a plan to end ISIS and then it turns out his plan is to tell the Pentagon to come up with a plan? What happens when the study that people cite to say autism is caused by vaccines turns out to be fake? Or, as Leon Festinger famously studied, what happens when a religion says the world will end, and it doesn’t? What happens when something you believe that fits with everything else you believe and is endorsed by authorities you believe turns out to be false? You could decide that maybe things aren’t simple choices between obviously true and obviously false, but that isn’t generally what people do. Instead, we recommit to the media because now we don’t want to look stupid.

Maybe it would be better if we all just decided that complicated issues are complicated, and that’s okay.

There are famous examples that show the simple truth test—you can just trust your perception—is wrong.


If you’re looking at paint swatches, and you want a darker color, you can look at two colors and decide which is darker. You might be wrong: there are famous optical illusions demonstrating our tendency to interpret color by context.

Those examples look like special cases, and they (sort of) are: if you know that you have a dark grey car, and there is a grey and dark grey car in the parking lot, you don’t stand in the parking lot paralyzed by not knowing which car is yours because you saw something on the internet that showed your perception of darkness might be wrong. That experiment shows you might be entirely wrong, but you will not go on in your life worrying about it.

But you have been wrong about colors. And we’ve all tried to get into the wrong car, but in those cases we get instant feedback that we were wrong. With politics it’s more complicated, since media that promoted what turns out to have been a disastrous decision can insist they never promoted it (when Y2K turned out not to be a thing, various radio stations that had been fear-mongering about it just never mentioned it again), claim it was the right decision, or blame it on someone else. They can continue to insist that their “truth” is always the absolutely obvious decision and that there is a binary between being certain and being clueless. But, in fact, our operative truth test in the normal daily decisions we make is one that involves skepticism and probability. Sensible people don’t go through life with a yes/no binary. We operate on the basis of a yes/various-degrees-of-maybe/no continuum.

What’s important about optical illusions is that they show that the notion central to a lot of argutainment—that our truth tests for politics should involve being absolutely certain that our group is right or else you’re in the muck of relativistic postmodernism—isn’t how we get through our days. And that’s important. Any medium, any pundit, any program, that says that decisions are always between us and them is lying to us. We know, from decisions about where to park, what stylist to use, what to make for dinner, how to get home, that it isn’t about us vs. them: it’s about making the best guesses we can. And we’re always wrong eventually, and that’s okay.

We tend to rely on what social psychologists call heuristics (meaning mental shortcuts) because we can’t thoroughly and completely think through every decision. For instance, if you need a haircut, you can’t possibly thoroughly investigate every single option you have. You’re likely to have a method for reducing the uncertainty of the decision: you rely on reviews, you go where a friend goes, you just pick the closest place. If a stylist says you have to shave your head or do nothing, you’ll walk away.

You might tend to have the same thing for breakfast, or generally take the same route to work, campus, the gym. Your route will not be the best choice some percentage of the time because traffic, accidents, or some random event will make your normal route slower than others from time to time (if you live in Austin, it will be wrong a lot). Even though you know that you can’t be certain you’re taking the best route to your destination, you don’t stand in your apartment doorway paralyzed by indecision. You aren’t clueless about your choices—you have a lot of information about what tends to work, and what conditions (weather, a football game, time of day, local music festivals, roadwork) are likely to introduce variables in your understanding of what is the best route. You are neither certain nor clueless.

And there are dozens of other decisions we make every day that are in that realm of neither clueless nor certain: whether you’ll like this movie, if the next episode of a TV program/date/game version/book in a series/cd by an artist/meal at a restaurant will be as good as the last, whether your boss/teacher will like this paper/presentation as much as the previous, if you’ll enjoy this trip, if this shirt will work out, if this chainsaw will really be that much better, if this mechanic will do a good job on your car, if this landlord will not be a jerk, if this class/job will be a good one.

We all spend all of our time in a world in which we must manage uncertainty and ambiguity, but some people get anxious when presented with ambiguity and uncertainty, and so they talk (and think) as though there is an absolute binary between certain and clueless, and every single decision falls into one or the other.

And here things get complicated. The people who don’t like uncertainty and ambiguity (they are, as social psychologists say, “drawn to closure”) will insist that everything is this or that, black or white, even though, in fact, they continually manage shades of grey. They get in the car or walk to the bus feeling certain that they have made the right choice, when their choice is just habit, or the best guess, or somewhere on that range of more or less ambiguous.

So, there is a confusion between certainty as a feeling (you feel certain that you are right) and certainty as a reasonable assessment of the evidence (all of the relevant evidence has been assessed and alternative explanations disproven)—as a statement about the process of decision-making. Most people use it in the former way, but think they’re using it in the latter, as though the feeling of certainty is correlated to the quality of evidence. In fact, how certain people feel is largely a consequence of their personality type (On Being Certain has a great explanation of that, but Tetlock’s Expert Political Judgment is also useful). There’s also good evidence that the people who know the most about a subject tend to express themselves with less certainty than people who are un- or misinformed (the “Dunning-Kruger effect”).

What all that means is that people who get anxious in the face of ambiguity and uncertainty resolve that anxiety by feeling certain, and using a rigid truth test. So, the world isn’t rigidly black or white, but their truth test is. For instance, it might have been ambiguous whether they actually took the best route to work, but they will insist that they did, and that they obviously did. They managed uncertainty and ambiguity by denying it exists. This sort of person will get actively angry if you try to show them the situation is complicated.

They manage the actual uncertainty of situations by, retroactively, saying that the right answer was absolutely clear.[1] That sort of person will say that a “truth test” is simply asking yourself whether something is true or not. Let’s call that the simple truth test, and the people who use it simple truth-testers.

The simple truth test has two parts: first, does this claim fit with what I already believe? and, second, do authorities I consider reliable promote this claim?

People who rely on this simple truth test say it works because, they believe, the true course of action is always absolutely clear, and, therefore, it should be obvious to them, and it should be obvious to people they consider good. (It shouldn’t be surprising that they deny having made mistakes in the past, simply refashioning their own history of decisions—try to find someone who supported the Iraq invasion or was panicked about Y2K.)

The simple truth test is comfortable. Each new claim is assessed in terms of whether it makes us feel good about things we already believe. Every time we reject or accept a claim on the basis of whether it confirms our previous beliefs, we confirm our sense of ourselves as people who easily and immediately perceive the truth. Thus, this truth test isn’t just about whether the new claim is true, but about whether we and people like us are certainly right.

The more certain we feel about a claim, the less likely we are to double-check whether we were right, and the more likely we are to find ways to make ourselves have been right. Once we get to work, or the gym, or campus, we don’t generally try to figure out whether we really did take the fastest route unless we have reason to believe we might have been mistaken and we’re the sort of person willing to consider that we might have been mistaken.

There’s a circle here, in other words: the sort of person who believes that there is a binary between being certain and being clueless, and who is certain about all of her beliefs, is less likely to do the kind of work that would cause her to reconsider her sense of self and her truth tests. Her sense of herself as always right appears to be confirmed because she can’t think of any time she has been wrong. Because she never looked for such a time.

Here I need to make an important clarification: I’m not claiming there is a binary between people who believe you’re either certain or clueless and people who believe that mistakes in perception happen frequently. It’s more of a continuum, but a pretty messy one. We’re all drawn to black or white thinking when we’re stressed, frightened, threatened, or trying to make decisions with inadequate information. Most people have some realms or sets of claims they think are certain (this world is not a dream, evolution is a fact, gravity happens). Some people need to feel certain about everything, and some people don’t need to feel certain much at all, and a lot of people feel certain about many things but not everything.

Someone who believes that her truth tests enable certainty on all or most things will be at one end of the continuum, and someone who managed to live in a constant state of uncertainty would be at the other. Let’s call the person at the “it’s easy to be certain about almost everything important” end the authoritarian (I’ll explain the connection better later).

Authoritarians have trouble with the concept of probabilities. For instance, if the weather report says there will be rain, that’s a yes/no. And it’s proven wrong if the weather report says yes and there is no rain. But if the weather report says there is a 90% chance of rain and it doesn’t rain, the report has not been proven wrong.

Authoritarians believe that saying there is a 90% chance is just a skeezy way to avoid making a decision—that the world really is divided into yes or no, and some people just don’t want to commit. And they consume media that says exactly that.

This is another really important point: many people spend their time consuming media that says that every decision is divided into two categories: the obviously right decision and the obviously wrong one. And that media says that anyone who says that the right decision might be ambiguous, unclear, or a compromise is promoting relativism or postmodernism. So, as those media say, you’re either absolutely clear or you’re deep in the muck of clueless relativism. Authoritarians who consume that media are like the example above of the woman who believes that her certainty is always justified because she never checks to see whether she was wrong. They live in a world in which their “us” is always right, has always been right, and will always be right, and the people who disagree are wrong-headed ditherers who pretend that it’s complicated because they aren’t man enough to just take a damn stand.

(And, before I go on, I should say that, yes, authoritarianism isn’t limited to one political position—there are authoritarians all over the map. But, that isn’t to say that “both sides are just as bad” or authoritarianism is equally distributed. The distribution of authoritarianism is neither a binary nor a constant; it isn’t all on one side, but it isn’t evenly distributed.)

I want to emphasize that the authoritarian view (that you’re certain or clueless) is often connected to a claim that people are either authoritarians or relativists (or postmodernists or hippies), because there are two odd things about that insistence. First, a point I can’t pursue here: authoritarians rarely stick to principles across situations, and so end up fitting their own definition of relativist/postmodernist. (Briefly, what I mean is that authoritarians put their group first, and say their group is always right, so they condemn behavior in them that they praise or justify in us. In other words, whether an act is good or bad is relative to whether it’s done by us or them—that’s moral relativism. So, oddly enough, you end up with moral relativism attacked by people who engage in it.) Second, even authoritarians actually make decisions in a world of uncertainty and ambiguity, and don’t use the same truth test for all situations. When their us turns out to be wrong, they will claim the situation was ambiguous, there was bad information, everyone makes mistakes, and then go on to insist that all decisions are unambiguous.

So, authoritarians say that all decisions are clear, except when they aren’t, and that we are always right, except when we aren’t. But those unclear situations and mistakes should never be taken as reasons to be more skeptical in the future.

2. Back to Hitler

Okay, so how do most people decide whether their leader is like Hitler? (And notice that it is never about whether our leader is like Hitler.) If you believe in the simple two-part truth test, then you ask yourself whether their leader seems to you to be like Hitler, and whether authorities you trust say he is. And you’re done.

But what does it mean to be like Hitler? What was Hitler like?

There is the historical Hitler who was, I think, evil, but didn’t appear so to many people, and who had tremendous support from a lot of authoritarians, and there is the cartoon Hitler. Hitler was evil because he tried to exterminate entire peoples (and he started an unnecessary war, but that’s often left out). The cartoon version assumes that his ultimate goals were obvious to everyone from the beginning—that he came on the scene saying “Let’s try to conquer the entire world and exterminate icky people” and always stuck to that message, so that everyone who supported him knew they were supporting someone who would start a world war and engage in genocide.

But that isn’t how Hitler looked to people at the time. Hitler didn’t come across as evil, even to his opponents (except to the international socialists), until the Holocaust was well under way. Had he come across as evil, he would never have gotten into power. While Mein Kampf and his “beerhall” speeches were clearly eliminationist and warmongering, once he took power his recorded and broadcast speeches never mentioned extermination and were about peace. (According to Letters to Hitler, his supporters were unhappy when he started the war.) Hitler had a lot of support, of various kinds, and his actions between 1933 and 1939 actually won over a lot of people, especially conservatives and various kinds of nationalists, who had been skeptical of or even hostile to him before 1933. His supporters ranged from the fans (the true believers), through conservative nationalists who wanted to stop Bolshevism and reinstate what they saw as “traditional” values, and conservative Christians who objected to some of his policies but also liked a lot of them (such as his promotion of traditional roles for women, his opposition to abortion and birth control, his demonizing of homosexuality), to people of various political ideologies who liked that (they thought) he was making Germany respected again, had improved the economy, had ended the bickering and instability they associated with democratic deliberation, and was undoing a lot of the shame associated with the Versailles Treaty.

Until 1939, to his fans, Hitler came across as a truth-teller, willing to say politically incorrect things (that “everyone” knew were true), cut through all the bullshit, and be decisive. He would bring honor back to Germany and make it the military powerhouse it had been in recent memory; he would sideline the feckless and dithering liberals, crush the communists, and deal with the internal terrorism of the large number of immigrants in Germany who were stealing jobs, living off the state, and trying to destroy Germany from within; he would clean out the government of corrupt industrialists and financiers who were benefitting from the too-long deliberations and innumerable regulations. He would be a strong leader who would take action and not just argue and compromise like everyone else. He didn’t begin by imprisoning Jews; he began by making Germany a one-party state, and that involved jailing his political opponents.

Even to many people willing to work with him, Hitler came across as crude, as someone pandering to popular racism and xenophobia, a rabble-rouser who made absurd claims, who didn’t always make sense, and whose understanding of the complexities of politics appeared minimal. But conservatives thought he would enable them to put together a coalition that would dominate the Reichstag (the German Congress, essentially), and they could thereby get through their policy agenda. They thought they could handle him. While they granted that he had said some pretty racist and extreme things (especially his hostility to immigrants and non-Christians, although his own record on Christian behavior wasn’t exactly great), they thought that was rabble-rousing he didn’t mean, a rhetoric he could continue to use to mobilize his base for their purposes, or that he could be their pitbull whom they could keep on a short chain. He instantly imposed a politically conservative social agenda that made a lot of conservative Christians very happy: he was relentless in his support for the notions that men earn money and women work in the home, that homosexuality and abortion are evil [2], and that sexual immorality weakens the state, and his rhetoric was always framed in “Christian terms” (as Kenneth Burke famously argued, his rhetoric was a bastardization of Christian rhetoric, but it still relied on Christian tropes).

Conservative Christians (Christians in general, to be blunt) had a complicated reaction to him. Most Christian churches of the era were anti-Semitic, and that took various forms. There were the extreme forms—the passion plays that showed Jews as Christ-killers, who killed Christians for their blood at Passover, even religious festivals about how Jews stabbed consecrated hosts (some of which only ended in the 1960s).

There were also the “I’m not racist but” versions of Christian anti-Semitism promoted by Catholic and Protestant organizations (all of this is elegantly described in Antisemitism, Christian Ambivalence, and the Holocaust). Mainstream Catholic and Lutheran thought promoted the notion that Jews were, at best, failed Christians, and that the only reason not to exterminate them was so that they could be converted. There was, in that world, no explicit repudiation of the sometimes pornographic fantasies of greedy Jews involved in worldwide conspiracies, stabbing the host, drinking the blood of Christian boys at Passover, and plotting the downfall of Germany. And there was certainly no sense that Christians should tolerate Jews in the sense of treating them as we would want to be treated; it simply meant that they shouldn’t be killed. As Ian Kershaw has shown, a lot of German Christians didn’t bother themselves about the oppression (even killing) of Jews, as long as it happened out of their ken; they weren’t in favor of killing Jews, but, as long as they could ignore that it was happening, they weren’t going to do much to protest (Hitler, The Germans, and the Final Solution).

Many of his skeptics (even international ones) were won over by his rhetoric. His broadcast speeches emphasized his desire for peace and prosperity. They liked that he talked tough about Germany’s relations with other countries (but didn’t think he’d lead them into war); they loved that he spent so much of his own money doing good things for the country (in fact, he got far more money out of Germany than he put into it, and he didn’t pay taxes—for more on this, see Hitler at Home); and they loved that he had the common touch, and didn’t seem to be some inaccessible snob or aristocrat, but a person who really understood them (Letters to Hitler is fascinating for showing the support he had). They believed that he would take a strong stance, be decisive, look out for regular people, clear the government of corrupt relationships with financiers, silence the kind of people who were trying to drag the nation down, and cleanse the nation of that religious/racial group that was essentially ideologically committed to destroying Germany.

There were a lot of people who thought Hitler could be controlled and used by conservative forces (von Papen, for instance), or who thought he was a joke. In middle school, I had a teacher who had been part of the Berlin intelligentsia before and during the war, and when asked why people like her didn’t do more about Hitler, she said, “We thought he was a fool.” Many of his opponents thought he would never get elected, never be given a position of power.

But still, some students say, you can see in his early rhetoric that there was a logic of extermination. And, yes, I think that’s true, but, and this is important, what makes you think you would have seen it? Smart people at the time didn’t see it, especially since, once he got a certain level of attention, he only engaged in dog whistle racism. Look, for instance, at Triumph of the Will—the brilliant film of the 1934 Nazi rally in Nuremberg—in which anti-Semitism appears absent. The award-winning movie convinced many that Hitler wasn’t really as anti-Semitic as Mein Kampf might have suggested. But, by 1934, true believers had learned their whistles—everything about bathing, cleansing, purity, and health was a long blow on the dog whistle of “Jews are a disease on the body politic.” Hitler’s first speech on the dissolution of the Reichstag (March 1933) never used the word Jew, and looked reasonable (he couldn’t control himself, however, and went back to his non-dog whistle demagoguery in what amounted to the question and answer period—Kershaw’s Hubris describes the whole event).

We focus on Hitler’s policy of extermination, but we don’t always focus enough on his foreign policy, especially between 1933 and 1939. Just as we think of Hitler as a raging antisemite (because of his actions), so we think of him as a warmonger, and he was both, at heart and eventually, but he managed not to look that way for years. That’s really, really important to remember. He took power in 1933, and didn’t show his warmongering card till 1939. He didn’t show his exterminationist card till even later.

Hitler’s foreign policy was initially tremendously popular because he insisted that Germany was being ill-treated by other nations, was carrying a disproportionate burden, and was entitled to things it was being denied. Hitler said that Germany needed to be strong, more nationalist, more dominating, more manly in its relations with other nations. Germany didn’t want war, but it would, he said, insist upon respect.

Prior to being handed power, Hitler talked like an irresponsible warmonger and raging antisemite (especially in Mein Kampf), but his speeches right up until the invasion of Poland were about peace, stability, and domestic policies to help the common working man. Even in 1933-4, the Nazi Party could release a pamphlet of his speeches with the title Germany Desires Work and Peace.

What that means is that from 1933 to 1939 Hitler managed a neat rhetorical trick, and he did it by dog whistles: he persuaded his extremist supporters that he was still the warmongering raging antisemite they had loved in the beerhalls and for whom Streicher was a reliable spokesman, and he persuaded the people frightened by his extremism that he wasn’t that guy, he would enable them to get through their policy agenda. (His March 1933 speech is a perfect example of this nasty strategy, and some day I intend to write a long close analysis of it.)

And even many of the conservatives who were initially deeply opposed to him came around because he really did seem to be effective at getting real results. He got those results by mortgaging the German economy, and setting up both a foreign policy and economic policy that couldn’t possibly be maintained without massive conquest; it had short-term benefits, but was not sustainable.

Hitler benefitted from the culture of demagoguery of Weimar Germany. After Germany lost WWI, the monarchy was ended, and a democracy was imposed. Imposing democracy is always vexed, and it doesn’t always work, because democracy depends on certain cultural values (a different post). One of those values is seeing pluralism—that is, diversity of perspective, experience, and identity—as a good thing. If you value pluralism, then you’ll tend to value compromise. If you believe that a strong community has people with different legitimate interests, points of view, and beliefs, then you will see compromise as a success. If, however, you’re an authoritarian, and you believe that you and only you have the obvious truth and everyone else is either a knave or a fool, then you will see refusing to compromise as a virtue.

And then democracy stalls. It doesn’t stall because it’s a flawed system; it stalls when people reject the basic premises of democracy, when, despite how they make decisions about how to get to work in the morning, or whether to take an umbrella, they insist that all decisions are binaries between what is obviously right (us) and what is obviously wrong (them).

And, in the era after WWI, Germany was a country with a democratic constitution but a rabidly factionalized set of informational caves. People could (and did) spend all their time getting information from media that said that all political questions are questions of good (us) and evil (them). Those media promoted conspiracy theories—the Protocols of the Elders of Zion, for instance—insisted on the factuality of non-events, framed all issues as apocalyptic, and demonized compromise and deliberation. They said it’s a binary. The International Socialists said the same thing, that anything other than a workers’ revolution now was fascism, that the collapse of democracy was great because it would enable the revolution. Monarchists wanted the collapse of democracy because they hoped to get a monarchy back, and a non-trivial number of industrialists wanted democracy to collapse because they were afraid people would vote for a social safety net that would raise their taxes.

It was a culture of demagoguery.

But, in the moment, large numbers of people didn’t see it that way because, if you were in a factional cave, and you used the two-step test, everything you heard in your cave would seem to be true. Everything you heard about Hitler would fit with what you already believed, and it was being repeated by people you trusted.

Maybe what you heard confirmed that he would save Germany, that he was a no-bullshit decisive leader who really cared about people like you and was going to get shit done, or maybe what you heard was that he was a tool of the capitalists and liberals and that you should refuse to compromise with them to keep him out of power. Whether what you heard was that Hitler was awesome or that he was completely wrong, what you heard was that he was obviously one or the other, and that anyone who disagreed with you was evil. What you heard was that disagreement itself was proof that evil was present. And what you heard was that democracy was a failure.

And that helped Hitler, even the attacks on him. As long as everyone agreed that the truth is obvious, that disagreement is a sign of weakness, and that compromise is evil, an authoritarian like Hitler would come along and win.

There were a lot of people who more or less supported the aims he said he had—getting Germany to have a more prosperous economy, fighting Bolshevism, supporting the German church, avoiding war, renegotiating the Versailles Treaty, purifying Germany of anti-German elements, making German politics more efficient and stable—but who thought Hitler was a loose cannon and a demagogue. Many of those were conservatives and centrists.

And, once Hitler was in power, they watched him carefully. And, really, all his public speeches, especially any that might get international coverage, weren’t that bad. They weren’t as bad as his earlier rhetoric. There wasn’t as much explicit anti-Semitism, for instance, and, unlike in Mein Kampf, he didn’t advocate aggressive war. He said, over and over, that he wanted peace. He immediately took over the press, but, still and all, every reader of his propaganda could believe that Hitler was a tremendously effective leader, and, really, by any standard he was: he effected change.

There wasn’t, however, much deliberation as to whether the changes he effected were good. He took a more aggressive stance toward other countries (a welcome change from the loser stance adopted from the end of WWI, which, technically, Germany did lose), he openly violated the deliberately shaming aspects of the Versailles Treaty, he appeared to reject the new terms of the capitalism of the era (he met with major industrial leaders and claimed to have reached agreements that would help workers), he reduced disagreement, he imprisoned people who seemed to many people to be dangerous, and he enacted laws that promoted the cultural “us” and disenfranchised “them.” And he said all the right things. At the end of his first year, Germany published a pamphlet of his speeches, with the title “The New Germany Desires Work and Peace.” So, by the simple two-part truth test (do the claims support what you already believe? do authorities you trust confirm these claims?), Hitler’s rhetoric would look good to a normal person in the 30s. Granted, his rhetoric was always authoritarian—disagreement is bad, pluralism is bad, the right course of action is always obvious to a person of good judgment, you should just trust Hitler—but it would have looked pretty good through the 30s. A person using that third test—can I find evidence to support these claims?—would have felt that Hitler was pretty good.

3. So, would you recognize Hitler if you liked what he was saying?

What I’m trying to say is that asking the question “Is their political leader just like Hitler?” will go just about as wrong as it can go as long as you’re relying on simple truth tests.

If you get all your information from sources you trust, and you trust them because what they say fits in with your other beliefs, then you’re living in a world of propaganda.

If you think that you could tell if you were following a Hitler because you’d know he was evil, and you are in an informational cave that says all the issues are simple, good and evil are binaries and easy to tell one from another, there is either certainty or dithering, disagreement and deliberation are what weak people do, compromise is weakening the good, and the truth in any situation is obvious, then, congratulations, you’d support Hitler! Would you support the guy who turned out to start a disastrous war, bankrupt his nation, commit genocide? Maybe—it would just be random chance. Maybe you would have supported Stalin instead. But you would definitely have supported one or the other.

Democracy isn’t about what you believe; it’s about how you believe. Democracy thrives when people believe that they might be wrong, that the world is complicated, that the best policies are compromises, that disagreement can be passionate, nasty, vehement, and compassionate, and that the best deliberation comes when people learn to perspective shift. Democracy requires that we lose gracefully, and it requires, above all else, that we don’t assess policies purely on whether they benefit people like us, but that we think about fairness across groups. It requires that we do unto others as we would have them do unto us, that we pass no policy that we would consider unfair if we were in all the possible subject positions of the policy. Democracy requires imagining that we are wrong.

[1] That sort of person often subscribes to the “just world model” or “just world hypothesis,” which is the assumption that we are all rewarded in this world for our efforts. If something bad happens to you, you deserved it. People who claim that this is Scriptural will cherry-pick quotes from Proverbs, ignoring what Jesus said about rewards in this world, as well as various other important parts of Scripture (Ecclesiastes, Job, Paul).

[2] There is a meme circulating that Hitler was pro-abortion. His public stance was opposition to abortion at least through the thirties. Once the genocides were in full swing, Nazism supported abortion for “lesser races.”