You hear this claim a lot. The very tribal: the tea partiers who get affinity-fleeced by get-rich-quick schemes (the ones the liberals don't want you to know about, because they hate America); the struggling single mothers with serious health conditions, hanging on for their young children thanks to Obamacare subsidies, yet voting straight-ticket Republican; James Inhofe bringing a snowball to the Senate floor to prove that global warming is a conspiracy concocted by virtually 100% of the world's climate scientists.
Wait, isn't he the chairman of the Senate's Environment and Public Works Committee? Strange days indeed. Most peculiar, mama.
You can't reach them with evidence and logic. They are too tribal, too anti-science.
But is this really true? Is it true even of the only moderately tribal and unscientific, and to so hopeless a degree?
I'll get right to the punch line: First, most people are not that extremely tribal; the extremely tribal are a relatively small minority. Second, there are some studies showing that at least the non-extremely tribal and unscientific can often be swayed by logic, evidence, and science, at least if they're applied again and again, over the long run.
This recently came up in Paul Krugman's discussion of why he's often not so "polite" or "civil". John Quiggin makes some very good additional points, and he notes:
The implicit, and false, assumption in the anti-anti-anti-science position is that there is some better way of convincing the anti-science group to change their minds, for example by framing climate change in terms more congenial to political right-wingers. This is pretty clearly wrong. Long experience has shown that nothing is going to shift the right on an issue that has become a tribal shibboleth.
It is true that the more tribal people get, the more the brain goes out the window in making decisions and assessing reality. It's why the extreme right is so preyed upon by members of its own tribe. But what percentage of the population is that tribal?
Far less than a majority. For example, a 2013 Rasmussen poll has only 8% of likely voters saying they are Tea Party members, and a 2013 Gallup poll has just 22% saying they support the Tea Party movement.
It's true that our system still has some highly undemocratic flaws (although we've obviously come a long way from the days when only white male landowners could vote): a citizen in Wyoming has over 60 times the voting power in the Senate of a citizen in California; we have extreme gerrymandering; and we have Supreme Court justices who stop vote counting to install their candidate for President, and who allow massive money to bend political will with few and diminishing limits.
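That "over 60 times" figure is just back-of-the-envelope arithmetic. Using rough 2013-era Census estimates (numbers I'm supplying here for illustration, not drawn from the sources above):

\[
\frac{\text{California population}}{\text{Wyoming population}} \approx \frac{38{,}300{,}000}{583{,}000} \approx 66
\]

Since every state gets two senators regardless of population, a Wyoming resident's per-capita share of Senate representation is roughly 66 times a Californian's.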
But if the vast majority are only at most moderately tribal, they may still be eventually reachable by strong logic and evidence. And, a strong majority can still usually get what they want over the long run, at least if it's something they feel strongly about.
So can just the moderately tribal, the moderately resistant, be reached?
I was a top salesman at one time, and I'm a successful entrepreneur and hold an MBA from a top business school. So I have some knowledge and experience with sales and marketing, with persuasion. My conclusion is that often logic and evidence can be powerful if applied persistently and well, again and again and again, especially over the long run, even with people who are moderately tribal, or unscientific, or go largely by feel.
But what about the academic evidence? This is something I will eventually research thoroughly, but I did recently read something in support, from the Columbia Journalism Review:
So perhaps a single, credible refutation within a news article isn’t likely to convince people to change their views. But other research suggests that a constant flow of these kind of corrections could help combat misinformation. The theory is that the more frequently someone is exposed to information that goes against their incorrect beliefs, the more likely it is that they will change their views.
“It’s possible there is something to be said for persistence,” Reifler said. “At some point the cost of always being wrong or always getting information that runs counter to what you believe is likely to outweigh the cost of having to change your mind about something. We need to figure out what is the magic breaking or tipping point, or what leads people to get to that tipping point. I think we’re just scratching the surface.”
He pointed to a 2010 paper in Political Psychology by David P. Redlawsk and others, “The Affective Tipping Point: Do Motivated Reasoners Ever ‘Get It’?”
The researchers sought to determine if a tipping point exists that could cause voters to abandon motivated reasoning and view facts in a more rational way.
“We show experimental evidence that such an affective tipping point does in fact exist,” they write. “… The existence of a tipping point suggests that voters are not immune to disconfirming information after all, even when initially acting as motivated reasoners.”
This tipping point is far from being identified, but it's encouraging to think that repeated efforts to debunk misinformation, or simply to spread the truth, may have an effect.
So it's not at all a closed case that you simply can't reach partisans, or the tribal, or those with a large propensity to make their decisions based on feel.
At least if they're not the very hard core (and only a relatively small minority are), it's not clear that they're impervious to logic, evidence, and science. We should keep that in mind.
I'd also like to highlight a great point Quiggin made:
As Krugman points out, what matters is not the impact on the anti-science group themselves, but on the attitudes to that group among others. The recent measles epidemic didn’t have much of an impact on anti-vaxers, as far as I can see, but it certainly changed attitudes towards them, greatly reducing sympathy for their desire to pursue their deluded beliefs regardless of the risk to the rest of the community.
I think a good example is smoking. At one time, many people resisted the idea that tobacco was very harmful, despite already-overwhelming scientific evidence, and for many it was a tribal and political issue. But what happened? Those who held this unscientific view were shamed by the scientific evidence, again and again and again. Over time, even if the most hard-core never changed their views, they did die off. And the young, who are more open, grew up hearing the scientific evidence.
Moreover, the non-extremely tribal, who are the majority, and even "Balanced" journalists, grew more and more embarrassed not to acknowledge the overwhelming science (and see the last part of this). So over time it became an embarrassment to say that smoking is safe, that it's all just a conspiracy, and very few would say that today. Over time, the repeated hammering of logic, evidence, and science worked. You just can't expect immediate results.