Sources of Bias in Reasoning – Part 5
Why myside bias is worse among self-styled intellectual elites.
This essay continues our examination of what Keith Stanovich, in his book The Bias That Divides Us: The Science and Politics of Myside Thinking, calls “myside bias,” namely our tendency to “evaluate evidence, generate evidence, and test hypotheses in a manner biased toward our own prior beliefs, opinions, and attitudes.” As explored in the previous essay, an American culture that came to embrace Martin Luther King Jr.’s vision of a colorblind society that evaluated people and policies independently of their immutable characteristics has since, within a couple of generations, come to adopt a diametrically opposed vision that evaluates people and policies explicitly on the basis of race and other immutable characteristics. Such diametrically opposed visions cannot result from any sort of logical, syllogistic reasoning. So what caused the change? As Stanovich points out, it was influenced by cultural elites who lead universities and political parties. And as we will explore in this essay, such elite institutions are particularly subject to myside bias.
In a previous essay in this series, we explored how our convictions are often arrived at independently of rational argument. And as Stanovich writes:
The results of studies showing how the convictions that drive myside bias may have been acquired unreflectively converge nicely with those of studies [that show] that the avoidance of myside bias fails to correlate with intelligence. A sign that our convictions are acquired reflectively would be if a primary mechanism leading to well-calibrated opinions—the avoidance of myside bias—were strongly correlated with intelligence. The lack of such a correlation itself must raise serious questions about how reflective we are in acquiring our convictions.
Now, one might think that someone who is better educated, or better at math, would be better able to “think straight,” and much better at countering the effects of “myside” thinking in their own assessments. But a substantial body of research shows that’s not the case at all. As Stanovich writes:
the myside bias [found] was not any stronger in the subjects in the sample of lower cognitive ability than it was in those of higher cognitive ability … The failure of intelligence to attenuate myside bias extends to variables that are closely related to intelligence such as numeracy, scientific literacy, and general knowledge.
Indeed, people who are smarter or better at math might be at a particular disadvantage here, because they may use those greater skills to find more sophisticated ways of attending only to evidence that confirms their preexisting convictions, making their myside bias even more intractable:
Drummond and Fischhoff (2019) found that the level of scientific reasoning skills did not correlate with the magnitude of myside bias displayed. Indeed, in some of their experiments, there was a slight tendency for subjects with higher levels of scientific reasoning skills to show even greater myside bias than subjects with lower levels did … Dan Kahan, Ellen Peters, and colleagues (2012) found … this difference between groups was larger among subjects who were rated high in measured numeracy than among those who were rated low … [M]ost people expect that higher levels of general intelligence, numeracy, scientific literacy, and general knowledge will bring subjects together in their views of the facts, but this was not the case in the Kahan, Peters, and colleagues 2012 study. Indeed, higher numeracy was associated with greater belief polarization between the groups … The study by Van Boven and colleagues (2019) [showed] myside bias displayed was actually greater for the subjects who were more highly numerate, but who apparently used their superior numerical reasoning skills not to reason in an unbiased manner across the different conditions, but to figure out which probability was more favorable to their side of the issue … As Peter Ditto and colleagues (2019b, 312) note, “What if bias is not the sole province of the simpleminded? … A growing body of research suggests that greater cognitive sophistication and expertise often predict greater levels of political bias, not less.… Cognitive sophistication may allow people to more skillfully argue for their preferred conclusions, thus improving their ability to convince others—and themselves—that their beliefs are correct.” … If you are a person of high intelligence, if you are highly educated, and if you are strongly committed to an ideological viewpoint, you will be highly likely to think you have thought your way to your viewpoint. And you will be even less likely than the average person to realize that you have derived your beliefs from the social groups you belong to and because they fit with your temperament and your innate psychological propensities.
And where do you find a vastly disproportionate number of people who are highly intelligent, highly educated, and strongly committed to an ideological viewpoint? At American universities:
There is in fact a group of people who are highly intelligent, highly educated, and strongly committed to an ideological viewpoint. That group happens to be the social scientists who study myside bias! They provide a case study of the effects of the tragedy of the communications commons.
Stanovich writes that
The extensive “diversity” and “inclusion” administrative infrastructure no longer assumes that the role of the university is to help students construct a unique model of themselves as they incorporate the best thoughts that the world’s cultures have preserved. Instead, the diversity infrastructure assumes that social forces have already given the student a preordained identity and that the role of the university is to affirm the student’s attachment to that identity.
Promoting the affirmation of all views grounded in identity can’t succeed as an intellectual project because, as Stanovich writes:
The problem with all nonempirically based systems of belief is that they have no mechanism for deciding among conflicting claims. When all disputants in an argument base their claims on lived experience, but the claims conflict, how do we decide whose lived experience actually reflects the truth? Sadly, history has shown that such conflicts usually result in power struggles. Rather than relying on personal experience, science makes its knowledge claims public so that conflicting ideas can be tested in a way that is acceptable to all disputants. True scientific claims are made in the public realm, where they can be criticized, tested, improved, or perhaps rejected. This allows a selection among theories to take place by peaceful mechanisms that all disputants agree on in advance, which is why science has been a major humanizing force in human history … Ironically, in the 1970s, it was seen as politically progressive to move students from personal worldviews to scientific worldviews—to move students from egocentric perspectives (“Speaking as an X”) to “the view from nowhere.” The larger assumption was that revealing the objective truth about the human condition would aid in—not impede—constructing a just society. This mindset has been lost in the modern university.
Stanovich then recites a litany of statistics showing just how ideologically one-sided university campuses have become:
The university professoriate is overwhelmingly liberal, an ideological imbalance demonstrated in numerous studies conducted over the last two decades (Abrams 2016; Klein and Stern 2005; Langbert 2018; Langbert and Stevens 2020; Peters et al., 2020; Rothman, Lichter, and Nevitte 2005; Wright, Motz, and Nixon 2019). This imbalance is especially strong in university humanities departments, schools of education, and the social sciences; it is specifically strong in psychology and the related disciplines of sociology and political science, where much of the study of myside bias occurs (Buss and von Hippel 2018; Cardiff and Klein 2005; Clark and Winegard 2020; Duarte et al. 2015; Horowitz, Haynor, and Kickham 2018; Turner 2019) … Studies of social science departments in universities have indicated that 58 to 66 percent of professors identify themselves as liberals and just 5 to 8 percent as conservatives (Duarte et al. 2015). The imbalance in psychology departments is even worse, with 84 percent of professors identifying themselves as liberals and just 8 percent as conservatives. This imbalance has grown far more pronounced in recent years. In 1990, the ratio was four liberals for every one conservative in psychology departments—a strong imbalance, but still, the 20 percent of faculty members who were conservatives at least provided some diversity. But, by the year 2000, the ratio had grown to six liberals for every one conservative (Duarte et al. 2015). And, by the year 2012, the ratio had risen to an astonishing 14 to 1 … People’s distal political attitudes are intertwined with their prior beliefs on a wide range of issues, such as sexuality, morality, the psychological effects of poverty, family structures, crime, child care, productivity, marriage, incentives, discipline techniques, and educational practices. It is in these areas where we would be most concerned that the political ideology of the researchers might affect how they designed their studies or how they interpreted the results.
The American system of justice is built on the “adversary system”: each side of a dispute has its own advocate who zealously argues its case, so that the jury can weigh all the arguments in context and justice can be done. The same principle applies to the proper evaluation of scientific claims, and when there is not enough diversity of thought to apply it, myside thinking runs wild:
Science works as well as it does not because scientists themselves are never biased but because scientists are immersed in a system of checks and balances—where other scientists with differing biases are there to critique and correct. The bias of researcher A might not be shared by researcher B, who will then look at A’s results with a skeptical eye. Likewise, when researcher B presents a result, researcher A will tend to be critical and look at it with a skeptical eye. It should be obvious how this scientific process of error detection and cross-checking can be subverted when all the researchers share the same bias—and that bias bears directly on the research at hand. Unfortunately, this seems to be the case in the field of psychology, where the virtual homogeneity of political ideology among researchers means there can be no assurance that politically charged topics like those mentioned here will be approached with the necessary scientific objectivity.
Unlike the opinion monoculture that universities tend to be, the two major political parties in America attract roughly equal numbers of adherents and routinely challenge each other’s proposed policies. But the combined set of policies each party uses to define itself tends to lock its members into a tribal devotion to that set of policies as a whole, even if individual party members might otherwise come to different positions. Evidence for this is that American voters “often don’t know their stance on many issues until they hear the stance supported by their own partisan group.” And as Stanovich further explains:
[S]ome of the mysided behavior generated by partisanship is, in a certain sense, unnecessary. That is, we would not have the conviction driving our mysided behavior on many issues had we not known the position taken by our partisan group … Our sense of partisanship, Mason (2018a) concluded, is more about our sense of connection to like-minded others than about our stances on specific issues; her conclusion converges with that of other studies finding that partisanship is a matter more of identity-based expression than of issue-based instrumental calculation (Huddy, Mason, and Aaroe 2015; Johnston, Lavine, and Federico 2017).
Other evidence indicates that non-elite citizens tend to hold varied positions on issues, many of which contradict specific policy positions of the party to which they belong. Yet, over time, elites tend to develop more monolithic policy views in line with those of the party with which they identify:
Four-fifths of the subjects in the Weeden and Kurzban (2016) sample showed very little cross-domain ideological consistency, and their degree of consistency did not seem to increase over time. In contrast, the cross-domain consistency among the cognitive elite 20 percent in their sample was much higher and, more important, it did increase over time. This suggests a model where a relatively unideological population, through identification with parties controlled by cognitive elites, is being led to support positions on issues they would not otherwise have held.
As Stanovich writes, “Once we have decided on a partisan side, we tend to allow party elites to bundle specific issue positions for us. In many cases, we have given no thought to the issue beyond knowing that our party endorses a certain stance toward it.”
Stanovich advises as follows:
Stop trying to be consistent because many of the issues that are bundled together to define our political parties are simply not linked by any consistent principle. They have instead been bundled for political expediency by partisan elites on both sides. As Christopher Federico and Ariel Malka (2018, 34–35) note: “Political elites strategically bundle substantively distinct political preferences and attempt to ‘sell’ these bundles as ideologies to the general public … Such menu-dependent and expressive motivation can sometimes lead one to adopt an issue position … that is opposite of the position one might have ‘organically’ preferred.” If major political issues really should not cohere very strongly with one another, then maybe those of us who wish to tame our own tribe-based myside bias (Clark and Winegard 2020) should be skeptical about taking a position on an issue we know nothing about just because that’s the position our party takes … It is important to realize that, by being a partisan, you will often end up supporting positions that you would not otherwise have taken were it not for the fact that these positions have been bundled by the political operatives of your party for reasons of electoral advantage. At worst, by being a partisan, you will end up supporting positions you do not actually believe in … In fact, each side in our partisan debates often argues—convincingly—how positions on the other side are inconsistent and aligned in a seemingly incoherent manner, and such arguments are often quite effective … It is not clear why forgiving student loan debt is a position of the Democratic Party—the party that deplores wealth inequality—when forgiving student loan debt favors the more affluent (Looney 2019). The expansion of charter schools is opposed by the Democratic Party and a majority of its white voters (Barnum 2019). However, a bundling conflict arises around this issue position because the party purports to be the advocate of minorities—but in fact a much greater percentage of African Americans and Hispanics than of white Democrats support the expansion of charter schools. Of course, the electoral calculus here is no secret. The Democratic Party wants to placate the teacher unions who are part of its coalition. But that is just my point. This bundling is done for political, not principled, reasons.
Stanovich also notes that “Decades ago, the Democratic Party in the United States encompassed a substantial minority of pro-life proponents, but this is much less true now (when a pro-life Democratic candidate for national office is virtually unthinkable). The significant presence of these crosscutting identities used to keep partisan animus in check.” (And all these points, of course, apply equally to the Republican Party and its dynamics as well.) This political sorting follows Americans’ self-sorting geographically:
We increasingly live and socialize with people who are like us not only in their demographics but also in their personal habits, recreational choices, lifestyles, and politics. The Big Sort results in interesting clustering patterns of political and lifestyle choices. For example, in the 2012 election, Barack Obama won 77 percent of the counties in America that had a Whole Foods grocery store, but he won only 29 percent of the counties in America that had a Cracker Barrel restaurant (Wasserman 2014, 2020). Thus we are increasingly living near people who share our views politically (Golman et al. 2016). Bill Bishop (2008), for example, calculated how many Americans lived in “landslide counties”—counties where, in a presidential election, one candidate won by 20 percent or more. He found that in the 1976 presidential election, 26.8 percent of Americans lived in landslide counties, whereas, by the 2008 presidential election, 47.6 percent—almost half—of all Americans lived in counties that tilted strongly toward one political party over the other.
The problem, as Stanovich sees it, is that
When our social and tribal allegiances come into play, we tend to act more “groupish,” to use Jonathan Haidt’s (2012) term, and an aspect of that “groupishness” is that we tend to start projecting convictions—we tend to show more nonnormative myside bias. To use Mason’s (2015) simile, we begin to behave more like sports fans than like bankers choosing an investment (see Johnston, Lavine, and Federico 2017). There is good news here from the standpoint of giving us tools to reason better—that is, with less myside bias about specific issues. The moral here, regarding specific issues, is that we are closer to our fellow citizens than it seems from the social level of our tribe-based allegiances and lifestyles. These, then, are the implications of the demographic Big Sort originally described by Bishop (2008), as it has increasingly spilled into the political domain. What the political parties seem to have done (to use an analogy we all understand) is collect every type of kid you hated from your high school and gather them together. Those kids you hated have grown up now and, not surprisingly, they look totally different than you do, their lifestyles are different from yours, and they have teamed up with other people whose lifestyles you don’t like—and they have all joined the other political party! But, despite this, what the research shows is that, on the specific issue at hand, whatever it happens to be (tax deductions for minor children, expanding charter schools, minimum wage, and so on), the chances are that our fellow citizens do not feel all that different than we do. If we mildly support the issue proposition in question, the chances are that even if they oppose it, others oppose it only mildly. We are not that far apart from our fellow citizens on the other side as long as we focus on a specific issue and not on overall lifestyle. And, of course, focusing on the issue and resisting the temptation to project our convictions is exactly what we should be doing to avoid myside bias in a political argument.
Understanding this dynamic should make people less hesitant to speak their minds on political issues, since others may well be more open-minded on a given issue than the stated position of their own political party would suggest:
When arguing about a political issue, your opponents may be closer to you than you think because, in many cases, the issue isn’t the issue—the tribe is. What I mean by this is that the divisiveness in our modern society is driven by party affiliation more than by people becoming more extreme or more consistent in their actual positions. It is important to understand the statistical logic by which this can happen. It may make us more sympathetic toward our fellow citizens if we know that the sources of affective alienation from them derive more from business strategies of the media and electoral strategies of parties rather than from people drifting farther apart in their basic beliefs about American society.
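The “statistical logic” Stanovich alludes to can be made concrete with a toy example. The Python sketch below is mine, not Stanovich’s, and its numbers (100 counties, 1,000 voters each, a 70 percent “mover” rate, a 20-point landslide threshold) are arbitrary assumptions chosen only to make the effect visible. It shows how geographic sorting alone can turn a fixed, exactly 50/50 electorate into a map where nearly every county is a landslide county, echoing Bishop’s figures quoted above, without a single voter changing their mind.

```python
import random

random.seed(0)

N_COUNTIES = 100
PEOPLE_PER_COUNTY = 1000
N_PEOPLE = N_COUNTIES * PEOPLE_PER_COUNTY

def landslide_share(counties, margin=0.20):
    """Fraction of counties where one party leads by at least `margin` (here, 20 points)."""
    landslides = 0
    for county in counties:
        share_a = sum(county) / len(county)  # entries are 1 (party A) or 0 (party B)
        if abs(2 * share_a - 1) >= margin:
            landslides += 1
    return landslides / len(counties)

# Fixed opinions: an exactly 50/50 national split that never changes.
citizens = [1] * (N_PEOPLE // 2) + [0] * (N_PEOPLE // 2)
random.shuffle(citizens)

# Scenario 1: citizens scattered across counties at random (no sorting).
unsorted_counties = [citizens[i::N_COUNTIES] for i in range(N_COUNTIES)]

# Scenario 2: the same citizens, but 70% of them relocate to a county
# where their own party already dominates (a crude "Big Sort").
sorted_counties = [[] for _ in range(N_COUNTIES)]
for person in citizens:
    if random.random() < 0.70:
        idx = random.randrange(N_COUNTIES // 2)  # the "A-leaning" half of the map
        if person == 0:
            idx += N_COUNTIES // 2               # the "B-leaning" half of the map
    else:
        idx = random.randrange(N_COUNTIES)
    sorted_counties[idx].append(person)

print("Landslide share, randomly mixed counties: ", landslide_share(unsorted_counties))
print("Landslide share, after geographic sorting:", landslide_share(sorted_counties))
```

The point of the toy model is only that a rising share of landslide counties is fully consistent with pure sorting; it does not, by itself, show that anyone’s issue positions became more extreme.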
In the meantime, Stanovich writes that
There are, however, severe costs associated with myside bias at the societal level. In the United States (and in many other Western nations), political parties and ideologies have become the equivalent of modern tribes … Regrettably, we have let these tribes run roughshod over the nation’s cognitive life—putting nonintellectual strategies designed to “score points” for the tribe ahead of objective debate about specific public policy issues. The myside thinking fueled by these tribal politics has rendered any consideration of evidence virtually moot in public policy disputes … For their part, the universities have totally abdicated their responsibility to be neutral, unbiased arbiters of evidence on controversial issues. Instead, they have turned themselves into intellectual monocultures that police expression through political correctness in precisely the areas where we need open discussion the most: crime, immigration, poverty, abortion, affirmative action, drug addiction, race relations, and distributional fairness.
This concludes this series of essays on sources of bias in reasoning. As Stanovich explains, it’s easy to convince ourselves that we’re doing the right thing even when the actual (empirical) results of the policies and actions we support show otherwise. The challenge is to open our eyes to contrary arguments and evidence, and to steel ourselves for the distinct possibility that we might be wrong. Once we break out of our “myside” cocoon and experience a good-faith change in our own opinions enough times, doing so becomes less scary and more a normal part of learning, which in turn makes us more comfortable learning more and more about the world.
Given how deeply rooted these biased tendencies are, it’s as important as ever to encourage a culture in which people are free to speak their minds and publish their arguments. In the next series of essays, we’ll explore the fundamental connection between free speech and due process, why you can’t support one without the other, and how, in the absence of either, only one bias can rule.
Links to all essays in this series: Part 1; Part 2; Part 3; Part 4; Part 5.