Mythology and moorings: Science surveys in cultural context

Lately I have felt inundated with survey results that highlight alarming disconnects between public beliefs about scientific matters and the consensus views of practicing scientists. Recently, for example, I learned that more than 40% of Americans believe astrology is at least “sort of” scientific (National Science Board 2014), that 46% of respondents to a national survey place human origins within the past 10,000 years (Gallup 2012), and that 26% of us believe that the Sun orbits the Earth, rather than the other way around (National Science Board 2014).

More alarming still are the cases in which Americans’ beliefs about topics such as climate change (Gallup 2008a) and human origins (Gallup 2008b) fall along ideological lines. That so many of our nation’s cultural controversies concern not how best to employ our hard-won store of knowledge, but whether, for reasons unrelated to reliability, selected elements of that cache should be believed at all, can be viewed as a casualty of the much-discussed trend toward greater political polarization. A question that receives little attention in the press, however, is what polarized responses to scientific ideas actually say about our collective proficiency in, and attitudes toward, science itself.

Theorists argue that an enlightened citizenry is essential to the functioning of democracy, and results like these suggest that our citizenry is, as it relates to science, woefully uninformed. Popular interpretations focus on the science comprehension of survey respondents, or on their trust in science overall, and generally argue that incorrect answers to the sorts of questions highlighted above reflect a growing gap in science literacy. Popular remedies for this perceived problem tend to focus on the level or quality of (science) education. Ongoing research, however, is complicating our understanding of both the character of the problem and the potential responses to it, including what it means to be “informed.”

A first question is whether interpretations of relevant survey results are themselves robust. For example, during its 2012 Science and Engineering Indicators (SEI) survey, the National Science Foundation conducted a small experiment by adjusting the wording of its question concerning human origins: Half of the survey respondents were presented with the traditional prompt, “Human beings, as we know them today, developed from earlier species of animals,” and, as in previous years, about 48% disagreed. The other half of the sample was asked to respond to the same question, but with the addition of a lead-in clause, “According to the theory of evolution by natural selection,” and 72% answered correctly (National Science Board 2014). In a similar display of semantic sensitivity, 39% of respondents to a National Center for Science Education survey agreed that “God created the universe, the earth, the sun, moon, stars, plants, animals, and the first two people within the past 10,000 years,” while only 18% agreed with the (logically weaker) proposition that “The earth is less than 10,000 years old” (Bishop et al. 2010).

A further complication arises when survey results are stratified by educational attainment. Consider two cases: The SEI has consistently found belief in astrology to decrease with educational attainment (National Science Board 2014), as expected under the hypothesis that education—or, more specifically, science literacy—tends to erode counter-scientific and pseudoscientific beliefs. In contrast, a recent survey found that rates of belief in the likely impact, the present effects, and the human causes of climate change do not change dramatically with education—at least on average. Instead, they become polarized, as more liberal-minded or Democratic Party–leaning people adopt such views and more conservative-minded or Republican Party–leaning people eschew them (McCright and Dunlap 2011).

These results, and others like them, challenge the popular characterization of the survey results above as reflecting citizen ignorance that is best addressed through improved education; indeed, they call into question whether knowledge, rather than culture, is actually being measured. In a series of studies that address these challenges, Dan Kahan and his collaborators have proposed “cultural cognition” as an alternative framework for understanding how individuals evaluate new information and revise beliefs—and, in particular, how that process does not necessarily lead people of divergent cultural identities to converge in their beliefs (see Kahan et al. 2011 in particular).

In an especially illuminating study (Kahan et al. 2012), Kahan and his colleagues collected survey data on participants’ perceptions of climate change risks. The data comprised three measurements:

  • A slate of agree/disagree items used to situate their social organization worldviews along two axes (hierarchy–egalitarianism and individualism–communitarianism) (Douglas 1970);
  • Two several-item assessments, drawn from existing literature, designed to measure science literacy and aptitude for technical reasoning (numeracy); and
  • Personal assessments, along an eight-point scale, of a variety of risks, including some concerning climate change.

The inquiry produced two key findings. First, participants’ perceptions of climate change risk did not increase with science literacy or numeracy, as would be expected under the assumption that science comprehension is the key obstacle to aligning public and expert views. Second, when participants classified as hierarchical individualists were compared with those classified as egalitarian communitarians, the participants high in science literacy and numeracy were more polarized than their low-scoring (and already far separated) counterparts.
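The second finding is, statistically, an interaction effect: the slope of risk perception on science literacy runs in opposite directions for the two cultural groups, so the gap between them widens as literacy rises. The following simulation — with entirely invented data and effect sizes, not the study’s — shows how such an interaction is fit by ordinary least squares:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
literacy = rng.uniform(0, 1, n)        # composite science literacy / numeracy score
egalitarian = rng.integers(0, 2, n)    # 1 = egalitarian communitarian, 0 = hierarchical individualist

# Simulated risk perception: literacy pushes the two groups in OPPOSITE directions,
# so the between-group gap grows with literacy (the polarization pattern described above).
risk = (4 + 2.0 * literacy * egalitarian
          - 2.0 * literacy * (1 - egalitarian)
          + rng.normal(0, 0.5, n))

# Fit risk ~ 1 + literacy + group + literacy:group.
X = np.column_stack([np.ones(n), literacy, egalitarian, literacy * egalitarian])
beta, *_ = np.linalg.lstsq(X, risk, rcond=None)
intercept, b_lit, b_grp, b_interact = beta

# A large positive interaction coefficient means the groups diverge as literacy rises,
# even though literacy's "main effect" within the baseline group is negative.
print(round(b_interact, 1))
```

If the conventional “knowledge deficit” picture were right, one would instead expect a single positive literacy slope shared by both groups and an interaction term near zero.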

The latter result lends support to a widely discussed hypothesis: We are as adept at applying reasoning tools acquired through education—such as logic, science literacy, numeracy, and critical thinking—in the service of maintaining existing beliefs as in reevaluating them. In particular, it helps explain the widened gulf between Democrats and Republicans of strong educational backgrounds.

As a mechanism, cultural cognition is the unconscious influence of normative predispositions on the selectiveness with which new information is incorporated into existing beliefs. To the extent that a person’s existing views are already closely aligned with their cultural values, cultural cognition looks just like confirmation bias (Kahan et al. 2011): For example, someone predisposed to favor social freedoms and safety nets is likely to have less favorable views of big business and to favor environmental regulation. Questioned about the risks involved with nuclear waste disposal, such an individual may already view the nuclear power industry with suspicion, but nonetheless seek out science writing or expert opinion specifically about risk. If, however, resources that downplay the risks take a pro-business tone, while those that highlight them appear pro-environment, that person will tend to ascribe less evidential weight to the former than to the latter, i.e., to apportion credibility in a way least disruptive to their cultural moorings.

This leads to a complementary view of cultural cognition as a more or less rational assessment of risk: Cognitive dissonance may be an important factor in belief reevaluation, e.g., in the face of expert consensus contrary to one’s own beliefs about a topic. However, that experience may be mild compared to the emotional, social, and even economic costs of dissenting from one’s cultural community. As Kahan has argued, this imbalance could be enough to ensure the discouraging survey results above, and more: it suggests why, via unconscious and often unrecognized predispositions, individuals will campaign and vote for policies that run counter to their best long-term interests (Kahan et al. 2012).

How, then, might the gap between public and expert views on such crucial matters as climate change be closed, and does this task necessarily entail the convergence of political factions on answers to science awareness surveys? While Kahan and his collaborators do not criticize those advocating a continued emphasis on science literacy and critical thinking, they deemphasize these as explanatory factors for Americans’ public dissensus on matters of scientific consensus. Instead, these authors suggest that the science communication environment becomes “polluted” when cultural cues interfere with the transmission of value-independent messages. This insight highlights the risk that attends various forms of communication, such as delivery by advocates with partisan associations (e.g., Guggenheim 2006) or within value-entangled contexts (Kahan et al. 2010).

Kahan and collaborators have recommended several legislative and communicative strategies to reduce or overcome these challenges, synthesized from their own and other research. In the case of policy these include, in particular, identity vouching—the involvement and recommendation of culturally diverse figures in the shaping and selling of policies—and social-meaning overdetermination—the infusion of cultural values into, rather than the secularization of, policy proposals, with the aim of making policy accessible to citizens of all cultural commitments (Braman and Kahan 2006; see also Kahan et al. 2011). To the extent that culturally entangled scientific conclusions bear directly upon policy (e.g., global warming and nuclear waste disposal), these strategies may temper the resistance to these conclusions among those of dissonant values.

One may, I think, complement these recommendations with research concerning the correction of misinformation. Lewandowsky et al. have argued that skepticism can serve as an attitudinal corrective to the biasing effects of worldview (Lewandowsky et al. 2012). For example, when primed with cues that prompt distrust, subjects performed better at non-routine reasoning (Schul and Mazursky 1990) or exhibited greater mental flexibility (Mayer and Mussweiler 2011). While the authors did not present research on the possibility and viability of continual, attitudinal skepticism as a preventive against misinformation, they nonetheless concluded that their results “suggest[ed] that a healthy sense of skepticism or induced distrust can go a long way in avoiding the traps of misinformation” (Lewandowsky et al. 2012).


Cultural cognition involves forces closer to our identities than most of the varieties of misinformation discussed in Lewandowsky et al.’s review; likewise, its influence is likely to be far more enduring. However, just as an awareness of common cognitive errors can help us guard against them, an appreciation of the subtle but continual influence of cultural values on our beliefs (even on the mechanisms by which we evaluate them), and of the types of beliefs those values are most likely to affect, should help us prepare for and account for that influence. Indeed, an understanding of how we fail to converge on the evidence underlying our policy discourse seems essential to our enlightenment as a citizenry.



Bishop, G., Randall K. Thomas, Jason A. Wood, and Misook Gwon (2010). “Americans’ Scientific Knowledge and Beliefs about Human Evolution in the Year of Darwin.” Reports of the National Center for Science Education 30(3), pp. 16–18.

Braman, D. and Dan M. Kahan (2006). “Overcoming the fear of guns, the fear of gun control, and the fear of cultural politics: Constructing a better gun debate.” Emory Law Journal 55(4), pp. 569–608.

Douglas, M. (1970). Natural Symbols: Explorations in Cosmology. London: Routledge, pp. 54–68.

Gallup (2008a). “Climate-Change Views: Republican-Democratic Gaps Expand.”

Gallup (2008b). “Republicans, Democrats Differ on Creationism.”

Gallup (2012). “In U.S., 46% Hold Creationist View of Human Origins.”

Guggenheim, D., dir. (2006). An Inconvenient Truth. Lawrence Bender Productions. Film.

Kahan, D., Donald Braman, Geoffrey L. Cohen, Paul Slovic, and John Gastil (2010). “Who fears the HPV vaccine, who doesn’t, and why? An experimental study of the mechanisms of cultural cognition.” Law and Human Behavior 34, pp. 501–516.

Kahan, D., Hank Jenkins-Smith, and Donald Braman (2011). “Cultural cognition of scientific consensus.” Journal of Risk Research 14, pp. 147–174.

Kahan, D., Ellen Peters, Maggie Wittlin, Paul Slovic, Lisa Larrimore Ouellette, Donald Braman, and Gregory N. Mandel (2012). “The polarizing impact of science literacy and numeracy on perceived climate change risks.” Nature Climate Change 2, pp. 732–735.

Lewandowsky, S., Ullrich K. H. Ecker, Colleen M. Seifert, Norbert Schwarz, and John Cook (2012). “Misinformation and its correction: Continued influence and successful debiasing.” Psychological Science in the Public Interest 13(3), pp. 106–131.

Mayer, J. and Thomas Mussweiler (2011). “Suspicious spirits, flexible minds: When distrust enhances creativity.” Journal of Personality and Social Psychology 101(6), pp. 1262–1277.

McCright, A. and Riley E. Dunlap (2011). “The politicization of climate change and polarization in the American public’s views of global warming, 2001–2010.” The Sociological Quarterly 52, pp. 155–194.

National Science Board (2014). Science and Engineering Indicators 2014. Arlington, VA: National Science Foundation (NSB 14-01).

Schul, Y. and David Mazursky (1990). “Conditions facilitating successful discounting in consumer decision making.” Journal of Consumer Research 16, pp. 442–451.

Cory Brunson completed his PhD in mathematics in December and is a visiting research assistant at VBI. He studies algebraic geometry and network topology, and is an advisor for the Freethinkers at Virginia Tech.
