In an era where skepticism of well-documented science has negative consequences for individuals and society, it is important to understand why people hold such discordant attitudes toward it. A first step in such an endeavor is to determine what characterizes interindividual variation in attitudinal evaluations of science and technology. In the Public Understanding of Science (PUS) literature, much of the debate has focused on the relationship between factual knowledge of science and positive attitudes; it is commonly found that less knowledge of the facts is associated with more negative attitudes. One proposed mechanistic explanation for the association between attitudinal negativity and lack of textbook scientific knowledge is fear of the unknown.
However, it was recently reported that those who oppose genetic modification (GM) technology applied to foods and vaccines, although having low levels of understanding of the science (objective knowledge), nevertheless report that they do understand the science (subjective understanding). The same has more recently been reported for a variety of well-documented scientific issues, as well as for anti-establishment voting patterns. This is consistent with previous evidence that what people think they understand about a topic is related to their attitude toward that topic. The fact that the overconfident, those whose self-assessed understanding exceeds their knowledge of the facts, are more prone to negative evaluations of science suggests that fear, disgust, or distrust of what they believe to be the case, rather than of the unknown, sustains their attitudes.
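To make this distinction concrete, the sketch below shows one simple way such an overconfidence gap could be quantified. It is a minimal illustration under assumed scales and item counts, not the measure used in the studies cited; the function name and numbers are hypothetical.

```python
# Illustrative sketch only: one simple way to operationalise "overconfidence"
# as the gap between self-rated (subjective) understanding and factual
# (objective) knowledge, each rescaled to [0, 1]. The scales and numbers
# below are hypothetical and are not taken from the studies cited.

def overconfidence(self_rating, n_correct, n_items, rating_max=5):
    """Return subjective minus objective score, both scaled to [0, 1]."""
    subjective = self_rating / rating_max   # e.g. a 1-5 self-assessment item
    objective = n_correct / n_items         # fraction of true/false items answered correctly
    return subjective - objective           # values > 0 indicate overconfidence

# Example: someone rates their understanding 5/5 but gets 4 of 10 items right.
print(overconfidence(5, 4, 10))  # 0.6 -> subjective understanding exceeds objective knowledge
```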
The precise mechanisms by which overconfidence in science might lead to negative attitudes are unclear. Existing research has suggested the involvement of Dunning-Kruger-type effects, whereby the less competent also lack the ability to recognize their limitations. However, it is not obvious that Dunning-Kruger effects are necessary or sufficient as an explanation in this context. They are not sufficient in the sense that, a priori, overconfidence could also lead to strong endorsement of a partially understood scientific consensus position. Motta and colleagues, following Camargo and Roy, address this issue and argue that the overconfident may be unable to recognize both their own deficient understanding and the expertise of others. In turn, overconfidence could lead to a confident rejection of reliable sources of information allied with an openness to misinformation, thus generating a strongly negative attitude toward the scientific consensus position.
We can also contemplate circumstances in which Dunning-Kruger effects are not a necessary condition. For example, if someone holds the false belief that only genetically modified tomatoes contain genes, they might also believe that the modified genes could be transferred to them upon consumption, much as a pesticide on a crop could enter their system. This type of misconception could then easily lead to a strongly negative evaluation of GM technology among the misinformed. Such individuals would be classified as overconfident because their "textbook" scientific knowledge is weak, but their subjective evaluation of their understanding is high. If this is what is happening, then we do not need to invoke any inability to logically process and connect information: individuals simply firmly accept misinformation and draw logical conclusions downstream from it.
Summary
People differ greatly in their attitudes toward well-documented science. What characterizes this variation? Here, we consider this question in the context of genetics and related sciences. While most previous research has focused on the relationship between attitude toward science and what people know about it, recent evidence suggests that people with strongly negative attitudes toward specific genetic technologies (genetic modification (GM) technology and vaccines) commonly do not objectively understand the science but, more importantly, believe that they do. Here, using data from a probability survey of UK adults, we extend this previous work in 2 respects. First, we asked whether people with more extreme attitudes, whether positive or negative, are more likely to believe they understand the science. Second, given that negativity toward genetics is commonly framed around issues particular to specific technologies, we asked whether attitudinal trends depend on the technology specified. We find (1) that people with strongly positive or negative attitudes toward genetics more strongly believe that they understand the science well; (2) that only for those most positive toward the science is this self-confidence justified; and (3) that these effects do not depend on the specification of any particular technology. These results suggest a potentially general model to explain why people differ in their degree of acceptance or rejection of science: the more someone believes they understand the science, the more confident they will be in accepting or rejecting it. While there are more non-technology-specific opponents who also oppose GM technology than would be expected by chance, most GM opponents fit a different demographic. For the most part, opposition to GM appears not to reflect a smokescreen hiding a broader underlying negativity.
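As a rough illustration of what "expected by chance" means for the overlap between GM-specific and non-technology-specific opposition, the sketch below computes the standard independence expectation from made-up counts. All numbers are assumptions for illustration only and do not reproduce the survey's data or analysis.

```python
# Illustrative sketch only, with made-up counts: under independence, the expected
# number of respondents who both oppose GM and are non-technology-specific
# opponents is (row total * column total) / grand total.

n_total = 2000              # hypothetical number of respondents
n_oppose_gm = 400           # hypothetical number opposing GM technology
n_general_opponents = 300   # hypothetical number negative toward genetics in general

expected_overlap = n_oppose_gm * n_general_opponents / n_total
print(expected_overlap)     # 60.0; an observed overlap above this would exceed chance expectation
```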
Comments
Survey of over 2,000 UK adults identifies potential dangers of science communication
Why do people have widely varying attitudes toward well-documented science?
For many years, researchers focused on what people know about science, thinking that “to know science is to love it.” But do people who think they know science really know science? A new study published in the open access journal PLOS Biology by Cristina Fonseca of the Genetics Society, UK; Laurence Hurst of the Milner Centre for Evolution, University of Bath, UK; and colleagues, finds that people with strong attitudes tend to believe they understand science, while those who are neutral are less confident. Overall, the study revealed that people with strong negative attitudes toward science tend to be overconfident about their level of understanding.
Whether it’s vaccines, climate change, or genetically modified foods, socially important science can evoke strong and opposing attitudes. Understanding how to communicate science requires understanding why people can hold such extremely different attitudes toward the same underlying science. The new study surveyed more than 2,000 UK adults, asking them both about their attitudes toward science and about their belief in their own understanding of it. Several previous analyses found that people who are negative toward science tend to have relatively low textbook knowledge but strong self-confidence in their understanding. With this insight as a foundation, the team sought to ask whether strong self-confidence underpins all strong attitudes.
The team focused on genetic science and asked attitudinal questions, such as: "Many claims about the benefits of modern genetic science are greatly exaggerated." People could say how much they agreed or disagreed with such a statement. They were also asked questions about how much they think they understand such science, including: "When you hear the term DNA, how would you rate your understanding of what the term means?" All individuals were scored from zero (they know they don’t understand) to one (they are confident they understand). The team found that those at the attitudinal extremes, both those who strongly support science and those who oppose it, have high self-confidence in their own understanding, while those who respond neutrally do not.
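A minimal sketch of this kind of scoring and comparison is shown below. It uses made-up responses on an assumed 1-5 self-rating scale rescaled to the 0-1 range described above; it is illustrative only and does not reproduce the survey's items, scales, or analysis.

```python
# Illustrative sketch only (hypothetical data): rescale self-rated understanding
# to [0, 1] and compare group means across attitude categories.
from statistics import mean

# (attitude group, self-rated understanding on an assumed 1-5 scale)
respondents = [
    ("strongly positive", 5), ("strongly positive", 4),
    ("neutral", 2), ("neutral", 3),
    ("strongly negative", 5), ("strongly negative", 4),
]

def to_unit(rating, lo=1, hi=5):
    """Rescale a rating to 0 (know they don't understand) .. 1 (confident they understand)."""
    return (rating - lo) / (hi - lo)

for group in ("strongly positive", "neutral", "strongly negative"):
    scores = [to_unit(r) for g, r in respondents if g == group]
    print(group, round(mean(scores), 2))
# With these made-up numbers, both attitudinal extremes score high and the neutral group lower,
# mirroring the pattern described in the text.
```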
Psychologically, the team suggests, this makes sense: to hold a strong opinion, you must firmly believe in the accuracy of your understanding of the basic facts. The current team was able to replicate previous results showing that the most negative tend not to have high textbook knowledge. In contrast, those most accepting of science both believe they understand it and scored well on the factual (true/false) textbook questions.
When scientific knowledge was thought to be the key to scientific literacy, science communication focused on passing information from scientists to the public. However, this approach may not be successful, and in some cases it can backfire. The present work suggests that working to address the discrepancies between what people know and what they think they know may be a better strategy.
Professor Anne Ferguson-Smith, President of the Genetics Society and co-author of the study, comments: “Tackling the negative attitudes towards science that some people have is likely to involve deconstructing what they think they know about science and replacing it with a more accurate understanding. This is quite challenging.”
Hurst concludes: “Why do some people have strong attitudes toward science while others are more neutral? We found that strong attitudes, both pro and con, are underpinned by strong self-confidence in knowledge of the science.”