“A man with a conviction is a hard man to change. Tell him you disagree, and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point.” - Leon Festinger, 1957
In recent years, findings in psychology and neuroscience have demonstrated beyond doubt how our established beliefs, even more than any new evidence, can distort our thoughts and even alter what we consider our most impartial and logical conclusions. This tendency, known as “motivated reasoning”, helps explain why groups remain so polarized over matters where the evidence is unequivocal: climate change, vaccines, the Moon landings, the shape of the Earth, and so on.
Does rejection of science have something to do with our nature?
Very much so. If we think about it, we tend to push threatening information away while welcoming friendly information, the kind that fits comfortably with what we already believe. We apply fight-or-flight reflexes not only to predators and to those who wish us harm, but also to data and to the outcomes of scientific research.
We call it reasoning, but it is often rationalizing: when we think we are reasoning, we may instead be constructing justifications for what we already believe. Or, to use an analogy: “We may think we’re being scientists, but we’re actually being lawyers” (Jonathan Haidt, 2012).
Why, then, does progress exist at all?
Contemporary science derives from an attempt to remove such subjective lapses, which is exactly what great 17th-century pioneers of the scientific method, such as Galileo Galilei, set out to do. The effect is that even if individual researchers tend to fall in love with their own theories, the wider processes of peer review and institutionalized scepticism are designed to ensure that, eventually, the best proposals prevail.
The problem is that an anti-vaxxer or a climate-change critic is not necessarily anti-science. It is just that, in their own minds, “science” is whatever they want it to be.
Can the problem be addressed with more education?
More education is never detrimental to shaping an idea. However, it can sometimes backfire or be counterproductive, and it is not always synonymous with more rational behaviour. For instance, in a 2008 Pew survey, higher education correlated with an increased likelihood of denying the science on the issue at hand.
Even Nobel Prize-winning and highly successful scientists fall into the trap of thinking that they are smarter than everybody else. Indeed, several talented scientists have developed bizarre ideas in their later years and become uncritical toward their own logic. The fact that many Nobel laureates are among them has even led to the term “Nobel disease”.
Pure and simple confirmation bias
Studies have found that people holding irrational beliefs are also more likely to perceive patterns in randomly generated coin tosses or in chaotic, unstructured paintings. In our case, we see this when people perceive meaningful relationships between events that are merely random occurrences. Humans have a general propensity to adopt opinions that confirm their pre-existing beliefs while ignoring or rejecting anything that generates doubt.
People who believe, for instance, that GM crops or vaccines are unsafe will accept any information that reinforces their fears while discarding everything else. They eventually develop elaborate rationalizations: structured stories, often rich in detail, yet lacking any logic. Note that the real problem is not that people think for themselves; the problem is that once a conspiracy thought is triggered, it is difficult to get rid of it, and one reason may be that people tend to believe what they are willing to believe.
How can we restrict the spread of such ideas and ensure they remain a minority view?
We are much better at seeing the weaknesses in the arguments of others than in our own. If we are open-minded, though, a good discussion will produce reliable results. Have you noticed that in any discussion the initial arguments are essentially weak statements, but they become more refined and better supported as the discussion evolves? Reasoning on your own does not get you as far, because you do not know the reasons of others.
A simple solution could be to build more trust in science by giving the public a more convincing picture of it. At this stage of our history, everybody understands that science is not 100% objective and that each individual scientist can be biased and conditioned just like any other human being. At the aggregate level, however, the process is very efficient.
Agreement among scientists is not always present. But that should not be a problem; rather the opposite. It means that they are truly committed to finding a consensus.
Author:
Pietro Fadda - Editor in Chief
References
Festinger, Leon (1957). A Theory of Cognitive Dissonance.
Haidt, Jonathan (2012). The Righteous Mind: Why Good People Are Divided by Politics and Religion. Penguin, 1st edition (29 March 2012).
Scientific method: https://www.onlineeducation.com/features/galileo-galilei
Nobel disease: https://skepticalinquirer.org/2020/05/the-nobel-disease-when-intelligence-fails-to-protect-against-irrationality/