“Why do people fiercely defend positions that go against decades of science-backed evidence?”
This is the question science communicator Michael Robin has set out to understand. With a keen interest in biotechnology, Robin has been digging into the psychology of belief. He delivered a presentation entitled UNBELIEVABLE! Science Communication for a Skeptical World at Innovation Place in Saskatoon during Global Biotech Week to explore these ideas, and to share tips on what communicators should (and shouldn’t) do to help a skeptical public trust and accept scientific concepts.
We must be careful, says Robin, not to make the mistake of thinking that people who believe in things that are counter to scientific evidence lack intelligence. He has a particular dislike for the phrase “dumb it down.”
“It’s condescending. It implies the person you’re talking to isn’t smart enough to understand what you’re saying.” He says the phrase puts people on the defensive—not a good starting point.
He says there is little correlation between belief and intelligence. An intelligent person can still believe incredible things that have no basis in science. It’s not a question of intelligence, but rationality.
Robin quotes psychologist Barbara Drescher, who says rationality has four parts: intelligence, knowledge, need for cognition, and open-mindedness. Or, her ‘saltier’ version of this: “…we can be irrational because we are stupid, ignorant, lazy, arrogant, or some combination of those.”
No matter what level of scientific knowledge a person has, says Robin, they can be susceptible to irrational thinking. This is because of something called the “Belief Engine,” which is an essential part of our thought process. Robin says that since we can’t be experts at everything, we must “take it on faith” that certain things are true. “If we had to justify every single thing we do with evidence,” he says, “we would get nothing done.”
So, we make snap decisions, often based on inadequate information: we “go with our gut.” In the case of genetically modified organisms (GMOs), there has been so much negative propaganda for so many years that without doing a lot of homework, the average person may just think “better safe than sorry!”
To make our decisions, we consult trusted sources, often our friends. We tend to surround ourselves with like-minded people, who hold the same beliefs as we do—our ‘tribe.’ If we change our beliefs, we risk being shunned by our tribe. As highly social animals, this could be devastating.
Robin says confronting someone’s deeply held beliefs can make a person uncomfortable and even angry. For example, some people believe that fluoride in drinking water is harmful. Robin talks about a University of Calgary researcher who was attacked on social media when she presented data that clearly showed the benefits of fluoride. Even though the research is clear, myths, backed by pseudoscience, prevail. And changing deeply held beliefs can cause “cognitive dissonance,” which is a painful experience—quite literally. In fact, says Robin, brain-scan studies show that cognitive dissonance lights up the same areas as physical pain.
The typical method used in the past to try to convince people of the safety and benefits of GMOs has been to recite the facts. This is called the “information deficit model,” says Robin, and it has an unexpected effect: the receiver of the (unwanted) facts will double down and reject them even more strongly. This is called the “backfire effect.” This is not to say that we should stop sharing information. In fact, Robin points to a recent study on GMO attitudes by University of Regina Research Fellow Jon McPhetres, which shows that if information is presented in a neutral and easy-to-understand way, it can have a positive effect on acceptance.
Robin says that as complex (and often frustrating) as it may be to get people to accept evidence-based information, there is hope. To communicate controversial ideas, he says we need to keep in mind that our audience is intelligent, but busy with their own lives. People make quick decisions on subjects where they don’t have all the information. We also need to focus on speaking to people who are not entrenched in their beliefs—that is, the “moveable middle.” To prevent people from getting defensive, we can use strategies such as speaking with the intention of being overheard, or “talking for the third ear.” As well, when you are in a conversation with someone who doesn’t accept your information, truly listen to them. Ask strategic questions; ask for clarification. “Your goal is not to change their position, but to get them to engage their logical decision-making process,” he says.
Robin says we don’t have to reinvent the wheel: there are many resources out there from people who have been working on science communication issues for many years, such as Joe Schwarcz from McGill’s Office for Science and Society; Stuart Smyth at the University of Saskatchewan, found at his blog SAIFood.ca; or GMO Answers, which is managed by CropLife Canada.
Robin also shared a set of tools for communicators to refer to, including a handy list by Terry Flynn and Tim Li (McMaster University) called 10 Ways to Combat Misinformation, and Carl Sagan’s “baloney detection kit,” which includes “20 essential tools to keep you from stepping in it.”
Find Michael Robin on LinkedIn.