Why do some people continue to believe that climate change is not a threat despite the overwhelming scientific evidence that it is?
One theory is that people who reject the evidence for climate change do so because they are uninformed. The idea is that if only they were a bit better educated about science, they would suddenly realise the error of their ways. This is called the knowledge deficit model.
Researchers have sought to investigate how accurate this model is. They tested the hypothesis that a lack of knowledge about climate change predicted thinking it was less of a risk. They tested subjects’ scientific literacy and compared that with their views on climate change.
If their hypothesis were true, we would expect to see risk perception rise with knowledge. It would look a bit like this.
In fact they found this.
There was no correlation between scientific literacy (as measured by a test) and belief that climate change was a risk. Some of those who believed climate change was a big problem knew a lot about science and, conversely, some who knew little about science thought the risk was great. In short, the knowledge deficit model was a bust.
However, when the researchers compared political affiliation with belief in climate change, they found the correlation they were looking for. Being left wing was a far stronger predictor of believing climate change was a risk, and vice versa for being right wing, regardless of scientific literacy.
The researchers propose that when a subject, like climate change, becomes politically partisan, facts will do very little to alter opinions. David McRaney has dubbed this “tribal psychology”. When something becomes a matter of identity for your in-group, then the evidence ceases to matter very much.
Take something as simple as wearing face masks during a pandemic. A year ago it would have been hard to imagine people physically attacking each other over an issue like this, but here we are. McRaney has an excellent podcast on this subject. He notes that masks may now be a badge of group loyalty, like putting pronouns in your bio. These findings are especially interesting when thinking about research and teacher beliefs.
In education most people are already left wing, yet the battle between trad and prog teacher “tribes” rages on with almost daily outrages during which members can prove their group loyalty. Today, it’s language use in schools, before that it was exclusion and before that it was something else. Topics change, but the dance remains the same.
It’s very “natural” for people to pick a side and to stick to that side regardless. As McRaney notes, our evolutionary history as social animals means that group membership could have meant the literal difference between life and death. It’s also a very frustrating part of human psychology, leading to irrationality, partisanship, class divisions, religious intolerance, and so on.
A large body of psychological research shows that when we form in-groups and out-groups, we start to view the two sides very differently. Our side is hard done by, sensible, upstanding and decent. The other side is getting away with murder, idiotic, immoral and craven. If we make a mistake, it was an accident; if they do, it was calculated. Worst of all, McRaney notes that people would rather be wrong than out of good standing with their “tribe”.
If people’s identities are wrapped up in a certain set of beliefs, and if those beliefs form an in-group/out-group dynamic, it is likely that everything will be viewed through that lens. It is also likely that, in this situation, no amount of evidence will change minds. A good example of this is phonics research, the results of which have been resisted for decades.
So here are some suggestions for trying to avoid our natural tendencies towards “tribal psychology”. This advice is aimed mostly at myself, but if you find it useful, then great. I suggest these in the full knowledge that I will probably not follow them very well.
- Try not to become too invested in any one side
This is very difficult, but the more we see the other side as the enemy, the easier it is to start seeing everything through a partisan lens. Before long, we may stop looking at the evidence altogether and just side with our ‘tribe’.
- Try to find things you can agree with
Whenever possible, and no matter how small the point, try to find areas of agreement between you and people in your out-group.
- Try to stay agnostic where possible
We would like the research to say “X”, but it doesn’t exactly say “X”. In that case, the best thing we can do is simply not have an opinion. “I don’t know” is a valid position.
- Be dubious of research that supports your world view
Paper finds that the thing I want to be true is wrong: *scoff* “What was the sample size?” “Let me check the methodology section!” “This author is a right winger!”

Paper finds that the thing I want to be true is true: RETWEET without reading! “OK, the sample size isn’t ideal, but it’s a promising piece of research!” “Who cares what the author’s politics are?!”
- Make appeals to people on their terms
Learning styles advocates never seem particularly fazed by the charge that the practice lacks evidence. They do, however, get upset by the idea that it pigeonholes students. My guess is that learning styles advocates like the theory because it represents individualism and student-centric learning. Questioning that claim resonates more than pointing out the lack of evidence.
Facts aren’t enough
Research tells us that the knowledge deficit model is flawed. If we can’t get people to believe in the threat of climate change despite the evidence, there is little hope of convincing them of anything else with facts alone. A research-based approach has to consider not only the facts that can be gleaned from research, but people’s feelings too. Bashing teachers over the head with research is unlikely to change minds, but it may cause them to resent you. It’s then fairly easy for teaching experts to dismiss research as “irrelevant for teachers” and find a receptive audience.