Many different opinions about the ongoing corona crisis circulate through society. One person thinks that the government has imposed ridiculous measures, another believes that the government is being far too lenient, and maybe you feel that things are fine as they are. Even among researchers, there is still a lot of disagreement. Where does this radical difference of opinion come from if everyone has the same data?
Motivated reasoning
Obviously, fighting the coronavirus is a complex problem, and therefore there is no single simple solution. However, this polarization among experts in the field can partially be explained by a psychological concept called motivated reasoning. It is based on the view that preconceived ideas affect your judgement: according to this theory, people are biased toward conclusions that conform to what they already believe. This often happens unconsciously; people still think that they are being completely rational. An example is a football supporter who believes that the referee is biased against his team. This supporter tends to notice only the referee's mistakes that disadvantage his own team, not the mistakes that disadvantage the other team. In this way, he looks only for things that confirm his view and discards most of the counter-evidence. Similarly, some researchers tend to “search” for results that support their ideology about the virus and stop investigating alternative explanations too early.
“If we are uncritical we shall always find what we want: we shall look for, and find, confirmations, and we shall look away from, and not see, whatever might be dangerous to our pet theories.” ~ Karl Popper
Bias of mathematicians
Dan Kahan is a professor at Yale University who has been studying this concept for years. In 2013, he performed an interesting experiment (Kahan et al., 2013) that compared the bias of people with different levels of mathematical ability. In this experiment, a group of people was told that they were given the results of a trial of a new skin cream that could potentially help cure a rash. Based on these results, the participants had to decide whether the skin cream seems to cure the rash or only makes it worse. The results of this fictional skin cream trial are shown below. This might be a nice moment to determine for yourself what your answer would be before you continue reading the article.

                                      Rash got better    Rash got worse
Patients who did use the cream                    223                75
Patients who did not use the cream                107                21
The right answer is that the cream seems to make the rash worse. This is because the rash improved for about 75% (223 out of 298) of the people who used the cream, whereas it got better for about 84% (107 out of 128) of the people who did not use the cream. Many people who took the test immediately focused on the large number 223 in the table and therefore gave the wrong answer. As expected, this happened mostly among people who also scored low on math tests: only about 40% of them got the right answer. People with high mathematical ability performed a lot better; about 80% of them got the right answer. This is an expected result and is not yet related to motivated reasoning.
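For those who want to verify this themselves, the comparison boils down to two conditional proportions. Below is a minimal Python sketch that computes the improvement rate in each group from the cell counts in the table above; the function name and layout are just illustrative.

# Comparing improvement rates in the fictional skin-cream trial.
# Cell counts (223/75 and 107/21) are taken from the table above.

def improvement_rate(improved: int, worsened: int) -> float:
    """Share of patients whose rash got better."""
    return improved / (improved + worsened)

cream = improvement_rate(223, 75)      # patients who used the cream
no_cream = improvement_rate(107, 21)   # patients who did not use the cream

print(f"Used the cream:        {cream:.1%}")     # ~74.8%
print(f"Did not use the cream: {no_cream:.1%}")  # ~83.6%

# The improvement rate is lower in the cream group, so despite the
# eye-catching 223, the data suggest the cream makes the rash worse.

The point of the exercise is exactly this comparison of rates: whoever anchors on the largest raw count instead of the two proportions gets the wrong answer.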
In that same experiment, Kahan gave the same data to another group of people, but he told them a different story. In this case, he told them that the data were gathered from cities that either banned or did not ban carrying guns in public, together with whether the crime rate in each city increased or decreased. Participants now had to decide whether banning guns actually decreases or increases the crime rate. In contrast to the previous example, this is a very controversial topic in the US. The contingency table given to the participants is shown below.

                                          Decrease in crime    Increase in crime
Cities that did ban carrying guns                       223                   75
Cities that did not ban carrying guns                   107                   21
Furthermore, the participants were asked about their political preferences. This tells us something about their “preferred answer”, since Democrats, in general, want to ban guns and Republicans do not. Now, in the above table, the correct answer is that banning guns only increases crime. The people who were bad at math still scored about the same as in the previous example, about 40%. This also held for people who were good at math, unless the correct answer went against their political stance. Republicans with high mathematical ability, who favored this conclusion, scored about 80-90%. However, Democrats who were good at math only scored about 50%. This difference is far bigger than you would expect between people with the same mathematical ability. Furthermore, when the data were flipped such that banning guns appeared to decrease crime, the Republicans performed very badly and the Democrats very well. Surprisingly, the difference between the groups became larger as people became better at math. This suggests that people who are good with numbers might be even more susceptible to the influence of their underlying ideology. You might view yourself as a smart econometrician, someone who thinks rigorously and rationally, but this is perhaps not entirely true.
Thinking, Fast and Slow
In 2011, Nobel Prize laureate Daniel Kahneman published the book Thinking, Fast and Slow. If you are interested in economics and psychology, I would really recommend reading this book. It uses some profound psychological concepts to discuss how people make decisions. Another article that dives into the work of Kahneman can be found here.
Kahneman explains that we have a two-system way of thinking: System 1 (thinking fast) and System 2 (thinking slow). System 1 is your intuitive system, with which you make quick decisions; it is often the reason why we jump to conclusions. System 2 is the system that does your analytical and critical thinking. Kahneman shows that we spend most of our time in System 1. This is definitely not a bad thing, since this system is generally very accurate. For example, it makes sure you immediately hit the brakes when an accident happens in front of your car. However, this system always tries to create a quick, coherent explanation for what is happening, relying on memories and assumptions while doing so. Because of this, System 1 also causes a lot of biases and is the source of motivated reasoning.
The solution
Now that you realize that you are also making biased decisions, you are probably wondering how to resolve this. According to Kahan, the first step in the right direction is to understand and accept that you are fallible. In this way, you become more aware of your decisions, and it also becomes easier to ask yourself questions to detect these prejudices. Furthermore, it is very important to speak with people who hold different opinions. Their perspective might give you more insight into the topic. This is also why peer review is so important in academia. As a researcher, you might be so focused on a particular result that you are more likely to overlook other explanations. For a colleague, who has a fresh, neutral view of your research, it is much easier to detect such flaws. Finally, according to Kahan, the most important thing is to stay curious. Keep reading new articles, speak with new people, and stay open to new perspectives. We can never completely rid ourselves of our preconceptions, but this is a big step in the right direction. So, I would advise you, even if you view yourself as a rational and intelligent econometrician, to stay aware of your biases and to stay open to controversial opinions.
Source: Kahan, D. M., Peters, E., Dawson, E., & Slovic, P. (2013). Motivated Numeracy and Enlightened Self-Government.
This article was written by Stan Koobs.