9

Is there scientific evidence that people who read a list of cognitive biases in a psychology book or on Wikipedia improve their reasoning abilities? Is there evidence that they will be less likely to fall victim to the biases they read about?

Christian
  • I remember reading [it's not that great](http://lesswrong.com/lw/9p/extreme_rationality_its_not_that_great/) and that [it could be great](http://lesswrong.com/lw/9t/extreme_rationality_it_could_be_great) on LessWrong, but I know of no studies. – Borror0 Mar 07 '11 at 01:01
  • Related XKCD: http://xkcd.com/552/ – Andrew Grimm Mar 07 '11 at 12:02
  • It's a practiced skill, and one that isn't easy. After all, these behavior patterns are pretty much hardwired by millions of years of evolution. In certain circumstances, though, yes, you can. For instance, I have reconditioned myself with regard to the bystander effect: when I see someone who needs help and there are lots of people around, I assume that no one else will help them, so I try. But that is a relatively easy one to overcome. I think you are more likely to recognize your mistakes and biases after the fact, though, and maybe fall for advertisements less. – Ustice Mar 07 '11 at 15:22
  • Why not? How else should avoiding bias work? Well, maybe it doesn't work at all, but didn't we make progress on questions of racism, sexism, nationalism and so on? I don't see an alternative to enlightenment. (We, mankind, the democratic world). – user unknown Mar 09 '11 at 03:43
  • @user unknown: It's quite plausible that we have less racism because the average person feels more empathy with black people. The average person knows more foreigners, and so now we have less nationalism. – Christian Mar 09 '11 at 11:16
  • @user unknown I fear that progress hasn't been made through rational thinking. Rather, it's due to a shift in the general perception of moral values, possibly brought about by advancements in living conditions and the wider spread of information. I do hope rationality plays some role in it, at least by providing a tipping point for arguments when a social change is ready to happen. – Ilari Kajaste Mar 10 '11 at 20:30
  • From my experience, I would say it helps, but in an indirect sort of way. Reading such lists helps me be more aware of the biases, so I can adjust my thinking. But really, the reconditioning @Ustice talks about seems to be key. These days, whenever I find myself thinking of a rationale behind some phenomenon, my second instinct is to consider whether I'm biased, and in what ways. Yet the biased instinct does still come first, and does affect the route my thinking takes. Then again, this is all pretty much just unjust generalization from personal, possibly confirmation-biased anecdotes... ;) – Ilari Kajaste Mar 10 '11 at 20:42
  • Does reading about condoms reduce STDs? The mere act of reading will not improve reasoning; you need to practice *reasoning* to become better at it. – oosterwal Mar 11 '11 at 04:12
  • To expand upon my earlier comment about how it is a learned skill: I was reading an article that defended a political position counter to my held belief, and I noticed that I was merely skimming it because my initial thought was that it was wrong. But then I realized I was falling prey to the confirmation bias, and reread the article. There was good information in it, and it helped me understand the subject better. The key is to practice. I'm sure that as I use the skill to second-guess myself, I'll get better. – Ustice Mar 14 '11 at 16:20
  • In his book *Thinking, Fast and Slow*, Daniel Kahneman repeatedly states that he is not very good at recognizing his own biases, even though the study of such things is his career. – Larry OBrien Mar 20 '12 at 01:20

3 Answers

5

It's a bit of a catch-22: when we feel strongly that we are right, we quieten down the part of our brain that signals we might be wrong.

So it follows that if we are confident we know all the biases because we're well-read on the topic, we could very well be leaving ourselves open to an as-yet-unknown bias or some other weakness in our reasoning.

As a skeptic, I think it always helps to question everything, even the things we hold to be true.

Suggested reading: http://www.jonahlehrer.com/books

UPDATE: In response to Timwi, a reference: Fischhoff, B. (1982). Debiasing (Ch. 31). In D. Kahneman, P. Slovic, & A. Tversky (Eds.), *Judgment Under Uncertainty: Heuristics and Biases*.

  • Interesting insight, but this provides no answer to the question, which is asking for scientific evidence. – Timwi Mar 09 '11 at 23:56
4

I would surmise that reading about a bias is only effective if the person actually recognizes that bias in themselves.

JasonR
3

When I teach reasoning in the classroom, I frequently demonstrate the confirmation bias to my students. What I find is that they are able to overcome their bias on basic questions relatively quickly. They don't do this simply because they are told about it, however; it takes demonstrations and practice to start to get the hang of it. When Wason examined reasoning and the confirmation bias in human subjects, he found that they were quite slow to recognize their reasoning errors when given simple feedback about their choices. Nevertheless, people do show improvement. In my own research lab, we plan to examine how long it takes people to discover their own reasoning errors, in order to test directly how training affects reasoning.
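
For readers unfamiliar with Wason's paradigm, here is a minimal sketch of the logic behind his classic four-card selection task (the card values and rule below are the standard textbook example, not taken from this answer): subjects see four cards and must say which ones to turn over to test the rule "if a card shows a vowel, its other side shows an even number". Only the cards that could falsify the rule need to be flipped, yet most subjects pick the confirming cards instead.

```python
# Illustrative sketch of the Wason four-card task (standard textbook
# version; not code from the answer above). Each card shows a letter
# on one side and a number on the other.
CARDS = {"E": "letter", "K": "letter", "4": "number", "7": "number"}

def must_flip(card: str) -> bool:
    """A card can falsify 'vowel -> even' only if it shows a vowel
    (its hidden number might be odd) or an odd number (its hidden
    letter might be a vowel)."""
    if CARDS[card] == "letter":
        return card in "AEIOU"   # vowel: hidden side could be odd
    return int(card) % 2 == 1    # odd number: hidden side could be a vowel

print([c for c in CARDS if must_flip(c)])  # -> ['E', '7']
# Most subjects instead choose 'E' and '4' -- the confirming cards --
# which is the confirmation bias Wason documented.
```

The same logical structure, reframed as a social rule, is what Cosmides used to show the improvement described below.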

Interestingly, there are instances where we appear to reason very naturally. Leda Cosmides has evidence that we avoid the confirmation bias when reasoning about social contracts. She attributes this ability to an evolved social-reasoning process. Some of my own research (still under way) shows similar effects for non-social, evolved processing.