Early in How to Think I quote this passage from Daniel Kahneman’s Thinking, Fast and Slow:
Except for some effects that I attribute mostly to age, my intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy as it was before I made a study of these issues. I have improved only in my ability to recognize situations in which errors are likely: “This number will be an anchor…,” “The decision could change if the problem is reframed…” And I have made much more progress in recognizing the errors of others than my own.
This seems like quite a distressing confession of failure from the person who has done more than anyone else to teach us about thinking! But Kahneman’s point here is that what he calls System 1 — the element of our thinking that “operates automatically and quickly, with little or no effort and no sense of voluntary control” — is “not readily educable.” It just does what it does, and we have little power to change it. But we do have the power to recognize its work in us, and if we do, then we, like Kahneman himself, can achieve a realistic assessment of our cognitive shortcomings. That Kahneman has internalized the results of his own research becomes clear when we notice how readily he acknowledges his own errors.
When a group of scholars wrote in a blog post that the chapter on priming in Thinking, Fast and Slow relied on seriously flawed studies, Kahneman actually showed up in the comments to agree that the critique was sound:
What the blog gets absolutely right is that I placed too much faith in underpowered studies. As pointed out in the blog, and earlier by Andrew Gelman, there is a special irony in my mistake because the first paper that Amos Tversky and I published was about the belief in the “law of small numbers,” which allows researchers to trust the results of underpowered studies with unreasonably small samples. We also cited Overall (1969) for showing “that the prevalence of studies deficient in statistical power is not only wasteful but actually pernicious: it results in a large proportion of invalid rejections of the null hypothesis among published results.” Our article was written in 1969 and published in 1971, but I failed to internalize its message.
Perhaps it’s because Kahneman understands cognitive biases so well that he’s not surprised when he’s guilty of them. But there may be other forces at work too. For instance, it could be that Kahneman’s position in his field is so secure that admitting the occasional error can’t damage it. Yet security doesn’t usually produce that kind of candor; rather, some of the most successful scholars are among the touchiest about criticism. I think we might have to fall back on the old notion of character: however that character was formed, Kahneman seems to have become a person who isn’t always “talking for victory,” as Samuel Johnson put it, and who doesn’t see himself as a glorious exception to a rule that covers the behavior of others. That’s rare, that’s commendable, that’s to be emulated.