The article argues that some experts perform no better than chance when making judgments in their own field. Indeed, some of their decisions are influenced by external factors such as price and reputation. In other words, the pronounced quality of a wine could be predicted from its price and the label on the bottle.
Daniel Kahneman writes:
Earlier I traced people’s confidence in a belief to two related impressions: cognitive ease and coherence. We are confident when the story we tell ourselves comes easily to mind, with no contradiction and no competing scenario. But ease and coherence do not guarantee that a belief held with confidence is true. The associative machine is set to suppress doubt and to evoke ideas and information that are compatible with the currently dominant story. A mind that follows WYSIATI [What You See Is All There Is] will achieve high confidence much too easily by ignoring what it does not know. It is therefore not surprising that many of us are prone to have high confidence in unfounded intuitions. Klein and I eventually agreed on an important principle: the confidence that people have in their intuitions is not a reliable guide to their validity. In other words, do not trust anyone— including yourself— to tell you how much you should trust their judgment.
Kahneman, Daniel (2011-11-03). Thinking, Fast and Slow (pp. 239-240). Penguin UK. Kindle Edition.
Kahneman postulates that there are two basic conditions for developing skilled intuition:
- an environment that is sufficiently regular to be predictable
- an opportunity to learn these regularities through prolonged practice
When both these conditions are satisfied, intuitions are likely to be skilled. Chess is an extreme example of a regular environment, but bridge and poker also provide robust statistical regularities that can support skill. Physicians, nurses, athletes, and firefighters also face complex but fundamentally orderly situations. The accurate intuitions that Gary Klein has described are due to highly valid cues that the expert’s System 1 has learned to use, even if System 2 has not learned to name them. In contrast, stock pickers and political scientists who make long-term forecasts operate in a zero-validity environment. Their failures reflect the basic unpredictability of the events that they try to forecast.
Kahneman, Daniel (2011-11-03). Thinking, Fast and Slow (p. 240). Penguin UK. Kindle Edition.
This leaves the question of why the expert wine tasters performed no better than chance. They had a predictable and regular environment, and they, presumably, had a great deal of practice. Kahneman's two conditions are necessary but not sufficient. Something else is needed.
Kahneman had written earlier about the inconsistency of expert judgments, noting that:
The widespread inconsistency is probably due to the extreme context dependency of System 1. We know from studies of priming that unnoticed stimuli in our environment have a substantial influence on our thoughts and actions. These influences fluctuate from moment to moment.
Kahneman, Daniel (2011-11-03). Thinking, Fast and Slow (p. 225). Penguin UK. Kindle Edition.
The priming effects of the price and the label overrode any contradictory evidence from the wine taster's palate.
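This effect is easy to see in a toy model. The sketch below is my own illustration, not anything from Kahneman's book: a hypothetical "taster" judges which of two wines is better by blending a noisy read from the palate with an irrelevant price cue. The `priming_weight` parameter and the noise levels are invented for the demonstration; the point is only that as the irrelevant cue dominates, accuracy drifts toward chance.

```python
import random

random.seed(42)

def primed_judgment(quality, price_signal, priming_weight):
    """Hypothetical judgment model: a weighted blend of the palate's
    noisy read of true quality and an irrelevant price/label cue."""
    palate = quality + random.gauss(0, 0.1)  # palate sees quality, with noise
    return (1 - priming_weight) * palate + priming_weight * price_signal

def accuracy(priming_weight, trials=10_000):
    """Fraction of pairwise comparisons where the judged-better wine
    really is the better one."""
    correct = 0
    for _ in range(trials):
        q_a, q_b = random.random(), random.random()   # latent qualities
        p_a, p_b = random.random(), random.random()   # prices: pure noise,
                                                      # uncorrelated with quality
        a_judged_better = (primed_judgment(q_a, p_a, priming_weight)
                           > primed_judgment(q_b, p_b, priming_weight))
        correct += a_judged_better == (q_a > q_b)
    return correct / trials

print(f"palate only (weight 0.0): {accuracy(0.0):.2f}")  # well above chance
print(f"heavily primed (0.9):     {accuracy(0.9):.2f}")  # close to 0.5, i.e. chance
```

Nothing about the specific numbers matters; any model in which an uninformative cue is weighted heavily will show the same slide from skill toward coin-flipping.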
And yet, I find that I am called upon to give expert opinions on things that are, essentially, unpredictable. I think my managers want reassurance in the face of uncertainty. No wonder there is a strong need for charlatans.
Building up expertise takes time and a lot of pain: pain in the analysis, and pain in making mistakes. Some wag once said that an expert is one who has made all of the possible mistakes in their field.
And the most important attribute to develop is the recognition that one does not know. The difficulty lies in overcoming the Dunning–Kruger effect, which is:
…a cognitive bias in which unskilled individuals suffer from illusory superiority, mistakenly rating their ability much higher than average. This bias is attributed to a metacognitive inability of the unskilled to recognize their mistakes.