If you study Onassis’s life, as I spent part of my early adulthood doing, you will notice an interesting regularity: “work,” in the conventional sense, was not his thing. He did not even bother to have a desk, let alone an office. He was not just a dealmaker, a trade that does not necessitate an office; he also ran a shipping empire, which requires day-to-day monitoring. Yet his main tool was a notebook, which contained all the information he needed. Onassis spent his life trying to socialize with the rich and famous, and to pursue (and collect) women. He generally woke up at noon. If he needed legal advice, he would summon his lawyers to some nightclub in Paris at two A.M. He was said to have an irresistible charm, which helped him take advantage of people.
Let us go beyond the anecdote. There may be a “fooled by randomness” effect here, of making a causal link between Onassis’s success and his modus operandi. I may never know if Onassis was skilled or lucky, though I am convinced that his charm opened doors for him, but I can subject his modus to a rigorous examination by looking at empirical research on the link between information and understanding. So this statement, that additional knowledge of the minutiae of daily business can be useless, even toxic, is indirectly but quite effectively testable.
Show two groups of people a blurry image of a fire hydrant, blurry enough for them not to recognize what it is. For one group, increase the resolution slowly, in ten steps. For the second, do it faster, in five steps. Stop at a point where both groups have been presented with an identical image and ask each of them to identify what they see. The members of the group that saw fewer intermediate steps are likely to recognize the hydrant much faster. Moral? The more information you give someone, the more hypotheses they will formulate along the way, and the worse off they will be. They see more random noise and mistake it for information.
The problem is that our ideas are sticky: once we produce a theory, we are not likely to change our minds—so those who delay developing their theories are better off. When you develop your opinions on the basis of weak evidence, you will have difficulty interpreting subsequent information that contradicts these opinions, even if this new information is obviously more accurate. Two mechanisms are at play here: the confirmation bias that we saw in Chapter 5, and belief perseverance, the tendency not to reverse opinions you already have. Remember that we treat ideas like possessions, and it will be hard for us to part with them.
The fire hydrant experiment was first done in the sixties, and replicated several times since. I have also studied this effect using the mathematics of information: the more detailed knowledge one gets of empirical reality, the more one will see the noise (i.e., the anecdote) and mistake it for actual information. Remember that we are swayed by the sensational. Listening to the news on the radio every hour is far worse for you than reading a weekly magazine, because the longer interval allows information to be filtered a bit.
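To make the hourly-news-versus-weekly-magazine point concrete, here is a minimal sketch, not the actual mathematics alluded to above, assuming a toy model in which a quantity drifts slowly upward (the signal) while being buffeted by much larger Gaussian noise; the drift and volatility figures are invented for illustration. The more frequently you sample the changes, the smaller the signal-to-noise ratio of each observation, so the hourly reading is almost pure noise.

```python
import numpy as np

# A minimal sketch with assumed toy parameters: a quantity drifts slowly
# upward (the "signal") while being buffeted by much larger Gaussian noise.
# We observe its changes at different frequencies and ask how often a single
# observation even has the right sign.

rng = np.random.default_rng(0)

mu = 0.05 / 252              # assumed true drift per trading day
sigma = 0.20 / np.sqrt(252)  # assumed noise (volatility) per trading day

def right_sign_fraction(interval_days, n=100_000):
    """Fraction of observed changes whose sign agrees with the true drift."""
    changes = mu * interval_days + sigma * np.sqrt(interval_days) * rng.normal(size=n)
    return (changes > 0).mean()

for label, days in [("hourly", 1 / 8), ("daily", 1), ("weekly", 5), ("yearly", 252)]:
    snr = mu * np.sqrt(days) / sigma  # per-observation signal-to-noise ratio
    print(f"{label:>6}: signal/noise = {snr:.3f}, "
          f"right-sign observations = {right_sign_fraction(days):.1%}")
```

Under these assumed numbers, the hourly observer sees a change pointing in the right direction barely more than half the time, while the yearly observer is right about the direction roughly sixty percent of the time: the extra readings add almost nothing but noise.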
In 1965, Stuart Oskamp supplied clinical psychologists with successive files, each containing an increasing amount of information about patients; the psychologists’ diagnostic abilities did not grow with the additional supply of information. They just got more confident in their original diagnosis. Granted, one may not expect too much of psychologists of the 1965 variety, but these findings seem to hold across disciplines.
Finally, in another telling experiment, the psychologist Paul Slovic asked bookmakers to select from eighty-eight variables in past horse races those that they found useful in computing the odds. These variables included all manner of statistical information about past performances. The bookmakers were given the ten most useful variables, then asked to predict the outcome of races. Then they were given ten more and asked to predict again. The increase in the information set did not lead to an increase in their accuracy; their confidence in their choices, on the other hand, went up markedly. Information proved to be toxic. I’ve struggled much of my life with the common middlebrow belief that “more is better”—more is sometimes, but not always, better. This toxicity of knowledge will show in our investigation of the so-called expert.
THE EXPERT PROBLEM, OR THE TRAGEDY OF THE EMPTY SUIT