With 2019 approaching, I’ve been immersed in a book about how people make decisions: The Undoing Project, by Michael Lewis. He’s the author of Moneyball. You remember the Oakland A’s: back in 2002, they were a losing baseball team with no money. To improve how they chose their draft picks, they brought in a statistics geek, who studied players’ performance and translated their moves into numbers. When the A’s shifted to relying less on expert opinion and more on data, they started winning. But there was a backlash from the experts: This is an insult! How can some number-crunching outsider make better choices than me, when I’ve known this sport my whole life? Michael Lewis writes:
“In 2004, after aping Oakland’s approach . . . , the Boston Red Sox won their first World Series in nearly a century. Using the same methods, they won it again in 2007 and 2013. But in 2016, after three disappointing seasons, they announced that they were moving away from the data-based approach and back to one where they relied upon the judgment of baseball experts.” According to Lewis, the problem wasn’t a failure of the approach; it was human nature. Everybody knew, because economists told them so, that human beings make rational decisions based on enlightened self-interest. Using data fit that model: we get a realistic picture of the odds, and we choose the best match between what we want and what’s likely. But Lewis kept seeing that people aren’t satisfied by probability. They “hunger for an expert who knows things with certainty, even when certainty is not possible.”
That hunger was confirmed by Daryl Morey, the general manager of the Houston Rockets, who came to basketball from consulting. He also believed in data, and that got him in trouble on Wall Street: he wasn’t certain enough. “We’d tell our clients we could predict the 10-year price of oil; but no one can predict the price of oil. The best we can do is probabilities.” And his bosses said, “We’re billing these clients 500 grand a year, so you have to be certain.”
After he wrote Moneyball, Michael Lewis learned he wasn’t the first person to question the rational self-interest view of decision-making. A review of the book noted, “The . . . ways an expert’s judgments might be warped by the expert’s own mind were described, years ago, by a pair of Israeli psychologists, Daniel Kahneman and Amos Tversky.”
Who the heck were Kahneman and Tversky?
Well, it turned out Amos Tversky had won a MacArthur genius grant, and Daniel Kahneman had won the Nobel Prize in economics. I found out about their work in 2011. Like Michael Lewis, I was so intrigued that I read the whole of Kahneman’s long, amazing book Thinking, Fast and Slow. At age 21, when Danny Kahneman was a new psychologist serving his year in the Israeli army, he was assigned to evaluate other new recruits: watch them in an exercise and pick the best officer candidates—much like Moneyball. He said, “We noted who took charge, who tried to lead and was rebuffed, how cooperative each soldier was . . . who seemed to be stubborn, submissive, arrogant, patient, hot-tempered, persistent, or a quitter. . . . The impression we had of each candidate’s character was as direct and compelling as the color of the sky. . . . We were quite willing to declare, ‘This one will never make it,’ ‘That fellow is rather mediocre,’ or ‘He will be a star.’” Later Kahneman tested his predictions against the outcomes—how did these guys actually perform in officer training? He discovered his predictions were worthless. But because it was the army, and the assessments had to be made, he kept on making them.
Danny Kahneman realized he had stumbled on a crucial quirk of human nature. Our judgment is not as reliable as we think, for a lot of reasons; but even when we know this, we rely on it anyway. He and Amos Tversky found this to be true in all kinds of situations. Sports team managers, economists, financial consultants, judges, doctors—even scientific researchers, including the two of them—have a tendency to fall in love with their own theories. “They fit the evidence to the theory rather than the theory to the evidence. They cease to see what’s right under their nose. Everywhere one turned, one found idiocies that were commonly accepted as truths only because they were embedded in a theory to which the scientists had yoked their careers.” Again, like Moneyball.
The blind spots in human judgment become dangerous in fields like medicine, where patients are begging for certainty and doctors can thrive by pretending they have it. A study at the Oregon Research Institute found that when radiologists were shown X-rays of stomach ulcers and asked how likely each one was to be cancerous, their diagnoses didn’t agree—and if you showed the same X-ray to the same doctor at two different times, he often didn’t agree with himself. Or in politics, where voters will reject a good candidate because of their own unrecognized biases, and choose the one who claims most loudly: “I know exactly what’s wrong, and I can fix it.” Or religion, where a church promises that if you do A, B, and C, you’ll go to heaven, but if you do X, Y, or Z, you’ll burn in hell, even though in all of human history there is zero evidence that this is true.
One of Danny Kahneman’s great qualities as a researcher was that he didn’t ask predictable questions. He didn’t say, “Why do you believe this when it’s obviously wrong? Are you stupid or what?” He’d say, “We know people believe this, even when it’s obviously wrong; why is that? What kink in human nature is at work here, and how can we stop it from doing damage?” He and Tversky found that adult patterns of thinking are pretty hard to change. They did better with children. Kids are learning about everything for the first time. They don’t have so much skin in the game. If they make a mistake, they’re likely to think it’s funny and interesting, and to go at it differently the next time.
But I do think that whether or not we older folks can rewire our own habits of making judgments and decisions, it helps to recognize these blind spots when we deal with other people. When a doctor or a politician or any other expert says, “Oh, I can fix that, no problem,” don’t just say, “Oh, thanks! That’s great!” Go home; check his record. Look her up on the Internet. Find other expert opinions.
Besides Moneyball, Michael Lewis also wrote the Wall Street book The Big Short. The movie version opens with a great line from Mark Twain: “It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.” But that ain’t so: Mark Twain never said it. The book itself opens with a line from Leo Tolstoy: “The simplest thing cannot be made clear to the most intelligent man, if he is firmly persuaded that he knows already, without a shadow of a doubt, what is laid before him.”
So as we welcome 2019, my resolution is “Trust, but verify.” Don’t condemn people for their quirks and blind spots (that includes me); but watch out for opinions that aren’t backed by evidence. And have a happy new year.
This post originated as a meditation on 12/30/18 for The Church for the Fellowship of All Peoples.