Are You Of Two Minds? Michael Lewis' New Book Explores How We Make Decisions | KERA News


Dec 6, 2016

We like to think our brains can make rational decisions — but maybe they can't.

The way risks are presented can change the way we respond, says best-selling author Michael Lewis. In his new book, The Undoing Project, Lewis tells the story of Daniel Kahneman and Amos Tversky, two Israeli psychologists who made some surprising discoveries about the way people make decisions. Along the way, they also founded an entire branch of psychology called behavioral economics.

Lewis is also the author of Moneyball, which is about trusting statistics over intuition to build a successful baseball team. He tells NPR's Audie Cornish what Kahneman and Tversky were looking for, and how the Obama administration has put their findings to use.


Interview Highlights

On how framing something as a loss or a gain can affect the way people make decisions

If you have a patient in a doctor's office who's just been told they have terminal cancer but there's this operation they could perform right now that might save their lives. ... They have a 90 percent chance of surviving the operation — if you tell them that, they respond one way. If you tell them ... that they have a 10 percent chance of being killed by the operation, they are about three times less likely to have the operation.

If you frame something as a loss — 10 percent chance of dying — as opposed to as a gain — 90 percent chance of living — people respond entirely differently. They make a different decision.

On what Kahneman and Tversky were searching for

I think their central question is, "How does the mind work?" Not when it's in an emotional state, but when it thinks it's in a rational state; when it's faced with a judgment or a decision to make, what's it doing? That's what they were interested in.

On how the Obama administration has put Kahneman and Tversky's findings to use

The Obama administration actually had, and has, a unit in it that's responsible for framing decisions in a way that leads people to maybe make better decisions. This unit, for example, has gone through all the federal pension plans, many of which were opt-in: required a worker to check a box if he wanted to save a certain amount of money. They changed these plans to opt-out: You had to check a box if you didn't want to save a certain amount of money.

Just that change in what's called the "choice architecture" leads people to save massively more money. And that was just one of the things that Danny and Amos' work revealed: the importance of framing.

On applying Kahneman and Tversky's findings to the 2016 election

I filtered the election through Danny and Amos. They did these wonderful unfinished studies about how the human imagination worked. They called it "The Undoing Project." And they studied, briefly, how people undid tragedy.

So after the election, people who found the result tragic were obeying some of the rules of the imagination that Danny and Amos described: focusing on the FBI director, for example, at the end. Danny and Amos had pointed out that when people endure a tragedy and they try to undo it in their minds so that they get to some alternative reality where it didn't happen, they start at the end and they undo the last thing that happened. So they undo the field goal kicker missing the field goal, and they undo the grounder going through the legs of the first baseman in the ninth inning. ...

Whereas, you know, you can think of a thousand different things that could have happened — and that were more probable [than the actions of FBI Director James Comey] — that could have ended with Trump not being president.

On what gave him the idea for his new book

When [Moneyball] was published it was pointed out to me in a review that the mistakes that people make when they're judging other people had been described well by these two Israeli psychologists in the work they'd done in the 1970s. And I was oblivious to it.

This book, if anything, is like the prequel to Moneyball. It explains why experts' intuitive judgments can go wrong and why you need to have data to rely on as a check against the judgments of these experts.

Copyright 2017 NPR. To see more, visit http://www.npr.org/.

AUDIE CORNISH, HOST:

We like to think our brains are able to make rational decisions, but maybe they can't. Take this example.

MICHAEL LEWIS: If you have a patient in a doctor's office who's just been told they have terminal cancer but there's this operation they could perform right now that might save their lives...

CORNISH: Author Michael Lewis says the way the risks are presented can change the way we respond.

LEWIS: They have a 90 percent chance of surviving the operation. If you tell them that, they respond one way. If you tell them, on the other hand, that they have a 10 percent chance of being killed by the operation, they are about three times less likely to have the operation. If you frame something as a loss - 10 percent chance of dying - as opposed to as a gain - 90 percent chance of living - people respond entirely differently. They make a different decision.

CORNISH: And Michael Lewis has been studying research on the human mind for his latest book "The Undoing Project." It chronicles the lives of two men who made that discovery and many others about the surprising way humans make decisions. In fact, Amos Tversky and Daniel Kahneman founded an entire branch of psychology called behavioral economics.

Michael Lewis' interest in these two psychologists came out of one of his other books, "Moneyball," which was about trusting statistics instead of intuition to build a successful baseball team.

LEWIS: When the book was published, it was pointed out to me in a review that the mistakes that people make when they're judging other people had been described well by these two Israeli psychologists in work they'd done in the 1970s. And I was oblivious to it.

This book, if anything, is like the prequel to "Moneyball." It explains why experts' intuitive judgments can go wrong and why you need to have data to rely on as a check against the judgments of these experts.

CORNISH: So they're Israeli psychologists - Daniel Kahneman and Amos Tversky. What was the kind of central question that fueled their research?

LEWIS: I think their central question is, how does the mind work not when it's in an emotional state but when it thinks it's in a rational state? When it's faced with a judgment or a decision to make, what's it doing? That's what they were interested in.

CORNISH: You've also talked about the influence that their research has today or had on the Obama administration (laughter) and the White House. Can you give us an example?

LEWIS: The Obama administration actually had and has a unit in it that's responsible for framing decisions in a way that leads people to maybe make better decisions. This unit, for example, has gone through all the federal pension plans, many of which were opt-in, required a worker to check a box if he wanted to save a certain amount of money.

They changed these plans to opt-out. You had to check a box if you didn't want to save a certain amount of money. Just that change in what's called the choice architecture leads people to save massively more money. And that was just one of the things that Danny and Amos' work revealed - the importance of framing.

CORNISH: Now, I can imagine that this election in particular (laughter), with the perception that data-driven journalism, like, focused too much on the polls and, quote, unquote, "got it wrong," must have been really interesting to you. I mean, how did this book make you think about that conversation?

LEWIS: I filtered the election through Danny and Amos. They did these wonderful unfinished studies about how the human imagination worked. They called it the Undoing Project. And they studied, briefly, how people undid tragedy. So after the election, people who found the result tragic were obeying some of the rules of the imagination that Danny and Amos described - focusing on the FBI director, for example, at the end.

They had pointed out - Danny and Amos had pointed out that when people endure a tragedy and they try to undo it in their minds so that they get to some alternative reality where it didn't happen, they start at the end, and they undo the last thing that happened. So they undo the field goal kicker missing the field goal, or they undo the grounder going through the legs of the first baseman in the ninth inning. Or they undo the last...

CORNISH: And you see that in the way, like, Democrats are focusing on the FBI director...

LEWIS: On Comey.

CORNISH: ...James Comey.

LEWIS: Whereas, you know, you can think of a thousand different things that could have happened and that were more probable that could have ended with Trump not being president.

CORNISH: How did this make you see your own work? Because you also do this in a way - try and see patterns in the past, try and seek order maybe where it's not always clear there's order. I mean, that's just being an author.

LEWIS: You're exactly right, and it makes me skeptical of my own work. When I was working on this book and writing it, I was hoping the reader all along was going to take Amos and Danny's spirit with them - to think about everything I was doing, the patterns I was finding, and whether these were true patterns or false patterns. Because it's not that all patterns are false; it's that we can be misled by false patterns. So it's given me a heightened awareness of the temptation that the mind presents you with to find a story that really isn't true.

CORNISH: Well, Michael Lewis, thank you so much for sharing this research with us.

LEWIS: Thanks for having me.

CORNISH: Michael Lewis' new book is called "The Undoing Project." It's out now. Transcript provided by NPR, Copyright NPR.