2019 Conference


Thinking in Bets

Presented By:

Annie Duke - Author, Thinking in Bets, Former Professional Poker Player



In poker and throughout our lives, we should try to maximize the time we spend in favorable situations and minimize our time in unfavorable ones. Poker players are too quick to quit when they are winning. They look for any excuse to put the session in the (non-existent) win column, and the same players will refuse to quit a losing game. The same thing happens outside poker: sales professionals won't give up on a dead lead; investors are unwilling to sell their losing investments. Annie Duke examines how the interaction of many cognitive biases (including loss aversion and sunk-cost bias) drives this behavior. These tendencies cause us to miss good opportunities and continue playing when the odds are against us. She provides strategies that prevent us from making these poor decisions in the first place and help us take a longer-term view. The strategies apply in the workplace, to parenting, and to other personal decisions.


Annie Duke:    You can get in touch with me at annieduke.com. There's a contact form there. You can also see my newsletter there. The newsletter is supposed to go out once every week. I'm under a deadline right now for my next book so it's a little slower. But you can go and look at the archives there and then subscribe if you want. If you like the content of this talk, then that newsletter is all this kind of content. So hopefully you'll go and look at that. And then, also, you can contact me. I love to hear from people who I've spoken to.

Annie Duke:    So just a little bit more color about me. I started off my adult life as an academic. I was at the University of Pennsylvania, working on getting my PhD in cognitive science. I was studying learning in particular, especially learning in uncertain systems. And I was on my way. In fact, I was on my way to my first job talk at NYU when an illness that had been somewhat chronic became acute, and I ended up in the hospital for two weeks. It's hard to do a job talk from a hospital. So I had to cancel all of my job talks, which were during that two-week period, and take some time to recuperate.

Annie Duke:    When I was at UPenn, I was living on a National Science Foundation fellowship, and I discovered that when you take time off from school, they take your fellowship away. So I needed to figure out a way to make money during that year I had planned to take off before I went back out to become a professor. So just out of the need for money, I started playing poker. It seems a little strange, particularly in the '90s when it was not on television. But my brother was already a professional player. He was the one who suggested that maybe this would be a good way to make money. So off I went. That year, in the meantime, turned into 18 years, which I'm sure is exactly what the National Science Foundation had in mind when they gave me that fellowship to go study cognitive psychology. I think it worked out pretty well.

Annie Duke:    But about eight years into playing poker… I retired in 2012. About eight years in, I got asked by a group to come and talk about how poker might inform people's decision making. And this was the first time that, in a very explicit way, I had thought about how the academic work I had done at Penn really spoke to and married with this very interesting problem I was seeing at the poker table. And the jumping-off point for this type of thinking, and for this very first talk that I ever gave in 2002, was this: in my introductory psychology course, and I'm not talking about graduate school, I'm talking about undergrad, I had learned this particular thing, which I'm sure anybody here who has taken an introductory psychology course will have learned as well: that learning occurs when you have lots and lots of feedback tied closely in time to decisions and actions.

Annie Duke:    So we could think about the classic rat in a box pressing a lever. They press the lever, out comes a pellet. The rat figures out that the lever pressing causes the pellet to appear, and they learn that connection. So this is what I had learned in undergrad. But then, I went and started to play poker. And I said, "Well, this is very interesting." Because I can't think of a human activity where there is more feedback tied more closely in time to decisions and actions than poker. One hand of poker takes approximately about two minutes. There's up to 20 decisions. Every decision gets some sort of feedback associated with it. I bet, I find out if you call or you raise or you fold, or whatever it might be. The end of the hand results in an exchange of chips. And so on and so forth. I mean this is so many decisions, so many actions. And you're getting so much feedback.

Annie Duke:    And yet what I saw was that people really don't learn very well at the poker table. They're actually quite poor at it. I came in as a new player, playing with people who'd been playing for 20 years. And they played basically the exact same way they had played when they first started. And I thought, "Well, this is kind of a puzzle." And I then decided to spend some time trying to unravel this puzzle. For 10 years, I was really thinking about this. And then in 2012, I actually retired to spend all of my time thinking about this, and somehow finally made my way back to academics. So there you go, full circle.

Annie Duke:    So that's what we're really going to talk about today is this problem of what is it that's tripping people up in this spot and how can we maybe start to address it. So does anybody know what situation this is? By the way, I never put this up without people laughing, which I think is really interesting. Does someone want to just describe this situation? Anybody?

Speaker 2:    It's the Super Bowl.

Annie Duke:    The Super Bowl. We know it's the Super Bowl because New England's in it. That's all you need to know, "Oh, New England, then it must be a Super Bowl." Right. So this is 2015, Super Bowl XLIX. And we've got the Seahawks playing New England, and they're down by four. So in order to go ahead here, they need to score a touchdown. The problem for the Seahawks is there's only 26 seconds left in the whole game, because it's the fourth quarter. They're on second down, so they have a maximum of three tries at the touchdown. But with 26 seconds left, that's going to be kind of tough, particularly because they only have one timeout. The good news for them is that they're on the one-yard line.

Annie Duke:    All right, so there's an expectation here about the call that Pete Carroll's going to make. And the expectation is that he's going to have Russell Wilson, the quarterback in the middle there, hand the ball off to Marshawn Lynch, who's standing next to him. Marshawn Lynch is one of the greatest running backs of all time. He's called The Beast. And Marshawn Lynch will, I guess, obviously just barrel through that line of Patriots. So that's the expectation. But I think that most everybody knows Pete Carroll doesn't call that play; he calls something very unexpected. And he has Russell Wilson pass the ball to the corner of the end zone down here. The ball is very famously intercepted by Malcolm Butler. Sad. And they lose the game. And let's hear what the announcer has to say about it. The play is going to happen very quickly. I'm not so worried about that. I'd like you to listen to what the announcers say.

Al Michaels:    Play clock at five. Pass is intercepted at the goal line by Malcolm Butler. Unreal.

Cris C.:    They tried a pick play, Al. They tried to go here but he beats them to the punch. And I'm sorry, but I can't believe the call.

Al Michaels:    Me neither.

Cris C.:    I cannot believe the call. You've got Marshawn Lynch in the back field. You've got a guy that's been borderline unstoppable in this part of the field. I can't believe the call.

Al Michaels:    It's got to be one of the dumbest calls offensively in Super Bowl history.

Annie Duke:    All right, well what you mostly heard there was Cris Collinsworth saying, "I can't believe the call," over and over again. Now, did anybody, when they were listening to Cris Collinsworth call that play, did anybody hear any statistics in there? Did you hear anywhere what the chances of an interception in that situation are? No. Do you think that would be a helpful piece of information to know? Sure. So that's interesting. I mean but to give Cris Collinsworth a little credit here, he's doing it in game, so maybe he doesn't have his slide rule right there and he can't pull it out. But obviously once people have some time to actually think about this play, you're going to see that they're going to start bringing in some of this information that you might need. And maybe we're going to see a more measured view of what the quality of that play is.

Annie Duke:    Oh no, it gets worse. So here are takes from the next day. So obviously these people have some time to think about the play. And what you can see is it's kind of a battle of was it the worst play call in Super Bowl history or just the worst play call in all of NFL history, period? So people clearly don't like this play. Now, here's what Pete Carroll had to say about it. This was on Good Morning America. He was being pressed to essentially sort of prostrate himself in front of the country with a mea culpa here. And on being pressed to say that it was indeed the worst call in Super Bowl history, he says, "It was the worst result of a call ever." That emphasis is mine. And he adds, "The call would have been a great one if we catch it." So I agree with him here. And then he says, "It would have been just fine, and nobody would have thought twice about it." That's where I disagree with him.

Annie Duke:    So I'd like us all to take a moment to do the thought experiment. Let's imagine that Pete Carroll does this very strange thing. Right? It's very unexpected. Everybody thinks he's going to hand the ball off to Marshawn Lynch, and instead he passes the ball to the corner of the end zone and it's caught for the game-winning touchdown. What do you think the headlines look like the next day? Anyone want to take a stab?

Speaker 5:    Brilliant.

Speaker 6:    Carroll's a genius.

Annie Duke:    Brilliant, genius. Exactly. All of those words. Out-Belichicked Belichick. So, in fact, it's not that they wouldn't have thought twice about it; they would have done just the opposite. They would have homed in on that play if it turned out well, and they would have talked about what a genius Pete Carroll is. Now, here's the thing: we don't need to know how it turned out in that moment to figure out whether the decision was good or bad. Because once Pete Carroll calls the play, the quality of the play call is set. It's only one try. So on that one try, whether it's intercepted or whether it's caught for a touchdown actually shouldn't, from a rational standpoint, change our opinion about the play too much. But we can see that it does. And, in fact, we don't actually need to do the thought experiment. Because two years later, we actually get a very good example of how much the outcome matters to the way that we view the decision.

Annie Duke:    Okay, so the Patriots are here, so we know it's a Super Bowl again. I'm saying, right? It's correlated at one basically. In this particular case, the Eagles… Go Eagles. I'm from Philadelphia. The Eagles are up by three against New England. This happens to be the end of the first half, and it's fourth and goal. So they've only got one play here. It just turns out that the mathematically correct thing to do here is actually to go for it, to go for a touchdown. Doug Pederson does the mathematically correct thing. The expected call is a field goal, but he does something unusual and correct. And what you're going to see is Nick Foles, the quarterback, ending up in the end zone catching this ball. So this is unexpected in a lot of different ways. We're very lucky to have Cris Collinsworth announce this one for us as well. You know, Cris Collinsworth is a consistent guy, so obviously he's going to say, "I can't believe the play," over and over again. And now we can listen to him say that.

Speaker 7:    Fourth and goal. Uh-oh. And they're going to snap it and it's Trey Burton who throws. Caught. Foles, touchdown.

Cris C.:    Here we go. They've got all these signals. Foles would not have been eligible but for the fact he was in the shotgun, not under center. And there's the old basketball player. They talk about Nick Foles, he's more comfortable playing quarterback like a point guard. And here he is receiving the pass that is going to be shown a billion times.

Annie Duke:    Okay, I was wrong about what he said. So what we can see is that here we have the thought experiment in action. In one case, it's a very unexpected play that fails. Cris Collinsworth thinks it's awful. The press agrees. In another case, it's a very unexpected play that succeeds. Cris Collinsworth thinks it's great. The press agrees. So what's going on? This is called resulting. Resulting is a mental shortcut we take where basically we say, "If I know the quality of the outcome, if I know how it turns out, that tells me what I need to know about the quality of the decision." So we act like we can work backwards from outcome quality, which we can clearly see, interception versus touchdown, right, we can see the difference between those two things, to the quality of the decision.

Annie Duke:    Now, why do we do this? Well, because it's really, really hard to unpack what the quality of a decision is. It's complicated. We don't have a decision tree appear right in front of our faces so that we can just quickly work out the math. We're not living in the matrix where all the code is coming down in front of us. And to give you an idea of how complicated it is to understand that Pete Carroll play, I'll just really quickly, so I don't leave you hanging, just tell you a couple of things about that play.

Annie Duke:    I agree that you should get the ball to Marshawn Lynch twice. But the problem is that Pete Carroll only has 26 seconds left. So if he hands the ball off to Marshawn Lynch… And remember, he has one timeout. If he hands the ball off to Marshawn Lynch and they fail to get through that pile of Patriots, which, by the way, is going to happen most of the time (you're mostly going to fail, and sometimes succeed), what's happening? The clock is running. Pete Carroll has to call his timeout, and now they can hand it off to Marshawn Lynch again. So they get to try for the end zone twice.

Annie Duke:    But what if he passes first? If he passes first, there are three outcomes, aside from fumbles and sacks, which are roughly equal across the two choices. There's three things. Touchdown: he's a genius. Dropped pass: the clock stops really fast. What do they get to do then? Hand it off to Marshawn Lynch twice again. Or an interception. So essentially by throwing a pass somewhere in there, you're getting an option on a third play, which, when you're trying to win the Super Bowl against the Patriots, is a really important thing. So the question then becomes, what's the cost of the option? And the cost is the interception rate. However often that ball is intercepted is the cost of the option for a third try at the end zone. That's what we need to know in order to evaluate the quality of this play. The cost of that option is somewhere between 1% and 2%, the rate at which that ball gets intercepted.
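
The option-value argument here can be made concrete with a quick back-of-the-envelope calculation. This is only a sketch: the run and completed-pass probabilities below are invented placeholders, and only the roughly 1% to 2% interception rate comes from the talk.

```python
# Compare two strategies from the one-yard line with 26 seconds left
# and one timeout. All probabilities are illustrative assumptions
# except the interception rate, which the talk puts at roughly 1-2%.

p_run_td = 0.50    # assumed chance a single run scores
p_pass_td = 0.40   # assumed chance the pass is caught for a touchdown
p_pass_int = 0.02  # interception rate cited in the talk
p_pass_inc = 1 - p_pass_td - p_pass_int  # incompletion stops the clock

# Run first: the clock only leaves room for two run attempts.
p_score_run_first = 1 - (1 - p_run_td) ** 2

# Pass first: a touchdown wins outright; an incompletion stops the
# clock and preserves two run attempts, the "option on a third play."
p_score_pass_first = p_pass_td + p_pass_inc * (1 - (1 - p_run_td) ** 2)

print(f"run first:  {p_score_run_first:.3f}")   # 0.750
print(f"pass first: {p_score_pass_first:.3f}")  # 0.835
```

Under these made-up numbers the pass-first sequence scores more often; the exact figures matter less than the structure, in which the interception rate is the price paid for an extra try at the end zone.
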
Annie Duke:    Now, have I sort of made people think twice about the way that they view that play a little bit with that? Yeah, okay. Good. All right. But notice what I just had to do. I just had to walk you through a whole bunch of stuff. I had to talk about probabilities. I had to discuss a little options theory with you. This is very complicated stuff. So as human beings, what do we do? We say, "Well, I know how it turned out, so it's fine. Now I know it's a bad decision." But that's not the way that decision making in life works, right? Because the issue is that life just doesn't fit very neatly on a chess board. It actually fits more naturally at a poker table. So we could think about this.

Annie Duke:    In chess, if the only thing that you know, the only piece of information that you have is that I played chess with somebody and I lost, what do you know about my decision making in comparison to theirs? You only know the outcome, I lost at the game of chess. But we actually do know something in that case about my decision making. It was worse than the person I played with. If I play poker against… Well, that's not fair because you guys know how I play poker. I'm not a random poker player. But these two guys play poker right here, right? And I tell you, you haven't seen them play. Right? And I just come over and I say… Austin, is that your name?

Speaker 5:    Austin.

Annie Duke:    That's so convenient because we're in Austin. I say, "They played for a half hour and Austin lost." And I ask you, "Do you know anything about who was the better player there?" No, not in poker, right? So what's going on? Because there's a difference between these two things. And when we engage in this resulting, we act like we're playing chess where it's totally fine to work backwards. But we know that in a game like poker it's not. And it doesn't work very well in life. So what's going on?

Annie Duke:    Well, we could think about… This is sort of a general definition of poker: it's a game of decision making under conditions of uncertainty over time. Let's not worry about the time part. But the uncertainty part, let's really worry about. There are two sources of uncertainty in poker that don't really exist in a game like chess, but they really exist in almost every decision that you make in your life. The first source of uncertainty is luck. Once Pete Carroll calls that play, that's it. Anything else that happens, from his perspective, is due to luck. Whether that ball gets intercepted or not, he has no control over.

Annie Duke:    This is true in poker. In poker, I could have aces and you could have fives. That means that you're going to win that hand 18.5% of the time. But you're going to win 18.5% and I have no control over when that 18.5% is going to occur. It just exists out there, and there's that influence because of the random deal of the cards. Chess doesn't have that same strong influence of luck. Nobody's rolling dice and then all of a sudden your bishop comes off the board. The pieces are going to stay where they are. That's source number one.

Annie Duke:    The second source of uncertainty is hidden information. I can't see my opponents' cards. I don't know what they have. Pete Carroll doesn't know what defense Bill Belichick is going to call. He doesn't know what plays he has in his back pocket. He doesn't know how that's going to go. There's just stuff that he doesn't know. And in chess, that's just not really true, right? I can see my opponent's whole position. I can see all of their pieces. And that means that if I'm considering a move, I know every single possible move that that player could make in response. And I know the pieces aren't randomly going to move around in some way that I didn't anticipate or couldn't control. And then I'm going to figure out how I could move in response to any move that they could make, and so on and so forth, until, if I had a big enough computer, I could do that till the end of the game, and I could actually get to an answer for the game.

Annie Duke:    So I have on here, who's John von Neumann and why should you care? So I'll just ask, does anybody know who John von Neumann is? Any John von Neumanns? Yes. That's about average for the room. Congratulations. Okay, but I'm going to try to get us there. Does anybody know who John Nash is? Good. Does anybody know who Russell Crowe is? All right. Has anybody seen the movie A Beautiful Mind? So in A Beautiful Mind, Russell Crowe played an economist named John Nash. John Nash had schizophrenia, and he was a Nobel Laureate; he won a Nobel Prize in a field called game theory. Game theory was actually… The fundamentals of game theory were put forth in a book called The Theory of Games by a guy named John von Neumann, along with somebody named Oskar Morgenstern.

Annie Duke:    Von Neumann's a super big person in the history of science. He's the father of the modern computer. He ran the Manhattan Project. He came up with the concept of mutually assured destruction, and in fact was the architect of our Cold War strategy. And on the side, he wrote this book, The Theory of Games, which was laying out the fundamentals of game theory. And if I gave you a definition of game theory it would be decision making under conditions of uncertainty over time. That seems awfully close to the definition of poker. That's not accidental because von Neumann actually based the mathematics on a simplified version of poker. Because he understood this very significant difference between chess and poker. He understood this resulting problem. And so poker actually gives us a really good way to think about human decision making, both in terms of the way that it goes wrong and the way that it might go right.

Annie Duke:    And we can take this idea farther by then saying, "Well, if we know that human decision making really acts a lot like poker, we can actually think about any decision that you make as a bet." So I think that people get hung up on this concept because they think about betting as like, "I go into a casino and I bet on red," and then they spin a roulette wheel and then let's see what happens. That's really just a very particular form of betting. Betting is whenever you're investing limited resources into an uncertain outcome. So when we think about what a bet is, it's like we've got some sort of limited resource. It could be time, it could be money, it could be good will, it could be anything like that. We've got a set of options under consideration. And any of those options is going to result not in a future that's determined but in a set of possible futures. And each outcome in that set has some probability attached to it.

Annie Duke:    So it's like a touchdown is 40%-ish. An interception is 2%. A dropped pass is a little bit less than 60%. And there you go, those are the possible outcomes that could occur, each with probabilities attached to them. And across that, you can figure out what the payoffs are so that you can figure out what the best possible future is going to occur over time. But you can't guarantee the particular outcome that's going to occur. And you have to figure out where you want to put your limited resources.

Annie Duke:    Now, a lot of times when we think about betting, we think that there has to be somebody on the other side of the bet. That's also not true. Because there is one person who is always on the other side of the bet, and thinking about them gets you past that idea: you are always on the other side of your own bets. Let me explain. There's a you that would result from taking option A and a you that would result from taking option B. And essentially, when you choose option A, you're betting against the you that takes option B. So you go into a restaurant and you're like, "Should I have the chicken or the fish? I don't know." You decide on the chicken. It comes back. It's the worst chicken you've ever had. And we've all experienced that feeling. What do we say to ourselves? "I should've ordered the fish." Right? So that's future you, who would've been having the fish, saying, "You should've bet on me."

Annie Duke:    So really all of our decisions are predictions about what the best possible future is going to be, where we're investing limited resources. And that's the very definition of a bet. Now, you can see that at the base of this is our beliefs. Our beliefs determine what options we think are available to us, what we think the possible futures are. They determine what we think the probabilities of those futures are. They determine what we value, how we're trying to figure out which we think is the best option, right? I have beliefs about chicken and fish and the restaurant that I'm in, and so on and so forth, that drive the choice that I actually make. So we need to know that that belief portion is under control before we can really get to a space where we know that all of the rest of it, in terms of our ability to invest these resources properly, is going to be any good.

Annie Duke:    Okay, so this is where we really get into trouble with our beliefs. So here are some commonly-held beliefs that aren't true. You can tell me if you're surprised by any of these. Astronauts didn't eat astronaut ice cream. I was super sad about that. It was like my childhood dreams were dashed. It's too flaky; it would get into the mechanics. Also, let's be honest, it's disgusting. Abner Doubleday didn't invent baseball. I was really surprised by that. I mean, he not only didn't invent it, he had nothing to do with it. It's a complete myth.

Annie Duke:    And here's an interesting one, immigrants' names were not changed at Ellis Island. How many people are surprised by that one? How many people, before you walked into the room, really had a belief that they were changed? Yeah, so me too until very recently. So this is an interesting one. So what actually happened was that immigrants, the immigrants themselves, occasionally changed their names in order to make them more Americanized. That happened in the history of my family. And sometimes there were misspellings. But no immigration agent just looked at you and said, "Your last name's Smith now." That did not happen. And yet, this is something that we walk around and believe. And believe me, there are many, many things like this that you believe. Now, these might not be high stakes. But there are lots of high-stakes beliefs that you hold that are of this type, that you believe them but you haven't necessarily vetted them in a way that would uncover the truth.

Annie Duke:    To get that idea across to you, is there anybody in the room who believed something really strongly when they were 20, and you now no longer believe it? Because for me, it's all of the things that I believed when I was 20. All right. And some of those things were pretty high stakes. So we shouldn't think that this moment in time is any different than the moment when you were 20, in terms of the quality of our beliefs. I want to just sort of dig into this a little bit and figure out how we've all been walking around with this Ellis Island belief. Because it's a little bit weird. Because it's very clearly and provably not true.

Annie Duke:    So we can think about how do we form new beliefs. And so here's the intuitive order. We hear something, we vet the information, deciding whether it's true or not, and then we form a belief about it. Does that sound intuitively right? So someone tells you something, you read an article, you come across a piece of information. You put some thought into it, some time into it. You think about it in terms of the other things that you know to be true. And then you form a belief. Great. So here's what actually happens. You hear things, you believe them, and then if you happen to have the time or motivation, you'll get around to vetting them.

Annie Duke:    So there's lots and lots of scientific work that shows that this is true. One of the great pieces of scientific work is a 1990 study from Daniel Gilbert, who you might know from Stumbling on Happiness or otherwise the Prudential commercials. He's the Harvard scientist who puts the things up for the insurance company. But he did a really interesting study where he showed people statements of fact in a trial. And he had them marked in red and green. So green were the statements of fact that were true, and red were the statements of fact that were false. And he had them read it, but he distracted them after they read it. And when he distracted them, so that they didn't have a lot of time for this processing step, one would assume that they would be random: they would believe some false things as true and some true things as false. That's not actually what happened. They just believed everything was true.

Annie Duke:    So we know that underlying, we believe things. And there's a couple of really strong pressures on us, in terms of our biology, that kind of forces that on us. The first is that if we think about most of human history, we didn't actually have language. So the only way that I could form a belief was by seeing something with my own eyes. So if I saw a tree, there it was, there wasn't a lot of reason to question the tree. One would assume that I was rarely hallucinating. So there's a tree, I believe it, done. Then something happens. We develop language. And I can now tell you things in the abstract. I can tell you about things that you yourself haven't experienced. I can say, "Hey, I was in this other place that you've never been and there's a tree over there." So what happens? Well, evolution is like a jerry rigger. It doesn't take your brain out of your head and say, "Well, now I have to form a completely new belief system." It says, "Oh, I've got this really great belief system about how we believe things about trees, so I guess this one is like that." So now, when I tell you about a tree, you act as if you have experienced it yourself in terms of the way that you form those beliefs. And so you just believe it and then you don't really get around to vetting it.

Annie Duke:    The second pressure is that we default to believe because it's actually good for our survival. So if I'm standing on the savanna and I hear rustling in the leaves and I take a skeptical stance toward the rustling and I say, "Well, I really need to run a controlled experiment to try to figure out whether there's a lion back there," my genes are dead. I don't get to pass those genes on. But if I just run away, I do. So defaulting to believe actually has an advantage in terms of survival.
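
The savanna story is at bottom a claim about asymmetric costs, and a two-line expected-cost comparison makes the point precise. Every number here is invented purely for illustration:

```python
# "Rustling leaves" as an expected-cost comparison. The skeptic
# investigates and pays a catastrophic cost whenever it really is a
# lion; the believer always flees and pays a small fixed cost.
p_lion = 0.01          # assumed chance the rustling is a lion
cost_fleeing = 1       # small: a wasted sprint
cost_eaten = 10_000    # catastrophic: the genes don't get passed on

expected_cost_skeptic = p_lion * cost_eaten   # ~100
expected_cost_believer = cost_fleeing         # 1

print(expected_cost_believer < expected_cost_skeptic)  # True
```

As long as the catastrophic cost is large enough relative to the nuisance cost, defaulting to believe (and flee) wins, even when the belief is almost always false.
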

Annie Duke:    And the third thing is this. Human beings are super pathetic physically. We're not very fast. We're not very strong for our size. We don't have big claws. We don't have big fangs. But what we do have is big brains, and we can band together in groups in order to protect our territory and protect our genes. So we banded together in groups to compensate for our physical weaknesses. And let's think about it: if people in your group, when they spoke to you, if you did not automatically believe what they were saying, what would be the point? Human discourse would literally break down. So within that tribe, we default to believing the things we're told because it's good for the group. And that's good for the survival of our own genes. So we have all this pressure on us to believe things before we actually go through a vetting step.

Annie Duke:    Now, that wouldn't be so bad if we actually got around to the vetting step, which we don't do. So problem number one is that once we form these beliefs, when we do go out to sort of vet them in any way, we're very sensitive. We really notice stuff that confirms the belief that we have. So we notice all sorts of stuff that agrees with us. But then what goes along with that is that if we're presented with something that disagrees with that, we actually work really hard to discredit it. So the first is called confirmation bias. The second is disconfirmation bias. They're a pair of biases that go hand in hand in a very bad way.

Annie Duke:    So I'll tell you about my morning news routine. You can tell me if this sounds the same as yours. I'm going through and I see something, an article that agrees with my political opinions, and I'm like, "That's a great article. Yeah them." If I do bother to read something that disagrees with me, at the end of it I've written a dissertation about why they're wrong and what data they've missed and how they're so biased and why they literally don't know what they're talking about. Does that sound like anyone else's news routine? Because that's mine every morning. All right, so that's confirmation bias and disconfirmation bias working together. And notice, that makes it very hard for me to change the belief that I'm going into it with, right?

Annie Duke:    And then the third problem is that even when we find out that evidence is wrong,… Oh, we lost a little bit of this in the translation. That's okay. Even when we find out the evidence is wrong, our beliefs stay changed. So there are lots and lots of studies that show this, but here's one. I take half the room and I give you a scientific study that shows that firefighters are really risk seeking. You could see why that is. I mean, how'd they become firefighters in the first place, right? So I give you a bunch of data tables that show you that firefighters like risk. I take this half of the room and I give you the exact same data but I change the labeling on the data, and it shows that firefighters really are very risk averse in their real lives. You could see why that is. Because they run into burning buildings, maybe they're not taking so many chances elsewhere. All right, great. So I give you guys that evidence. Now I give you all a survey about firefighters and risk. You all do the rational thing and you say, "Yes, firefighters really like risk." You all do the rational thing and you say, "Yes, firefighters really don't like risk in their real life." And we're all good, right?
Annie Duke:    So now, after I do that, I come back to you and I say, "Oh, I made it up, by the way. I was just seeing how good you were at reading data. But it wasn't true. It wasn't a real scientific study that happened. I made it up." Okay, so now I've given you the corrective information. You guys all go away. And now you come back a week later and I give you the survey, and you all think that firefighters really like risk, and you all think that firefighters really hate risk. So once a belief is lodged, it's super sticky.

Annie Duke:    So here's the problem. This is really a pattern called motivated reasoning. And here's how we can think about it. The reason we assume the intuitive order, I hear something, I spend all this time thinking about it, and then I form a belief, is that we think information is in the driver's seat. We think that what happens is the information comes in and we think about it and we vet it and we do all of this stuff, and then that informs our beliefs in this very rational process. But what's actually true is that beliefs are in the driver's seat. What's actually true is that you believe certain things and your beliefs are woven into your identity. They are literally the thread that forms the cloth that is your identity. And you do not want that to tear. So once that belief is lodged… And it's not just political beliefs. It's beliefs about what you think is true of the world. In business, what your business strategies are. Just a tactic that you think is a really good tactic walking into a negotiation. Trust me, the thing that you believe is part of your identity.

Annie Duke:    And that now drives the way that you process information with the sole purpose of reinforcing the belief that you already have. This is called motivated reasoning. I would see it at the poker table all the time. People would have these strategies that were super clearly not good. And they would figure out a way to go around it. "The other guy got lucky, that's why I lost that hand." "Really? The last 20 times?" But this is how strong it is. Because it's an assault on their identity if they find out that the strategy that they used wasn't particularly good. And in order to protect it, they now are processing the feedback that's out at the table in a way that totally distorts their ability to learn.

Annie Duke:    And what's interesting about that is that this isn't good for the future version of you. Because I assume the future version of you would like your beliefs to be more accurate. Because if your beliefs are more accurate, it drives this whole process about what decisions you make in a more accurate way. But the problem is that there's an empathy gap between that future version of you, that you would like to be more accurate, and the present version of you who doesn't want to be wrong because it feels like an assault on your identity. And guess who wins in that battle every single time?

Annie Duke:    So this was the first thing that made me realize, "This is why poker players aren't winning," because of motivated reasoning. Because they're processing everything around them in a way that affirms their identity. And their identity is, "I know what I'm doing. My strategies are good. I have the right idea about this game. And anything that happens that disconfirms that, I'm going to swat away because I don't want my identity assaulted."

Annie Duke:    And so we can see this… Well, this is just Simon and Garfunkel I think says this pretty well, "Still a man hears what he wants to hear and disregards the rest." Okay, so you might be asking like, "Okay, this is great." You're saying, "Annie's told me about this." You guys are all smart people so you're cured. You're not going to do this anymore. No. So here's the problem. Surprisingly, being smart actually makes this worse. Why is that? Well, because smart people are really good at telling data stories. Right? We're really good at spinning a tale. And the thing is, none of this stuff is conscious. We all, our main job as communicators is PR for our own beliefs. We don't know we're doing it but that's what we're doing. Who do you want spinning for you? A person who's super smart or one who's not super smart? Well, I want the smart person to do the spin for me, right? So here's a couple of great pieces of evidence that show that being smart makes it worse.

Annie Duke:    Dan Kahan over at Yale did a wonderful study where he looked at how good people were at figuring out whether a particular skin cream cured psoriasis. He gave them some complicated data tables in order to try to figure that out. Nobody in the room had any strong beliefs about this skin cream, and so you were in a pretty neutral setting. So he scores everyone and divides them up. "You guys were really good at figuring out the relationship between skin cream and psoriasis. You guys were not so good at it." Now, he gives the exact same data tables back. But now he changes the labels to something that he knows people have a strong belief about, the relationship between gun control and crime. And now the question is for all of these people who were very, very good at reading these data tables when they had to parse the relationship between a skin cream and psoriasis, how are they doing once it's gun control and crime? And for half of you, the data table is going to agree with your views. It's going to come to a conclusion that you like. And for half of you, it's going to come to a conclusion you don't like. You fall apart. You read the data table as agreeing with you whether the data actually agrees with you or not.

Annie Duke:    And here's the bad news, it's worse for you than the people who weren't so good with it. You're better at looking at that data and finding a way to spin that toward your belief than people who weren't so good with the data in the first place. So the fact that you happen to be really good at that stuff does not protect you. It actually makes it worse for yourself. Why? Well, data is not truth. You need a human being to interpret the data. You need a human being to collect it. And smart human beings are just better at slicing and dicing data to tell their own story. They find the right data story.

Annie Duke:    And then here's the other problem, which is that smart people are more confident. They have more confidence in their intuitive response. So the people who kind of know they're not so good with data are maybe digging in a little bit more and asking a few questions. You guys, you're taking a glace, saying, "Oh, yeah, I know what this says," and you're not looking farther because you have a lot of confidence in your intuitive response to that. And so you're much less likely to question what your gut is telling you. But what is your gut telling you? Your gut is telling you that the beliefs that you already have are true. That's bad. And we need to be challenging that more, and we don't when we actually are more cognitively agile.
Annie Duke:    Okay, so that's been a lot of bad news. Let me try to get a little good news in here. So let's go back to this idea of betting, that any decision we make is a bet, to start to get a view into what part of the solution might be. So if I had… Someone who believed the Ellis Island thing, just anybody, raise your hand who believed. Okay, so over here. All right, so let me ask you a question. If I had said to you about this Ellis Island belief, "How much of your net worth do you want to bet on that?," what would the answer be?

Speaker 9:    10%.

Annie Duke:    10%. So it's not 100%, right? So that's good. So that's going to give a view into how certain you are of that belief, right? And so what I discover is that while you would've said, "Yes, that's true," it turns out that you only think it's 10% true. Right? So it sort of exposes the fact that maybe this isn't black or white, yes or no, but that your knowledge about this might not be complete. Now, when I say, "Do you want to bet?," tell me if any of these questions go through your head. Where and when did I form the belief?

Speaker 9:    When I looked at my grandfather's immigration papers and it went from Johannesen to Johnson.

Annie Duke:    Right. But did you ask him how that happened?

Speaker 9:    No.

Annie Duke:    No. Exactly. So now you're going to be like, "Oh, wait a minute, yeah." So where and when did I form the belief? What evidence do I have for it? How reliable is that evidence? Here's one, what does the person challenging me to the bet know that I don't know? Right? I'm not a random person and I just challenged you to the bet. Maybe I know something. What are the reasons that the belief might not be true? That's an interesting one, right? So we can think about this, and basically… Remember, I talked about that empathy gap between the future version of you and the present you. And part of that empathy gap is that we don't feel the consequences to future us. It doesn't feel like anything really bad is going to happen to us. When you're trying to eat healthy and you grab that humongous muffin in the morning, it's because the person who's going to experience the consequences of that muffin eating is way in the future, and we don't have a lot of empathy for that person. But by saying to you, "Do you want to bet?," I'm reminding you that your beliefs have consequences. I'm giving you some stakes to be thinking about that cause you to run through these questions. Okay?

Annie Duke:    And what we know, if we go back, is that one of the big problems, one…
