Episode 006:

Numbers and Numbing

with Dr. Paul Slovic

March 24th, 2021


Episode description

Can we count on our feelings to guide our decisions? Dr. Paul Slovic, Professor of Psychology at the University of Oregon, joins your host, Dr. Joe Sweeney, Executive Director of the Alliance for Decision Education, to discuss the three pillars of the arithmetic of compassion, how our feelings change as we consider one vs. dozens of people in danger, and how flawed thinking could have impacted decisions about the use of nuclear weapons during the Cold War. You’ll also learn about the “warm glow of satisfaction” we get from helping people, why it shines brighter in some situations than others, and why, in certain contexts, we might not help someone in need, even if we can.

To listen to more episodes, visit The Decision Education Podcast homepage.

 

Dr. Paul Slovic received his B.A. degree from Stanford University, and his M.A. and Ph.D. degrees in psychology from the University of Michigan. In 1976, Dr. Slovic founded the research institute Decision Research with Sarah Lichtenstein and Baruch Fischhoff, where he currently serves as President. He has been a professor of psychology at the University of Oregon since 1986. He and his colleagues worldwide have developed methods to describe risk perceptions and measure their impacts on individuals, industry, and society. His most recent work examines “psychic numbing” and the failure to respond to mass human tragedies. He publishes extensively and serves as a consultant to industry and government.

Dr. Slovic is a past President of the Society for Risk Analysis and in 1991 received its Distinguished Contribution Award. In 1993 he received the Distinguished Scientific Contribution Award from the American Psychological Association. In 1995 he received the Outstanding Contribution to Science Award from the Oregon Academy of Science. He has received honorary doctorates from the Stockholm School of Economics (1996) and the University of East Anglia (2005). Dr. Slovic was elected to the American Academy of Arts and Sciences in 2015 and the National Academy of Sciences in 2016.

Joe: I’m excited to welcome our guest today, Dr. Paul Slovic. Paul is Professor of Psychology at the University of Oregon. He is the co-founder and current president of Decision Research, a trailblazing research institute working at the cutting edge of the decision sciences. Paul’s research focuses on human judgment, decision-making, and risk analysis, including work examining psychic numbing and the failure to respond to mass human tragedies. In 2016, Paul was elected to the National Academy of Sciences, one of the highest honors a scientist can receive. Paul is also on the Advisory Council here at the Alliance for Decision Education. I met Paul while visiting the Decision Research Institute as a guest of his colleague and a leader in the field of decision education, Dr. Robin Gregory. Welcome, Paul, and thank you for talking with us today.

Paul: Thank you, Joe. Glad to be talking with you.

Joe: I was wondering if you could just tell us in your own words a little about what you do and your path to getting here?

Paul: Well, what do I do? I shuffle a lot of paper. I know that. I’ve been studying the psychology of risk and decision-making since 1959, believe it or not, so that’s more than 60 years. I was lured into it by a professor I was given a job working for; Clyde Coombs was his name. He was studying choices among gambles, and I got hooked on that. I thought this was a fun thing for a psychologist to be studying. And 60 years later, I still study people’s decisions about gambles. They are no longer simple, two-outcome abstract gambles in a laboratory; they’re now the gambles that we take in everyday life, all around us, the familiar things that create risk in our lives: health, safety, environment, government. Decisions everywhere that involve risk.

Joe: So one of the treats that I get as the host is to read ahead all the various things that our guests have written and yours was just a blast. I mean, there’s so much that you’ve thought about and written about. I thought what we could do is we could talk about some of the research topics and then we could talk about some of the applications that you could point us at as far as ways that we might utilize what you’ve discovered or have co-discovered to improve our own decision making for ourselves, our organizations, and our society. The first one that I wondered if you could talk about – I don’t think most people will have heard of before – is the arithmetic of compassion. What are you talking about there? What is it?

Paul: Right. So I got interested in the problem of genocide in the world in the 1990s. I was a child during WWII when the Holocaust was taking place, and we didn’t know much about the Holocaust until probably a decade or so after the war. In the 1950s, when the stories started to come out in graphic ways, the world was shocked. As a teenager then, it made a big impression on me, and then I didn’t think about it for a long time. And then I worked with Daniel Kahneman and Amos Tversky, who, among many other important contributions, developed something called prospect theory. And the centerpiece of prospect theory is something that they call the value function. How do we value things like amounts of money or numbers of lives at risk as those quantities increase? And what they showed was that this function isn’t a straight line. It’s kind of a curved thing. The biggest effect on our valuation is when we go from zero to one, from no lives at risk to one life. One life at risk is very important to us, and people will risk their own lives to save a single person nearby who is in danger, so that life is very valuable. But then if there are two lives at risk, the value function is not straight. It’s not linear. It starts to curve over, and two isn’t twice as big as one. As the numbers increase, it gets flatter and flatter.
It wasn’t until afterwards that I figured out, well, why is that? And the problem is with our feelings. What Daniel Kahneman has brilliantly described in his book, Thinking, Fast and Slow, is that most of the time we rely on our feelings, what he calls system 1, intuitive feelings, to think about the world, as opposed to making scientific calculations, which we’re capable of doing, but which is very difficult and hard work.

And we usually don’t do that. We take the easy way out and go with our feelings because it feels right to do that. It’s easy. It’s natural. And it usually works for us, until it doesn’t. So one of the problems with feelings is that our feelings can’t count. They’re innumerate. And here’s where we get back to the value function. Kahneman and Tversky showed a beautiful picture of this function, which starts off very steep and then starts to flatten out. But they didn’t really talk in detail about why that was. So that’s what I got interested in, and I started to link it to the fact that this happens when we rely on our feelings. This curved function is the way our feelings change when we’re thinking about greater and greater quantities of money or greater and greater numbers of lives.


“One of the problems with feelings is that our feelings can’t count; they’re innumerate.”


I can illustrate that very simply. The difference between zero lives at risk and one is huge, as I mentioned earlier. And having two people at risk doesn’t feel twice as concerning as one. We’re already pretty concerned with one. With two, maybe we’re more concerned, but not twice as much. And then let’s suppose that I say, okay, now there are 87 people at risk. And then I say, oh, wait a minute, I made a mistake. There are 88 people. It’s not 87, it’s 88. You won’t feel any different thinking about 88 lives at risk than 87, even though there’s an additional valuable life there. It shows the inability of the feeling system to differentiate quantity as the numbers increase. The same thing happens with money. If you find a hundred dollars on the street, it will make you happy. If you find 200, you won’t feel twice as happy.
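Paul’s arithmetic here can be made concrete with a few lines of code. The sketch below assumes a simple power-law value function with an arbitrary exponent of 0.5; it illustrates diminishing sensitivity in the spirit of prospect theory, but it is not the specific function estimated by Kahneman and Tversky or by Slovic’s group.

```python
# A minimal sketch of diminishing sensitivity ("psychic numbing"), assuming a
# power-law value function v(n) = n ** alpha with alpha < 1.
# The exponent 0.5 is an arbitrary choice for illustration, not a fitted value.

def felt_value(n: float, alpha: float = 0.5) -> float:
    """Toy 'feeling' response to n lives at risk (concave in n)."""
    return n ** alpha

# Going from 0 to 1 life at risk produces a large change in feeling...
print(felt_value(1) - felt_value(0))      # 1.0

# ...while going from 87 to 88 lives barely registers.
print(felt_value(88) - felt_value(87))    # ~0.053

# Doubling a windfall does not double the good feeling either.
print(felt_value(200) / felt_value(100))  # ~1.41, not 2
```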

Joe: I was just going to ask if it goes for the positive as well as the risk, and is it the same thing?

Paul: Positive and negative. And actually this way of reacting is more general than just valuing things like money and lives. It’s also the way our sensory systems respond to changes, like our visual system responding to changes in the brightness of a light, or our auditory system responding to changes in sound energy, the loudness of a sound. In a quiet room, you can hear a whisper. You go from no sound to a quiet sound, and you can hear that whisper. In a loud room, you’re not going to hear that whisper. It takes a lot more change in the volume of the sound before you’ll notice it in a loud room. It’s called the just noticeable difference. Psychologists were studying this in Germany in the 1880s; it was called the field of psychophysics. The early studies of psychophysics had to do with sensory changes, but what Kahneman and Tversky showed was that psychophysics also works on social values, not just sensory ones; social values follow that same kind of function.
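For readers who want the standard formalization of the psychophysics Paul mentions, Weber’s law and Fechner’s law are the textbook statements of the just noticeable difference and the resulting logarithmic response. The notation below is generic, not taken from this conversation.

```latex
% Weber's law: the just noticeable difference \Delta I is a constant
% fraction k of the baseline intensity I.
\frac{\Delta I}{I} = k

% Fechner's law: perceived magnitude S grows logarithmically with
% intensity I above the detection threshold I_0.
S = k \, \ln\!\left( \frac{I}{I_0} \right)
```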

Joe: I remember reading Don Hoffman’s book, Visual Intelligence. I don’t know if you came across it. Wonderful descriptions of our brain as a difference engine and how we find the edges of surfaces and shapes. And what you’re saying about our perception system being tied to this makes so much sense to me. I was wondering, as I was reading your work, why this would be happening, but you’re grounding it in the very way that our brain responds to differences and whether we can notice them or not.

Paul: Yes, it’s absolutely built into our brains. And it’s interesting to think, well, through evolution, why would we evolve to have this kind of response, where we become insensitive to the large? That doesn’t seem adaptive. Well, think about a system like the mechanisms in your ear that transmit sound. If you want to design a system that is sensitive to the very low, quiet sounds that you may need to hear in order to survive, you have these mechanisms that make that happen. Now suppose you wanted that same system to be equally sensitive to the large; you couldn’t do it physically.

Joe: Right, you’d melt the brain if you sent that kind of electrical signal.

Paul: [You’d] blow things up. And the same thing with the visual system. So nature decided it’s better to make us sensitive to the small than to the large, and built mechanisms into our eyes and ears that enable us to react to small things. And probably the same thing happened with valuing lives. When all of this was first evolving, the world was smaller. Our world was right in front of us: protect yourself, and the few people around you. You didn’t have to think about protecting thousands or millions of people on the other side of the earth; that situation just didn’t exist.

Joe: It’s interesting, you’re reminding me of something Barbara Mellers was telling me about. If she could say three things that we should know about decision-making: one, there are better ways, or good ways, to make a decision; two, there are these predictable ways we go awry; and three, try to construct contexts for yourself where you’re more likely to go in the first direction rather than the second. And I’m listening to what you’re saying and thinking about that, and wondering: if our biology is telling us that the 88th life is worth more, but not as much as a whole other person, what is it that’s driving you the other way? What did you learn about ethics, or about some other aspect of social science, that makes you say, no, we’ve got to slow down here when we make our determination about what policies to pursue with regard to health care or anything else, and we’ve got to treat that 88th life as another unique life? We can’t just let our intuition and our system 1 thinking drive the decision here. What are you grounding it in, if not our biology and our systems?

Paul: I think it’s because over a long period of time, our brain evolved and became capable of abstract and conceptual thinking, ethical thinking. We learned how to subtract 87 from 88 and see that there’s one additional life, and we can appreciate that there’s no reason to devalue that life just because other people are at risk. The question is, if a life is valuable, why does it become less valuable when there are other people who are also at risk? We can do that kind of thinking, which Kahneman called slow thinking, and it’s linked to all kinds of ethics and morality. It has become the basis of fields like risk assessment and risk analysis, technical fields where people do this slow, quantitative thinking to assess values and to guide decisions. So we can do that. The problem is we tend not to, because we think we can do the job as well with our gut and it’s a lot easier. And so we go with our gut and, as I say, most of the time that may work for us and other times it really causes problems. And that’s what those of us who study this are trying to do. We’re trying to figure out when we can trust our gut feelings and when we need to slow down and do careful analysis, or listen to others who are doing careful analysis. That’s the key factor.


“We think we can do the job as well with our gut and it’s a lot easier. And so we go with our gut and, as I say, most of the time that may work for us and other times it really causes problems.”


Joe: Yeah. I remember reading about this and listening to people talk about it and intellectually agreeing with it, or assenting to it, and then being very disappointed in myself that the feelings didn’t update right away to the new understanding. Like, okay, I’m going to use calculations. I’m going to evaluate risk. I’m going to think about expected value and utility. I began learning about those things and found that I could do it if I did what you suggest, which is pause and deploy that system 2 thinking, but it didn’t immediately change how I felt. And I don’t know if you’ve talked with any decision makers over the years about that, or through your research looked into whether it eventually shifts over. Does the feeling start to shift, or is this just an ongoing self-management problem?

Paul: Well, I think some people are trained to think analytically. I think economists are more analytical than people in other disciplines; this is the way they think. Or people in law, lawyers, judges; their world is that of argument. Maybe not quantitative calculations, but they use logic, reason, arguments, and that’s the way they think. So I think we can train ourselves to be more careful and more analytic, but it’s difficult. And in some cases we may not adequately achieve the right balance between analysis and feeling. In that case, I think you have to say, well, okay, if we’re thinking in ways that we find hard to justify, maybe we have to change the situation. Maybe we have to restructure the decision setting in ways that recognize that we need to be prodded or pushed, or, in today’s world we’d use the word nudged, to do the right thing. Or maybe certain types of judgments and decisions need to be taken out of our direct control. We may need to do the calculations carefully and then build those calculations into the system, taking the decision away from our gut feelings, because we can’t trust those feelings. Or maybe we should turn them over to machines. People always say, well, machines are faster than we are, they have better memories, they’re stronger than humans, but they can never replace the human sense of morality, for example. And I say, no, wait a minute, because what we find in this trusting of the feelings is that our feelings are often not moral. I think it’s not moral to devalue life just because other people are at risk.

I think that’s a mistake. And if you have a decision-making system that’s controlled by machines, like maybe some form of automated transportation system, and you want to put value into it, you wouldn’t put a value function into the machine in a way that devalues lives when there are more at risk; you would make it linear. You would have to make a decision, and one decision might be: if you believe every human life is intrinsically of equal value, then you want this machine to react proportionately more strongly as the number of lives increases. You want to build in what we’d call a straight line; it’s just adding or multiplying. Again, this is a very long answer to your question about the arithmetic of compassion.

Joe: It was. And I wonder if we drifted over. Did we drift into psychic numbing, which is a whole other area of yours? Or would you treat that distinctly?

Paul: No, numbing is part of the arithmetic of compassion. Compassion is a feeling, and as I said, our feelings are innumerate. They do arithmetic, but they do it wrong. One plus one does not equal two; it’s something less than two, and in some cases it’s less than one. So one thing that was not built into the value function that Kahneman and Tversky put forth, which keeps increasing but gets flatter and flatter, is what I would call the collapse of compassion. Not only do we become insensitive, in that we don’t see the difference between 88 lives and 87; we actually feel less concerned. We have less feeling for the larger numbers than for the smaller numbers. And when the numbers get really big, they’re just numbers. You know, we say that statistics are human beings with the tears dried off. There are no tears to them. These are emotionless numbers that don’t register. We don’t appreciate the reality that these numbers represent, because a main element of meaning for information is the feelings that that information creates in you. And these big numbers are creating no feelings; they’re just numbers. So we lack an understanding of what those numbers mean.

Joe: You also talk about the false sense of powerlessness that we can get with very large numbers or with statistics. So one child starving, we feel like we can do something. I remember you talking about an experiment or an evaluation there. Can you say some more about that?

Paul: There are three pillars to the arithmetic of compassion. The first is psychic numbing, this insensitivity as the numbers increase, which is again a peculiar kind of arithmetic: the more who die, the less we care. That’s numbing, and I think it’s irrational. The second thing we learned, in studies of when people help others and when they don’t, is that we help others first because they need our help and second because we feel good about helping them. Economists call this the warm glow of satisfaction that we get when we do something good for other people. It’s a motivator. And what we learned through our experiments is that if you have the ability to help someone, but your attention is drawn to others that unfortunately you cannot help, you’re less likely to help that person, because it doesn’t feel as good. In other words, you still have the same ability to help them, but it doesn’t feel as good to help them, because the bad feelings that you get when you think about those you’re not helping enter your brain and mingle with the good feeling that you have about helping and devalue it. It’s as if the brain inappropriately averages in irrelevant information that carries counter feelings. So it doesn’t feel as good to help this person anymore, and you’re less likely to help them, even though you still can. This is, again, a form of arithmetic where irrelevant numbers from outside come in and devalue the feelings for the relevant numbers. There is an arithmetic aspect to that, and that’s why we keep it as part of the deadly arithmetic of compassion. So there’s numbing, and there’s a false sense of inefficacy, to which we give a kind of jargony name, pseudoinefficacy. It makes you feel that you’re not effective when you really are. It’s a weird word, but anyway, numbing and pseudoinefficacy are problems with our feelings.
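One way to see the pseudoinefficacy idea is as a toy averaging calculation. The sketch below is a hypothetical illustration of the “inappropriate averaging” Paul describes, with made-up numbers; it is not the model used in the actual experiments.

```python
# A toy illustration of pseudoinefficacy: the warm glow of helping one person
# is dragged down when feelings about those we cannot help are averaged in.
# The numbers and the simple-averaging rule are hypothetical, for illustration
# only; they are not taken from the published studies.

def felt_glow(helped_affect: float, unhelped_affects: list[float]) -> float:
    """Average the positive feeling for the person helped with the negative
    feelings for those out of reach (the 'inappropriate averaging')."""
    feelings = [helped_affect] + unhelped_affects
    return sum(feelings) / len(feelings)

# Helping one child in isolation feels good.
print(felt_glow(+1.0, []))            # 1.0

# The same act, with attention drawn to two children we cannot help, feels
# much less rewarding, even though our ability to help is unchanged.
print(felt_glow(+1.0, [-0.5, -0.5]))  # 0.0
```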


“What we learned through our experiments is that if you have the ability to help someone, but your attention is drawn to others that unfortunately you cannot help, you’re less likely to help that person, because it doesn’t feel as good.”


The third pillar of the arithmetic of compassion is something we call the prominence effect, which results from slow thinking. The numbing and the pseudoinefficacy happen when we think quickly, with our feelings, because if you’re thinking slowly, then you know you shouldn’t refuse to help this person just because you can’t help others. Slow thinking hopefully would not succumb to that, but fast thinking does. The third pillar, the prominence effect, is a slow-thinking problem. In the first study I called it the more important dimension effect, and then we changed that to the prominence effect: some attributes of a decision are more prominent, which means it’s hard to make tradeoffs; they dominate every tradeoff. A few years later, when I started to think about genocide, I realized this was an explanation for why we turn a blind eye to genocide over and over and over again. After the Holocaust, we swore that this was so horrific that, you hear the phrase, never again would we allow this to happen. And if you look a half century later, you see that over and over again, dozens of times, there was nothing like the Holocaust, which is a unique event, but there were genocides, and other mass atrocities happened all the time. As we speak, they’re happening in multiple places around the world, and we decry these. We say this is horrible, this is wrong. We say that we value the lives that are being abused, but we don’t do anything. We don’t act on our values. And so I’ve been arguing that one of the key reasons we don’t act in a way that’s consistent with our values for human life is that action typically carries some cost that threatens our own security.

It is dangerous and costly to go into a sovereign nation that’s murdering its citizens for political reasons, and when it comes down to finally making that choice, our own security is prominent over the lives of other people, and we’re probably numb to those lives anyway. I mean, we know that there are a lot of people being killed, but they are statistics, so that again brings in the psychic numbing. But basically, when we have to choose, we go with our security and we fail to act. We turn a blind eye to it and protect ourselves. We don’t want to risk our military. We don’t want to spend a lot of money. We don’t want to anger allies that we’re trying to work with politically who may support the regime that’s killing its citizens.

All of these things are security-related costs, which are prominent. And it’s not that we’re averaging these things out on a balance scale. It’s not a balance scale. It’s almost an all-or-none kind of thing: if we go for security, it doesn’t matter how many lives are lost because of it, whether it’s a thousand, a million, or a hundred million. So that’s the third pillar of the flawed arithmetic of compassion: it devalues lives when protecting those lives conflicts with our own security.

Joe: Okay. So I think I’ve got it now. There’s psychic numbing, there’s pseudoinefficacy, and there’s the prominence effect. And, is it a little too strong for me to say this, but it sounds like what you’re saying is that these things lead us astray to the point that they make us pursue unethical policies or decisions, that we actually behave in a way that we know is unethical.

Paul: Yes. But we probably don’t feel that we are doing something unethical when we’re doing it. And that relates to another concept. Let me just take another step further, beyond genocide. A couple of years ago, I was asked to take a look at nuclear weapons, and I started to think about war and go back and read up on the history.

Daniel Ellsberg, famous from the Pentagon Papers, was a young analyst in the 1960s. He was trained in operations research and decision-making at Harvard, and he got a job with the RAND Corporation. He was privy to sitting in the room when the people who were doing nuclear weapons strategy were planning what to do with these nuclear weapons after the end of World War II, as we were building up our arsenal. And he saw the plans. Now the Germans and the Japanese were no longer our enemies, but the Soviet Union was the evil empire, and we were worried that they wanted to control the world. We had nuclear weapons aimed at cities in the Soviet Union and China, at 600 million people. He saw the numbers. He saw that those in control of nuclear weapons had targeted hundreds of millions of people for death, and again, I thought, okay, to protect our security. Again, this is security prominence; it’s psychic numbing. What is 600 million? And not only that, it could be more than that, because it didn’t take into account nuclear winter, or the fact that starting a nuclear war might end civilization. So all of this is an insensitivity to quantity, a psychic numbing, for the sake of security, and it is even more evident in warfare than it was in genocide and other humanitarian abuses.

Joe: So that’s both horrifying and just sort of stunning. And I appreciate the work you did there. I wonder if we could bring it forward a little bit more to the current situation with the pandemic and ask, does this framework have any lessons for us about what we should do to structure decision-making for ourselves, our organizations, schools, businesses, or our society, with regard to COVID, and how we should think about it or how we should feel about it, and whether those two things are even the same?

Paul: Absolutely, you see the same phenomena, the same cognitive limitations and flawed arithmetic of compassion, at work with COVID as in these other contexts, along with a few other cognitive issues that hopefully could be addressed with education and awareness. Clearly, as the number of lives, the number of cases, the number of deaths increases, we see again that we have become numb to these statistics, except when we hit a new level, like 200,000. There are some numbers that are special, and we pay attention to those numbers: a hundred thousand, now it’s 200,000, now it’s 300,000, and it’s closing in on 400,000. When it hits 400,000, there’ll be a little extra attention given to this, but we’re not going to feel any different. We don’t feel any different from 300,000 to 400,000; it doesn’t seem to make that much difference. So there’s a form of numbing. Gut-feeling-based minds also don’t understand exponential growth. We vastly underestimate where it’s leading and how fast it will be before we’re overwhelmed by it. We need to do the math, the slow thinking. We know how to do these projections mathematically, and our experts can do that. We need to listen to our experts and not to our own gut feelings here. So that’s a very important lesson. When we’re facing cognitive biases of all kinds, we need to pay attention to people who have thought this out slowly and carefully and listen to their advice.
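Paul’s point about exponential growth is easy to check with a little slow thinking. The sketch below uses a hypothetical outbreak that doubles every week; the starting count and doubling time are made up for illustration, not epidemiological estimates.

```python
# A minimal sketch of why exponential growth defeats gut feeling, assuming a
# hypothetical outbreak that doubles every week (illustrative only; not an
# epidemiological model or a real COVID projection).

cases = 100           # starting case count (hypothetical)
doubling_weeks = 1    # doubling time in weeks (hypothetical)

for week in range(0, 13, 4):
    projected = cases * 2 ** (week / doubling_weeks)
    print(f"week {week:2d}: about {projected:,.0f} cases")

# week  0: about 100 cases
# week  4: about 1,600 cases
# week  8: about 25,600 cases
# week 12: about 409,600 cases
```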

Joe: I hadn’t thought about that before, that they’re doing the slow thinking for us. It’s like outsourcing our system 2 if we trust experts.

Paul: Yes.

Joe: I have some questions that are a little more tied to the work that we’re doing with the Alliance. So for example, I’m wondering with the things that you know now about the arithmetic of compassion and decision-making in general, what are some of the key decision skills that you’d like to see young people learning in school?

Paul: I think that at an early age, we need to impress on youth the importance of trying to think carefully about problems, not to just go with their first, fast reaction to something. Think carefully, listen to others, respect others, respect other viewpoints. I think we should teach kids to kind of respect the importance of decision-making as a thoughtful enterprise.

It’s amazing how even at the top level of government, in the most important decisions that our government officials or military can make, like whether or not to use a nuclear weapon against an enemy, this decision is given to the president of the United States, who can act with sole authority on it, and we don’t give the president any training in decision-making about this. This is the most important decision that a human being could ever make, and we don’t sit them down and try to help them understand how to think about this decision, how to structure it, how to make sure you’ve got the best set of alternatives to choose among. That’s the first thing. Think about what your objectives are in this decision, what the values are that these objectives are supposed to be fulfilling, and so forth. And then think about how you put this together in a careful decision on which the fate of civilization could depend. We don’t do that. There’s no respect. It’s like, okay, you’re the new president. Here’s how many nuclear weapons we have, it’s thousands, you’re in charge, we’ll give you intelligence about what’s going on, the usual, and good luck.


“I think we should teach kids to kind of respect the importance of decision-making as a thoughtful enterprise.”


Joe: I agree with you. I think it’s very strange that we haven’t thought about some of these very important decisions and how to prepare people to make them in specific ways. I’m wondering, let’s pretend it works. I plan that it will, so assume, I should say, not pretend. Assume that we’ve succeeded and Decision Education really is something that every young person is empowered with, that every young person is taught the skills and dispositions of. What do you think would look different in our society when we succeed in this mission?

Paul: I think that we would have a society that is more harmonious, where there’s more equality. Because we would realize that to have a society where we have increasing levels of inequity is a recipe for disaster. Not only does it lead to a lot of people who are very unsuccessful, unhappy, miserable, unhealthy, but it also will lead to violent conflict. We would reduce that kind of inequality that leads to divisiveness and violence.

I think that people would be healthier; they would make better choices and decisions affecting their health. They would make better decisions about how to manage their finances, their money. I think they would make better decisions relevant to their safety. And they would also demand that the authorities in charge of these various components of our lives, health and environment and security, make better decisions. So I think it would basically mean better government, which then would affect us in all kinds of better ways. That’s how central decision-making is to the wellbeing of our society.

Joe: So those were the easy questions. Now I’ve got two last questions that are much more challenging. If you only got to pass down one tool or skill or lesson to the next generation of decision makers, what would it be?

Paul: I would say it’s critical thinking. My sense of it, or my definition, is that it’s the tool that Decision Education is about: learning to think critically.

Joe: Okay. So that was the first hard one. The second one is – this podcast is actually for adults who are interested in this whole area who didn’t get the benefit of this in school. What’s a book that you would recommend for our listeners who are keen to improve their own decision-making?

Paul: Again, this is a tough one, but I would say Daniel Kahneman’s Thinking, Fast and Slow. The book has sold 15 to 20 million copies, so people are buying it, and Kahneman has said he thinks fewer than 20% have read more than a fraction of the book. It’s a book that you can skim; you can go back and forth. But it has been bought and read and extolled as valuable by people in all walks of life, all kinds of disciplines. It’s been amazing how people have picked it up. It was published in 2011, so it’s now almost 10 years out, and even today new people are discovering it and raving about it, so there must be something good in it. And it’s all about what we’re talking about, the ways in which fast and slow thinking either help us or harm us. Sure, maybe you can’t understand everything in it, and it’s still not a simple read, but it was written for a broader audience and it’s got a lot of valuable insights in it.

Joe: So we have a little internal bet about whether any guest on the podcast, when we ask that question, is automatically going to go with that book. I agree. It’s a fantastic read. And it’s one that you can take time to go back into and steep yourself in the ideas. They just keep on improving the way that you think and make decisions.

Paul: And my own book, The Feeling of Risk: if you’re interested in understanding how the mind works when we’re facing risk, which we often are, then that book is a good introduction.

Joe: Does the author know what he’s talking about?

Paul: I’m a little biased.

Joe: All right. Fair enough.

Paul: Well, the other thing that would be great, and maybe you have linked to it, I think we’ve linked to you, is a link to the Arithmetic of Compassion website.

Joe: Yes. Right. I was going to ask, actually, if listeners want to learn more about your work, as opposed to decision-making in general, can they follow you on social media? Should they go to a particular website? Where would you like them to go first?

Paul: I would go to the Arithmetic of Compassion website or to DecisionResearch.org and look me up there.

Joe: And I’ll put in a little plug for our audience: I got a chance to hear you talking particularly about nuclear decision-making, and it was fascinating. Paul, thank you so much for coming on our podcast. Really appreciate it. And I feel like I learned a lot. I know it’s an interesting topic that most of us never heard about in school, and it clearly is important to how we make decisions as a society. So thank you very much.

Paul: Thanks. Thank you, Joe. Thanks for what you’re doing with the Alliance for improving our decision-making. Very important.

Joe: Well, thanks for helping.
