Transcript
[00:00:00]: David: Welcome to Outrage Overload. A science podcast about outrage and lowering the temperature. This is episode 19.
****: Scams and scamming are rampant
[00:01:00]: News Clip: this year. Scams are at an all time high and experts say they’re more sophisticated than ever. Tonight we have a consumer 10 alert for you. The Better Business Bureau looked at scams over the past seven years, and they say online scams have risen by almost 90%. Lin Fa Chan lost $200,000 after falling for a cryptocurrency scam.
****: He says the scammers impersonated traders and were convincing. And he’s not alone. He’s one of many who fell for a scam online. As it turns out, these scams are everywhere, everywhere. And
****: we thought
****: it wasn’t bad enough with those pesky robocalls. And now you’ve got scams, from COVID treatment frauds to COVID cure scams.
****: Americans are getting duped and losing big. You know, it used to be maybe you’d get the email from the Nigerian prince. These days they’re a lot trickier, Carly, and a lot of these folks are going on apps that younger people are using.
****: David: Some of the same psychological factors that make us susceptible to scams also make us susceptible to outrage porn.
****: Chris Chabris:
[00:02:00]: I think you’re absolutely right. I hope we can talk about how, you know, provoking outrage is sort of not that different, I think, from trying to scam people, as you said, or trying to con them, or something like that. I think there’s actually a lot in common that’s not obvious at first thought, and
****: David: that’s what we’re gonna talk about on this episode of the Outrage Overload Podcast.
****: I’m your host, David Beckemeyer, and today we’re talking about scams, why we fall for them, and what we can do about it. You might remember the internet sensation that was the invisible gorilla, a captivating experiment that became a viral meme and forever changed the way we see our own perceptions.
****: Dan Simons: I’m Dan Simons.
****: I’m a professor in the psychology department at the University of Illinois, and I also have an appointment at the Beckman Institute for Advanced Science and Technology. Uh, my research here focuses on visual cognition, what we see, what we don’t see, how much of our visual world we’re aware of, how much we remember from one moment to the next.
[00:03:00]: One of my best known studies was done with my collaborator Chris Chabris some years ago. And what we had was a video in which people were passing basketballs. Three people were wearing white shirts and they were passing a ball, and your task when you were watching the video was just to count how many times those three players passed the ball.
****: We also had three players wearing black shirts, passing their own ball, and you were supposed to ignore their passes. As you’re doing this task, after about 30 seconds or so, we’d have a person wearing a full-body gorilla suit walk into the middle of the scene, stop in the center, turn and face the camera, thump its chest, and then walk off the other side, a total of about nine seconds later.
****: And what we find is that about half the people who do this simply don’t notice the gorilla. It’s shocking that you could possibly miss something as obvious as a gorilla. This is the intuition we have: that if something important or distinctive or unusual, like a person in a gorilla suit, walks into our field of view, we’ll automatically notice it.
[00:04:00]: The reality is that only about half the people do, while about 90% think they will. Looking isn’t the same as seeing. We have to focus attention on something in order to become aware of it. It’s largely just a matter of a flip of the coin. We know when we’ve noticed something unexpected, but we’re not aware of the times when we’ve missed something unexpected.
****: The failure to notice people in gorilla suits is really a natural byproduct of something that we do quite well, and that’s very important to us, which is focusing our attention. We need to be able to filter out the distractions from our world and not let them interfere with our ability to do the task we’re trying to do.
****: The key is that when you’re focusing your attention on one aspect of your world, you don’t have an unlimited amount of attention to devote to other things. And we only see those things that we focus our attention on. The problem is that on occasion, we filter something that we might want to notice and we don’t realize that we’re doing that.
****: That sort of mismatch between what we see and what we think we see is a really profound one that has all sorts of consequences for our daily lives.
[00:05:00]: David: Now the two leading psychologists who brought us the invisible gorilla have a brand new book: Nobody’s Fool: Why We Get Taken In and What We Can Do About It.
****: Christopher Chabris is a professor at Geisinger, a healthcare system in Pennsylvania, where he co-directs the Behavioral Insights Team. He previously taught at Union College and Harvard University, and he’s a fellow of the Association for Psychological Science. Chris received his PhD in psychology and an AB in computer science from Harvard.
[00:06:00]: His research focuses on attention, decision making, intelligence, and behavior genetics. His work has been published in leading journals including Science, Nature, PNAS, and Perception. In addition to being a co-author of the bestseller The Invisible Gorilla, Chris is a chess master, poker amateur, and games enthusiast.
****: So grab your headphones, buckle up, and join us to learn how to be nobody’s fool, with Christopher Chabris.
****: Chris, thank you very much for, uh, coming on the show.
****: Chris Chabris: Thanks for having me.
****: David: So it’s a great book. I really enjoyed it, and I enjoyed your previous book as well. I might ask a few questions that relate back a little bit to the previous book too, but we’re definitely not going to be able to cover the whole thing on this show, and all the things, which is good.
[00:07:00]: Anyway, people should go read the book. Get the book and read the book to get the full thing. So we’re going to pick out a few interesting things to grab onto and talk about that apply back to this show. This show, as everybody knows, is about outrage in media, outrage in society, outrage in politics, and kind of lowering the temperature.
****: You know, there’s a tie-in here, I think, with disinformation and misinformation. Outrage itself can sort of be thought of as a scam in many cases. So that’s kind of a piece of the tie-in. But I do appreciate you having us get a chance to look at this book with you.
****: Chris Chabris: Great.
Yeah, I think you’re absolutely right. I hope we can talk about how, you know, provoking outrage is sort of not that different, I think, from trying to scam people, as you said, or trying to con them, or something like that. I think there’s actually a lot in common that’s not obvious at first thought.
[00:08:00]: David: Right. You know, and I didn’t really set out to do a podcast about disinformation. And of course it’s not a hundred percent about that, but it’s a topic that that comes up a lot. ’cause again, you have that overlap.
****: Chris Chabris: Yeah.
****: David: So, you know, starting at the beginning, this sort of “what’s missing” idea kind of comes up right away, with this idea of focus and asking what’s missing.
****: The book talks about habits, and then from habits to hooks, and we’re probably not going to be able to talk about them all, but as an overview, that’s kind of what it does. And it starts off with one of the habits being, excuse me, focus, and thinking about what’s missing.
****: And this to me captures a little bit of something that was always in the back of my mind as I was reading through the whole book, which is finding that balance between when to dig deeper and when to stop. You know, you talk a lot in the book about how, if I go read a research paper or something, I can’t go read all the citations.
[00:09:00]: I mean, I can’t do it, right? So I’ve got to sort of pick and choose how deep I want to go on different things and decide when I trust. And I think that “what’s missing” is a little bit like that too. How do I not cross over into confirmation bias and motivated reasoning when I go find that “what’s missing” piece, and just start confirming my priors? How do I find that balance?
****: And I don’t know if that’s, uh, something that maybe you can kind of talk us through a little bit.
****: Chris Chabris: Sure. So the point we’re trying to make in the first chapter of the book is that one of the mental habits we have that normally serves us well, and makes it possible to go about life in the world, as complicated as the world is, is this habit of focus.
[00:10:00]: So we’re very good at focusing on whatever information is in front of us and processing it very deeply and elaborately. Just to take an obvious example, that’s the way we’re able to follow, you know, a football game or a soccer game or any basketball game, right? There’s all kinds of people moving around.
****: There’s a ball. There’s all kinds of distractions. But by focusing, we can follow the action quite well and see a lot of it. Of course, we may miss some very interesting things when we’re focusing on where the ball is or who’s got the ball or whatever. But we do a lot better than if we had no ability to focus at all; then it would all just be a big confusion.
****: But when we focus, we have this unfortunate tendency to not even bother to think about the fact that we’re focusing. We don’t sort of think about what else we would see if we looked elsewhere, what other information we should consider if we could get it, what other perspectives might be useful before we make a decision.
[00:11:00]: As Danny Kahneman puts it in his book Thinking, Fast and Slow and elsewhere, we sort of operate on this principle of “what you see is all there is.” We like to make decisions just with what’s in front of us. Manipulating focus, and relying on people not to ask what’s missing or what else they need to think about, is really a prime tool of people who are trying to get us outraged.
****: You know, someone who wants to get us outraged wants to point us to some particular piece of information, or set of information, or idea, and get us to focus really hard on it and think about it a lot, so much that we get emotionally worked up about it if we regard it as troubling. And they don’t want us to think about whether these are really representative examples of the trends and problems and goings-on in society that this person is interested in.
****: Are there any other experts who disagree with this person? Are there any experts, for that matter, who know about this thing and have studied it more deeply? All kinds of questions about what’s missing. Like, what am I not being shown, right? What is the person transmitting information to me not telling me about that might be useful?
[00:12:00]: And, you know, that’s in a way what magicians do. But with magicians we sort of buy into the game they’re playing. We shouldn’t buy into it when people are trying to mislead us or misinform us or provoke us in that way.
****: David: Yeah. And what concerns me there, to dive a little deeper on that, is that it sounds a little bit like the “do your own research” crowd, right?
****: And so I kind of want to make sure we keep that distinction going.
****: Chris Chabris: Well, sure. I think simply saying “I’m going to get more information” and indiscriminately, like, typing something into Google and then believing whatever you see, or going to your friends and asking what they think, or something like that.
[00:13:00]: Those are obviously bad ways of seeking additional information. So I guess in the book we talk quite a bit about not merely seeking more information, but seeking it in certain specific ways. So, for example, suppose you’re given some positive examples of some phenomenon. And by positive, I don’t mean, you know, emotionally good examples.
****: I mean positives in the sense of cases where something unusual happened and there was an interesting consequence, or something like that. You should keep in mind that there are other possibilities, right? So imagine sort of a two-by-two grid, where the upper-left box holds the times when the unusual thing happened and the interesting consequence also happened. That’s what people talk about a lot.
****: Now, maybe I’ll give a more concrete example in a second. There are other times when that thing happened but it wasn’t paired with the same outcome, or when that same effect happened but it wasn’t paired with the same cause. A good example of this might be a politician who
[00:14:00]: is, you know, trying to get people outraged about immigration. They might point to cases where immigrants committed crimes, probably heinous crimes or something like that. And they’ll tell you a bunch of stories about that, or maybe just one very vivid story. But they won’t tell you
****: how many times the same kind of crime was committed by non-immigrants, or how many immigrants didn’t commit that crime, or, for that matter, everything else that there might be. So you could get a completely distorted and outrage-provoking notion of the frequency of this co-occurrence, immigrants and crime.
****: And so the first thing you should be seeking, when asking what’s missing, is: what’s going on in those other boxes? We call this the possibility grid in the book. We’re often directed towards only one possibility, immigrants committing crimes in this case, and we’re not told anything about the others.
****: So that’s the first place you should look. Beyond that, I think you should look for people whose opinions differ. You should really try, I know it’s hard, but you should really try to ask what the other side says about the same thing. What do neutral fact-finding groups, who compile statistics and so on,
[00:15:00]: say about the same thing? If you don’t want to listen to what the other party says about immigration, you could maybe look at crime statistics. Those are published; there are research papers about them. There are many sources of information that aren’t the first thing to pop up when you type something into Google or ask your friends, who probably already agree with you on a lot of things.
****: David: Right. And I should have mentioned that about the book earlier too. It’s not super short, but it’s not super long either, and it’s a really easy read. There are great examples, fun stories. And each chapter seems to break down into about one Peloton ride.
****: So that works well.
****: Chris Chabris: Good tip. Yeah.
****: David: So it makes it go well. You know, chapter four seemed kind of related to what you were just talking about there, which is efficiency. You talk about how we don’t want to fall in love with this new information we get, and I think that ties a little bit into what you were just saying.
[00:16:00]: Chris Chabris: Yeah, there are some interesting studies, some of my favorite cognitive psychology and decision-making studies, where participants were divided into two groups. One group was given, let’s say, three pieces of information relevant to making a decision, and the other was given two of those pieces of information and asked whether they wanted the third.
****: So the only difference between the groups was the option to acquire the third piece of information or proceed without it. And of course not everybody asks for the third piece of information, but the people who do ask for it now have exactly the same three pieces of information that the ones in the other group were given for free, without even asking.
[00:17:00]: And yet what happens in some of these experiments is that the third piece of information is weighted much more heavily in the decision-making process by the people who asked for it. And these are, by the way, often experiments with professionals who are supposedly used to making rational, expertise-driven decisions based on the information they have, and also used to the process of getting more information before making a decision.
****: And yet, even so, they’re sometimes overly influenced by information that they had a role in acquiring. And this is the most minimal role you can have in acquiring information: all you basically have to do is say, yes, I’d like to have it. Imagine how much more you might value the information if you had had to work to get it.
****: I mean, even just typing in a clever Google search, or going and looking for something. So it’s important, when you do seek more information, not to assume that the new information you got is the most critical thing. I should say that often, I think, when we do seek more information, it’s probably because we’re a little bit on the fence to start with, especially in, like, a consumer decision, right?
****: Like, we’re not sure whether we want product A or product B. We get one more bit of information, and that’s a little more rational, because if we were already sort of indifferent, well, one more piece of information might really change our minds.
[00:18:00]: But if we’re trying to make a rational or considered decision about some policy choice or some investment or something like that, where a lot is at stake or where we really want to get it right, we have to be careful not to overweight whatever we went out and found ourselves.
****: David: Yeah, I think we’ve all had a boss where whoever talked to him most recently is what we’re doing. So everyone’s looking around going, who talked to him last? Why are we doing this crazy thing? Who do you go talk to? It’s like, I’ve got to talk to him last.
****: Chris Chabris: Yeah, exactly. I mean, that’s the result of another cognitive limitation we have that
****: also makes it easier for people to scam us and cheat us, which is just limited memory, right? Our memories are considerably worse in a number of respects than we realize. And one of them is that we often tend to overweight what we recently acquired, kind of like we overweight what’s right in front of us or what we had a role in acquiring.
[00:19:00]: It’s all sort of related to the same contrast: we have these cognitive limitations, which are sort of necessary in order for us to move forward in the world, since we can’t exhaustively analyze everything, and yet at the same time we often act as though we’re not aware that we have them.
****: We think we have more information than we do, we think we’re taking an unbiased view, et cetera, et cetera.
****: David: Yeah, exactly. Well, that kind of brings me to something I wanted to talk about a little bit. It comes up in this book, but it was sort of a feature of your previous book, The Invisible Gorilla.
****: And I want to talk about it a little bit because I think this idea persists: this idea that our memory kind of works like a hard drive, that we just turn on the switch and it records everything, and we play it back and get a perfectly good version of it.
****: Yeah. And maybe you can talk us through that a little bit.
[00:20:00]: Chris Chabris: Well, yes, that’s absolutely true. In The Invisible Gorilla we had a chapter illustrating many ways in which our memories are more fallible than we realize. And we actually did a pair of public opinion surveys around the time we were writing that book, where we asked a representative sample of the American public:
****: do you agree with the statement “memory works like a video camera, accurately recording things so we can play them back later and see them”? That wasn’t the exact wording of the question, but it had that sentiment, with the video camera analogy, or, as you put it, a hard drive. And a clear majority of people agreed with that statement, even though almost every memory expert you would talk to would completely disagree with it.
****: In fact, that’s almost the fundamental finding about memory: that it doesn’t work that way. That’s sort of the fundamental fact that we’ve come up with in about 150 years of scientifically studying memory. And so it’s one thing that our memories are limited; it’s another thing that we act as though they aren’t.
[00:21:00]: And so you can get all kinds of interesting, and somewhat troubling, phenomena from this. One that we talk about in the new book, Nobody’s Fool, is the Mandela effect. The Mandela effect is this sort of belief that, if we remember things from the past that don’t seem to be correct according to the evidence we can find right now, it’s plausible to think that our memories are actually correct and the evidence has been changed to cover it up.
****: So this arose when a woman had a memory of Nelson Mandela having died in prison in South Africa in, I believe, the 1980s. And she couldn’t understand why this was not in the history sources. And it might have dissipated, she might have moved on, if it weren’t for the fact that she found other people who had the same false memory.
[00:22:00]: Once you have a community that sort of reinforces a belief, even a belief as outlandish as “my memory is correct, and the entire internet, and all the books in the library, and all the newspapers, and all the video, and all the other people who have a different memory, are all wrong,” this can create a very strong commitment.
****: You know, this belief that my memory is infallible, or at least must be accurate on this point, can get so strong that you’ll invent all kinds of weird beliefs to justify it.
****: David: Yeah. You know, I always worry anytime people start talking about their convictions, or they get too committed to an idea. That’s when you start to not be open to new information.
[00:23:00]: Chris Chabris: Yes. And in the book, as I may have mentioned, we refer to those as commitments. We tried to coin this term, “commitment,” for a sort of assumption that is not necessarily true, but once you believe it strongly enough, it sort of controls where your reasoning can go from there.
****: It really highly constrains your reasoning, so much so that you wind up in cul-de-sacs, like forked timelines and erasing all the library books and so on.
****: David: Yeah. Well, I want to make sure we talk a little bit about some of the stories. One of the stories you tell, I think it was an intentional research project, they invented this whole thing, and as soon as I heard the mention of the name Freedonia, I knew immediately that the thing was fake, because I was like a Groucho Marx freak when I was a kid.
****: So I knew right away that that was fake. But if you didn’t have that context, that might not happen. So can you tell a little bit about that story?
[00:24:00]: Chris Chabris: Yeah, that’s a great story. It wasn’t really a research project; it was actually kind of a journalistic hoax. There used to be a magazine called Spy magazine, which some people may remember, that, among other things, specialized in occasionally hoaxing politicians and celebrities.
****: And in this case, so I wouldn’t really call it a research project, but it sort of wound up being an interesting demonstration of this phenomenon. They asked a bunch of newly elected congresspeople, I believe around 1993 or so, a question like: what should the United States do about the ethnic cleansing in Freedonia?
[00:25:00]: And Freedonia is a fictional place; it’s actually from a Marx Brothers movie from the 1930s, I believe. And yet a very large number of incoming US congressmen gave a completely coherent answer as to what they thought should be done about the ethnic cleansing in Freedonia. It could be that they were thinking of Bosnia, or Somalia, or some other place where there were issues of ethnic rivalry and ethnic cleansing and genocide.
****: But it was quite impressive how they just went with it. And that kind of problem can often arise from hearing something familiar and sort of assuming that we know about it or understand it. It rings a bell, and therefore we’re just going to go ahead and respond to it.
****: So, Freedonia, maybe people had heard of it. It kind of sounds like it could be a country or a province or something like that. Maybe they’d heard the name because of the movie. And I think there’s actually even a town in New York called Fredonia, but probably not one where ethnic cleansing is or was happening.
****: Right. So, yeah, it’s a great example of that kind of thing.
[00:26:00]: David: Yeah. So, jumping around a little bit, I thought something that also seems to apply to the whole outrage thing, you know, people trying to provoke outrage, is this kind of “could this be true?” or “can I prove it false?” kind of thing.
****: Because often it seems like these stories, and maybe it’s a mix of many of the things going on in the book, but that one seems to apply there sometimes too. You get these stories, and even if a story is true, sometimes it’s still not a good representation of reality.
****: Chris Chabris: Yeah. One point we do make in the book is that successful long-running frauds, scams, influence campaigns, and so on probably exploit a lot of the cognitive habits that we talk about in the book. Even something like Bernie Madoff’s financial fraud, it had a lot of different facets to it, but it really also exploited a lot of the tendencies we have.
[00:27:00]: I mean, it’s not like Bernie Madoff set out to check off a checklist of people’s psychological flaws and figure out how to do something that maximally takes advantage of them. He just, through his own intuition and cleverness, wound up doing that. Now, one aspect of this whole business of people putting out false information that often makes it more successful is making it precise and concrete.
****: You know, we all know that specific stories are very good at persuading people and, often, at stoking outrage. The concrete and the specific, names, faces, physical actions and so on, all probably activate different parts of our brains and create multiple routes to getting things into our memory.
[00:28:00]: Whereas abstract arguments and abstract concepts are a little weaker at doing that. So as soon as you also put numbers on something you’re complaining about, or trying to get people to care about, that also helps, because then it becomes more precise.
****: The idea of precision is that it makes sense for us to put more belief in precise and concrete information. The more precise a prediction a scientist can make, for example, probably the better they understand the phenomenon they’re modeling. Physics is really good at this, right?
****: You know, we know exactly how long it takes certain things to go around the sun, down to the microsecond and so on; they have a really good model of how these things work. The problem is, the fact that their model led to precision doesn’t mean that whenever we encounter precision, it reflects someone having a really good model or a good understanding of things. People can make absurdly precise claims without having any idea what they’re talking about, or even as part of a deliberate attempt to deceive us.
[00:29:00]: So some of the hallmarks of persuasion attempts that we should be especially concerned about are ones that involve a lot of concreteness and precision, and ones that involve claims of large effects from small causes: if we just allow this small thing to happen, then everything’s going to change as a result.
****: You know, our civilization will be over. I could throw in more political phrases along those lines. Those are some of the hallmarks. And consistency and repetition are also critical in this. The more someone says the exact same thing, I think, the more we should be worried that they’re trying to convince us of something that might not be correct.
****: Again, there’s a logic to repetition increasing our trust. It’s called the illusory truth effect: repetition increases our belief that something is true. And there’s some logic in that; for example, if we heard something from many independent sources, it would make sense to increase our belief in it.
[00:30:00]: But if the same source keeps saying it over and over again, that is really something we might wanna watch for. Mm-hmm.
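Chabris's point about independent sources versus one repeating source can be sketched as a toy Bayesian update. The numbers here are illustrative assumptions, not from the book: each independent report multiplies the odds, while repeats from the same source carry almost no new evidence.

```python
# Toy model of why repetition from independent sources should move belief,
# but repetition from one source should not (hypothetical numbers).

def posterior(prior: float, likelihood_ratio: float, n_updates: int) -> float:
    """Update a prior belief n_updates times with the same likelihood ratio."""
    odds = prior / (1 - prior)
    odds *= likelihood_ratio ** n_updates
    return odds / (1 + odds)

prior = 0.10  # initial belief that the claim is true

# Five independent sources: each report is fresh evidence (3x odds each).
independent = posterior(prior, likelihood_ratio=3.0, n_updates=5)

# One source repeating itself five times: only the first report is
# evidence; the repeats add (almost) no information.
repeated = posterior(prior, likelihood_ratio=3.0, n_updates=1)

print(f"after 5 independent sources: {independent:.2f}")
print(f"after 1 source, repeated 5x: {repeated:.2f}")
```

On these toy numbers, five independent confirmations push belief above 0.95, while one source's first report only moves it to 0.25 — the illusory truth effect is our brains treating the repeats as if they were fresh confirmations.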
****: David: Yeah, and that reminds me of another thing from the book that seems to apply a lot. I see a lot of people finding themselves really downcast, with a bad outlook on life, looking for all these worst case scenarios. You talk in the book about being wary of worst case predictions, and how there's almost always an even worse case. I don't want people to take that the wrong way, but in a sense we can find ourselves focusing on things that maybe aren't gonna happen.
[00:31:00]: Chris Chabris: Yeah, that's true. So a good model about something, or a good theory or a good forecast, will give a range of possible outcomes, and maybe even a weighted distribution: this is most likely, this is less likely, that's even less likely, and so on. But we don't think about those things very well, right?
****: We think better about precise, specific, concrete things. So what's the worst case scenario, right? It's not a dumb question; it's rational to wanna know what's the worst that could happen. But the worst case scenario is easier to remember than the fact that it's a one tenth of one percent possibility.
****: And 90% of the possibilities are much better than that. Or it may anchor you: you may actually believe that that is the worst, when in fact it's only the worst case as predicted by some kind of model or forecast. There are even worse things that could happen, especially at times of great uncertainty.
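The weighted-distribution point above can be put in numbers. This is a toy forecast with entirely hypothetical probabilities, just to show how the memorable "worst case" can sit on a tenth of a percent of the probability mass while most of the distribution is far better:

```python
# Toy outcome distribution for some forecast (all probabilities hypothetical):
# the vivid worst case is easy to remember, but carries tiny probability.

outcomes = {  # outcome -> probability
    "catastrophe": 0.001,  # the memorable "worst case": one tenth of 1%
    "bad":         0.099,
    "mild":        0.400,
    "fine":        0.500,
}

# A well-formed distribution sums to 1.
assert abs(sum(outcomes.values()) - 1.0) < 1e-9

much_better = outcomes["mild"] + outcomes["fine"]
print(f"P(worst case)   = {outcomes['catastrophe']:.1%}")
print(f"P(much better)  = {much_better:.0%}")
```

On these made-up numbers, 90% of the probability mass is in outcomes much better than the worst case, which is exactly the fact that's harder to remember than the worst case itself.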
[00:32:00]: What's the worst case scenario of the war in Ukraine? Well, it could go pretty far, right? We're generally pretty bad at forecasting things, but I think we're even worse when we try to give precise forecasts. That's what messes up psychics all the time, right?
****: They say, oh, I know exactly where your child is. And of course they don't know, but that makes them more convincing. People pay attention to them because of that, but at the same time, they're always wrong because of that too.
****: David: Yeah, and I think you're right about these complex situations. This was obviously a challenge with something like COVID, especially in those early days, trying to make predictions, and then in hindsight saying all those predictions were terrible.
****: It's like, well, what did you expect? Of course they were terrible.
****: Chris Chabris: Yeah. I think the COVID revisionism, I don't know if that's really a term, but I'll make it up, sometimes suffers a little bit from a memory problem, in that people don't remember what the big issues were at the time.
[00:33:00]: And one of the big issues at the time was overwhelming the healthcare system, as well as the spread of disease itself. But there was also genuine uncertainty about how serious this virus could be. It's easy to say that hindsight is 20/20, but especially in these kinds of cases, I think it's dangerous to look back and say, oh, well, in light of what we know now, we should clearly have been doing things differently way back then.
****: That's too much. That's too much confidence, right?
****: David: Right. Yeah. So I don't know if there are any other tie-ins you think would be relevant that I kind of missed. I have a long list of other things we could talk about, but I don't know if you've thought of any tie-ins back to some of this media reporting and sensationalized stuff that's always trying to get us riled up and stoke our fear and anger.
[00:34:00]: Chris Chabris: Well, I think the concept that we sort of start the book with, which in one sense is kind of obvious once you hear it, but is actually maybe more counterintuitive than we realize, can do a lot to help us understand why it's so easy to stoke outrage.
****: And that concept is truth bias. It's a very simple idea: when we process incoming information, our tendency, our habit, the way our brains are designed somehow, is to tag it as true as soon as it comes in. Retagging it as false, or uncertain, or unknown, or needs checking, or any other label you want to give it, takes more time, effort, and sometimes discipline.
[00:35:00]: So if you're in a hurry, if you're distracted, if you don't have time to think about it very much, it may feel to you like you're not assuming it's true, but there's a bit of a bias to think that it is, and that will make it that much easier, the next time you hear it, to think, oh yeah, that's true.
****: It can potentiate that illusory truth phenomenon if you automatically tag things as true the first time, or the second time, or the third time they come in. And again, like all the other stuff we talked about, there's a reason for this. It's not just a bug in the system.
****: One possible reason for it has to do with communicating with other people and coordinating actions. If you constantly distrusted everyone and didn't believe anything they said, you would suffer some kind of cognitive overload trying to check everything, or be paralyzed into inaction because you wouldn't have any beliefs, right?
[00:36:00]: You need beliefs, you need some propositions in your mind, to take action. And if you were constantly questioning all of them, or maintaining uncertainty about all of them, you probably couldn't take action. So anyone who's trying to get us outraged, trying to stoke division and so on, can also count a little bit on the fact that no matter how outrageous the thing they say is, for at least a brief moment we will hold it as true.
****: And I have to say, I've experienced this personally. There have been some things in politics that have seemed to me like the most ridiculous thing in the world, but once you hear them five times, the truth bias and familiarity and the illusory truth effect and all of that start to make you take them more seriously.
****: Like, oh, maybe that could be possible, or something like that. And I think, unless you really try to introspect a little bit on your thought processes: how did I come to believe this? Who did I hear this from? Where did that information come from?
[00:37:00]: It really does take some effort to do that, but it's rewarded effort in the information environment we're in nowadays, where there's so much stuff coming at us that could be misinformation, disinformation, propaganda, and so on, from more sources, and through channels that normally we would trust.
****: Like our friends, right? Our Facebook friends, our other friends, our relatives, and so on. It really pays off even more to be aware of that bias and to try to check it a little bit.
****: David: Yeah, and it's always hard to find that balance. Like you said, there are only so many hours in the day, so you can't check everything.
****: You need to find that balance to be able to do it.
****: Chris Chabris: Yeah, you've gotta budget your checking to what's really important. So, you know, you might get scammed by a mispriced item in the supermarket; don't worry about that. But don't get scammed by your financial advisor, and don't get scammed by a political candidate who's telling you all kinds of lies.
[00:38:00]: Don't get scammed by claims that something is happening somewhere far away that you don't know anything about, but that you believe just because the people saying it are related to you, or familiar to you, or on your side, or something. Those are the times to really go into it more.
****: David: Right. That's a good point. Weighing the consequences.
****: Chris Chabris: Yep. Yeah, exactly.
****: David: All right. Well, I really wanna thank you again for making time. I really enjoyed our conversation, and I enjoyed reading the book, and I think our listeners will as well.
****: Chris Chabris: Well, I enjoyed the conversation too, and I hope everyone does go and check it out. Take care.
****: David: Bye-bye. Bye.
[00:39:00]: That is it for this episode of the Outrage Overload Podcast. For everything we talked about on this episode, visit outrageoverload.net. Before we go, I have a quick favor to ask. You know, reviews mean the world to us podcasters. They help us reach more listeners and continue bringing you thought-provoking content.
****: So if you have a moment, I'd be thrilled if you could head over to podchaser.com and leave a review. I've made it super easy for you: just visit podchaser.com/outrageoverload and let me know what you think of the show. There's also a link in the show notes. I read every review, and your feedback truly matters.
****: Until next time, stay curious. Stay kind.