Misinformation

Nicole: [00:00:00] Hi everybody, and welcome to the Woman-Centered Health podcast. Today we are speaking with Dr. Sara Gorman, a public health and social science expert and CEO of Critica, about misinformation. Sara is also the author of Denying to the Grave: Why We Ignore the Facts That Will Save Us, with the second edition released in 2021, and she is currently writing another book examining the relationship between healthcare access and mistrust and the rise of conspiracy theories in the medical sphere. So we're probably gonna have to have you back on, Sara, to talk about that book too. But before we start our interview, we wanna thank all of you for listening and let you know you can earn CEs and get key takeaways, resources, and transcripts by visiting our website, woman-centeredhealth.com. Also, Stephanie and I put this podcast together in our free time and often use our personal funds, so please consider supporting us if you can. You can do that by subscribing and giving us a five-star rating on iTunes or wherever you listen to us. [00:01:00] And if you're able to offer a financial donation, visit our website and click the Support Us tab. All right, let's meet our guest.

Stephanie: So, hi Sara. Thank you so much for being a guest on our podcast today. First, could you provide a little bit of detail about your background?

Sara Gorman: Sure. First of all, thanks so much for having me. It's a pleasure to be here. I like to say that I got interested in misinformation before it was cool. About 10 years ago, I became very interested in the anti-vaccine movement and the reasons why parents were not getting their children vaccinated against measles, mumps, and rubella. And at the time there was not very much psychological literature about why this was happening. Most studies on this focused on how parents were maybe missing some of the facts about vaccination and why we needed to make the threat of some of these diseases more salient to them. And I really thought that these people, many of them, were very well [00:02:00] educated, had access to good health information, and in many cases explicitly knew the correct information about the vaccines, but still were resisting this health technology. So I set out to understand what are the psychological underpinnings that really prevent someone from following health recommendations that are based in evidence. And as I went along, writing articles about this topic and getting a lot of responses, I realized that there was a book about this topic that focused not just on vaccines but also on gun ownership, antibiotic overuse, and many other topics, including climate change, that are in Denying to the Grave. So I wrote my book, Denying to the Grave, which came out in 2016. And subsequent to that, people were really asking me, what are you going to do about this problem in real life? And so that's when I founded Critica, to focus research on understanding how we can counteract misinformation in health and science, and also how we can help people make healthier [00:03:00] and more evidence-based decisions about their health and wellbeing. And now I'm working on a new book, as Nicole mentioned earlier, that's really focused on some of the structural underpinnings of the crisis around trust in healthcare in our country now.

Stephanie: That's awesome.
The next question, you might have answered part of that, but I'd like to ask it again in case you have anything else to say. We always ask our guests what informs your perspective. So in other words, why do you do what you do, and what is most valuable to you?

Sara Gorman: I do think one of the reasons I got so interested in the psychological underpinnings of the misinformation movement is because, for one, I have two psychiatrist parents, so I grew up in a household where we were always really thinking about what was behind every action and thought that anyone ever had, and it just came naturally to me to think about what is really behind this on a psychological [00:04:00] level. So I think that's definitely one factor. I would also say I'm formally trained in public health, and a lot of the work I'm doing now, both at Critica and in my new book, has to do with looking at not just misinformation, but the information environment. So what are people being exposed to, and how does that differ across different demographics, different populations of people across the country, and different communities online? Where are people in bad environments where they just don't have access to good information, for whatever reason? Where are people being targeted by disinformation peddlers? And how does this look as a system, not just on an individual level? That is a very public health approach to a problem, so I think a lot of my thinking developed from my training as well.

Nicole: I feel like I can only imagine what that would be like, to be raised by two psychiatrists.

Stephanie: I know. I was like, that's a lot to unpack.

Nicole: I know.

Sara Gorman: Yeah. [00:05:00]

Nicole: That's a whole conversation, cuz that is just so not how I was raised.

Stephanie: Same.

Sara Gorman: Yeah, we can definitely talk about that too. At one point, it was interesting.

Nicole: Well, I think we need to loop this in. I forgot to ask this question, so I'm gonna segue and then I'm gonna ask this question before we get into our other stuff. Okay. So, like we said, today we're gonna talk about misinformation, slash jumping right in, and I realized that we forgot to include a question about Critica, which also links back with your parents. So I would love for you to make a plug for Critica and share with our listeners what it is, the work you're doing with Critica, and then how that relates to your dad, especially.

Sara Gorman: Sure. So Critica is a nonprofit research organization that focuses on finding new ways to counteract misinformation, with a heavy focus on online environments and social media, and also on helping people make more evidence-based health [00:06:00] decisions. So we do a lot of research, as I said; we're funded by big foundations in the US to look at these problems and to actually try out interventions online and evaluate them. We also spend a lot of time writing commentaries for public consumption about issues where there are debates or confusion about the evidence. And I would say the connection to my dad is that I actually wrote my first book with my dad, and we founded Critica together as a result of that. He's been doing a lot of work for the organization while I've been very busy writing my other book, which is a solo venture, and now that I've finished a draft of my book, I've come back in and I'm taking over more of the direction of the organization. But it's just been a pleasure working with him.
People always ask me, how does that go? Is that difficult? And it really isn't. He's a very responsible, responsive person, which is great, because you never know, even with a family member, [00:07:00] and he's very easy to work with. He also just happens to be completely brilliant. I think everyone agrees with that. He is very creative and has a lot of experience from his career working on grants and writing grant proposals. So he's been obviously just an invaluable person to work on this with. And we've grown a lot. We have many other staff members now besides the two of us, but in the beginning it was really just us meeting at my parents' house on a Saturday morning, just sort of strategizing about what we were gonna do next.

Nicole: I just so, so love that. And for our listeners, obviously it sounds like what you're doing is very relevant to them. What type of stuff could they use from Critica, or how can they maybe incorporate what you're doing with Critica?

Sara Gorman: I would say if you are a member of the general public who is concerned about this issue, you should check out our commentaries. Our website is criticascience.org. You can check out our commentaries; we publish three a month, [00:08:00] and we also publish something every month on Psychology Today. And you'll see that there are a lot of issues discussed there that you may come across in talking to other people, where there are debates about the COVID vaccine and its safety, about abortion, about any topic where there is some contention around the science and people are sort of not agreeing with the evidence. So you can find information there that I think is useful as talking points. If you work in the public health field, there's a whole set of opportunities with us, because we actually train and offer technical assistance to health departments and people in the public health field to be able to evaluate information environments, deal with misinformation in their careers and at their jobs, and incorporate more behavioral science to create better health for anyone they work with, whether it's the general population or a client base. And we help [00:09:00] people evaluate and understand what interventions they might need to put into place and how to communicate better with the general public about health. And then we offer a lot of handholding after that.

Stephanie: So another thing that we didn't mention, that I'm gonna throw in on top of this, is how we know Sara. And that is through, well, now it's called Those Nerdy Girls, but it was called Dear Pandemic. And Nicole and I kind of got looped into this group because I followed Dear Pandemic during the early parts of the pandemic and would send their posts to, like, everybody I knew. And then with the Roe v. Wade, the Dobbs decision, they asked for some reproductive health experts, and that's how we got looped in with Sara. Nicole and I have each written one post. We're not as active as Sara is, but that's just another venue to get science-based [00:10:00] information into the hands of the people. And I'm sure, now that we're thinking about it, we'll add interviewing its founders to our to-do list. But Sara's just the first of probably many guests that we'll have from Those Nerdy Girls.
Sara Gorman: Yes, they post every day, so they have a lot of content on a wide variety of topics. I'm very focused on COVID and infectious diseases, but as Stephanie said, also reproductive health and some other areas. So it's definitely something to check out.

Stephanie: Yeah. And they're on all social media, basically, Those Nerdy Girls. Okay, Nicole.

Nicole: Now we're actually gonna get into the questions we planned to ask you. Thank you for playing, you know, impromptu interview with us.

Sara Gorman: My pleasure.

Nicole: So there is clearly a lot to unpack and discuss about misinformation. So let's start out kind of broad. Can you share with our listeners: what is misinformation?

Sara Gorman: The best [00:11:00] formal definition I've seen of misinformation is information that is false, inaccurate, or misleading according to the best available evidence at the time. That last part is something that was actually added to the definition later. It's very important, because the evidence, as we know, especially with something like COVID-19, can evolve. So misinformation needs to be understood in the moment, in terms of what's evidence-based and what is deviating from that. There's also something called disinformation, which probably a lot of your listeners have also heard of, which has to do with the intentional spread of false or inaccurate information. And in those cases, we do see, as I said earlier, that there are certain vulnerable communities in the US population that get targeted by people who want to spread lies about health and science. In the case of reproductive health, to use just [00:12:00] one example, we've been hearing a lot lately about the population of Latinx women who are being really targeted by misinformation about reproductive health and abortion and related topics. So we do see that disinformation is an intentional way to dissuade people from believing the facts and the evidence about something, and there can be very concerted, targeted efforts to get at certain populations.

Stephanie: Can you talk a little bit about that? What kind of misinformation is being spread to Latinx communities about reproductive health?

Sara Gorman: So some of the most common ones, and these things happened shortly after the Supreme Court decision, are things like: abortion is bad for your health, so if you get an abortion, you're more likely to get breast cancer or some other kind of cancer, which we know is not true. Another dangerous one is that [00:13:00] abortions are reversible, so people will be told to try different treatments that are basically made up, and they're sold these treatments. So there often is a profit motive for disinformation, and it can be dangerous depending on what people ingest and take after they've had either, you know, a medication abortion or another kind of abortion. So those are some of the big ones that have come out just in the past few months, and they've been targeted not just at that population, but we do also see an absence of fact-checking sources in Spanish. So it's harder for Latinx populations to have access to the correct information than it is for people who speak English more fluently.

Stephanie: I had heard those, so I was just curious if there was something else. Okay, so we have this newer concept that Nicole and I have heard recently called infodemics. Can you share with our listeners: what is [00:14:00] an infodemic, and how does it relate to misinformation?
Sara Gorman: One of the key features of an infodemic has to do with the amount of information. Usually "infodemic" is used to refer to a very high volume of new information coming at the public at a fast speed. And this was obviously very true during the COVID-19 pandemic. It's not necessarily dangerous in and of itself, but the problem is that it's too easy for people to twist that information, and it's too hard for the public health authorities to process all of that information quickly enough to communicate it to the public in a way that's understandable. So what you get is an immediate onslaught of all this information. There tends to be a lot of confusion because it's not translated well for public consumption. And then that creates an opportunity or an opening, especially for people who want to spread misinformation, [00:15:00] for those disinformation peddlers, to create stories based on that. When there's a lack of understanding, there's uncertainty, and there are questions that aren't being answered, that creates an opening for misinformation. So part of what I tend to advocate for is for there to be infrastructure in this country to actually do surveillance ahead of time, to understand what kind of information is going to be coming at people, and can we try to make it understandable in advance, or at least give people some sense of the boundaries of what that information is. Because the way we do it now is just unsustainable, and the public health community can never get there quickly enough.

Nicole: And I don't know if this would be helpful too, but is an infodemic all information, via, like, social media? Is there a particular avenue? Is it print, word of mouth? What is included in that? And then, [00:16:00] I guess I'm curious too about the surveillance piece. What does surveillance of all that look like?

Sara Gorman: So the infodemic definition includes both online and offline information, but as you can imagine, people are mostly getting their information online, so that is an important place to focus. In terms of the surveillance, I'll give you an example. There's a new vaccine that's going to roll out soon in Sub-Saharan Africa, which is the malaria vaccine. This is a situation where we do know that this is going to happen, and there are places where there might be concerns or hesitancy about getting the vaccine. And so what we would like to see, ideally, is a little bit more social listening and understanding of what information is and isn't being passed around about the vaccine, and already some active surveying to get a sense of what people do and don't know about it. [00:17:00] And then interventions that include things like simple information: giving people the accurate information about how many doses it is, how efficacious it is, because these things could become distorted. So it's important to get those out early and repeatedly. But then also this concept of inoculation or prebunking, where you can expose people to either the type of misinformation they might see about the vaccine, or just some rules of thumb about how people who spread misinformation tend to communicate and what their strategies are, so that people can be on the lookout. And that is actually an evidence-based method that has been shown in many studies to work: when people have some heads-up, it works like a vaccine.
They experience the misinformation in a diluted way, and they kind of have a chance to think about it and understand how to debunk it. And then when it actually approaches them, when they're [00:18:00] either stressed out, or not paying enough attention, or they're getting too much information, they already know how to respond to it, and so they're less likely to be taken in. So that's the sort of process that I'd like to see with everything. You know, we have an RSV vaccine that's going to come out soon. Are parents likely going to get their children that vaccine? And what concerns might come up about that vaccine that we could find out about now, versus waiting until it's out and nobody's getting it? In some situations it's harder to do that. Like, COVID kind of came out of nowhere, and it would've been hard to do advance surveillance, so to speak. But in situations where we know there's a new technology or even a threat on the horizon, I do think there's more we can do to control what could eventually become an infodemic.

Nicole: This is so fascinating. I'm loving nerding out about this. [00:19:00] Why do you think that health and science as topics are more prone to misinformation than other areas of knowledge?

Sara Gorman: That's a great question. I was actually just reading an article that said that in general, when you try to correct misinformation in any field, if you alert people to the credibility of the source, it doesn't help. So if you say, this person is giving you correct information and they're an expert, it usually doesn't work. But the exception is that it sometimes works better if it's a health or science topic, because there are very high levels of trust in doctors and healthcare providers, at least in this country. So that's the positive. In answer to your question about why health and science seem prone to misinformation, I think there are two big areas that experience a lot of misinformation: health and politics. I'm not gonna comment on the politics arena, because I'm not an expert on that, but in the health arena, and in the [00:20:00] science arena as well, there are a couple of issues, and this is a lot of what I uncovered in my first book. One is that there's a lot of uncertainty in health and science that's just inherent in the way that health and science proceed. It takes many studies, and people trying to falsify the findings over time, to come to a conclusion and a consensus. So part of the problem we have now is that people who are not trained in science are looking at single studies and not understanding whether that's in line with the consensus or whether it's an emerging area. They don't necessarily understand that they shouldn't make too much of one single paper. So that's one issue, I would say. The other issue is that human beings are inherently uncomfortable with uncertainty, and there's no answer that's a hundred percent in health and science, in part because we study things on a population level, but you're dealing with individuals, and those two [00:21:00] things aren't the same. So you can tell someone, in general, on average, this drug is safe, but in X percent of people there might be this reaction, and people don't know how to interpret that. What does it mean for them? So the translation there is very difficult, and there's always uncertainty.
At the end of the day, no medical doctor or any other healthcare professional can ever tell you that something is a hundred percent true or false. So you have to deal with knowing that you can't know for sure. And then I would say the way that studies proceed in health and science and the way our brains work are very much at odds. So like I said before, we have to sort of withhold judgment and wait until there are enough studies to build consensus. But human beings, when they see a conclusion to draw, tend to make up their minds quickly, and then they have trouble changing their minds. So [00:22:00] the problem is that you see a conclusion from one paper and you decide you believe X. And then another paper comes and says, actually that wasn't a good result, we did a better study, we think it's actually this. But by now your brain really has a tendency to hold on to whatever your original belief was. So your brain is not really compatible with the pattern that science uses to come to conclusions, and that is a big problem. And I think that, and the uncertainty issue, are probably the two biggest reasons why science and health have become such a target of mis- and disinformation.

Stephanie: Yeah, when the COVID vaccine first came out, I also really saw a lot of this, like, people saying they're just testing it on us, we're lab rats, yada yada. And part of what I saw is, you know, that has happened, particularly among Black and [00:23:00] Indigenous people in our country. And I feel like there's a group of us that really knows about this and what we've done to overcome some of those issues, but I don't think the general public knows that. So, you know, there is a level of mistrust, especially in certain communities, of scientists and physicians and other healthcare people.

Sara Gorman: I agree. And this is a lot of what my new book is about, the trust issue. I have a chapter really focusing on Black and Latinx populations in the US and the complex history with things like the Tuskegee syphilis experiment, but then also the things that happen every day with bias and systemic discrimination and racism at the level of the healthcare system. So it's very hard to say in those situations that you should just trust what the scientist told you. That's [00:24:00] not a sufficient strategy, because there's not just too much mistrust; there's too much of an issue with the way that the healthcare system has treated people. And I would also say that a lot of my book argues that issues with access to healthcare in any population have created a lot of mistrust. And so in some ways, rather than focusing on why individual people are against the healthcare system, the question is really: why has the system pushed people into a situation where they don't trust it? And that's a very different problem that requires a different set of solutions.

Stephanie: Love that.

Nicole: Girl, yes.

Sara Gorman: Oh good, I'm glad you like that, because that's my...

Stephanie: Yeah, I know. We love that.

Nicole: I feel like we could just end the podcast now. I mean, there's so much information that has happened, and then you just ended with the most perfect question. I mean, like, that's... [00:25:00]

Sara Gorman: You could put it at the end.

Nicole: Yeah.
I will, uh, for sure be checking out your book. And looping in with what y'all were talking about: I didn't read Medical Apartheid till after I had my PhD, and, you know, when you're getting your PhD we had to take these, like, principles of scholarly integrity courses, you know, how to not be a terrible researcher and do terrible things to people, and so many focus on Tuskegee as, like, the study. And in my mind I was like, well, that was the worst one and that must be the big one. And then you read Medical Apartheid, and that is merely, like, three pages in this, I don't know, 200-plus-page book, whatever it was. And I had no idea, again, to see such a blip in [00:26:00] the terrible history of things researchers have done. My new soapbox is that anybody in medicine or health should read that book, so you can understand, because it's so much more than Tuskegee. And it's what you said: in what ways is the system responsible? Wow, that was amazing. Thank you, Sara.

Sara Gorman: Of course.

Stephanie: Yeah. So let's talk about your first book. We would love to hear more about why we as humans ignore the facts that will save us. Could you talk about that?

Sara Gorman: There are several key reasons why we have that psychological predisposition to basically ignore a lot of the evidence. I go through about six of them in the book, and each chapter is another psychological or social phenomenon that somewhat explains why we ignore the evidence. And I think it really comes down to a few things. One is that our risk perception is [00:27:00] not, quote unquote, accurate. We don't perceive risk the way that scientists study it. So when we are communicated to about risk, it's often in a way that we can't conceptualize or that doesn't feel comfortable to our psychology. And it can be very subtle, but it makes a huge difference in the decisions that people ultimately make. So, for example, we have this thing called the availability bias: if something is very available and vivid in our minds, then it might seem riskier than something that we don't have the ability to picture or don't think about as much. A good example of this is why we are more afraid of airplane crashes than car crashes. Even though airplane crashes are much less common, the reason is that plane crashes get written up in the news, they're very dramatic, and people have a mental model for what that looks like. And that makes [00:28:00] it seem like it's more common, because it's very vivid in your brain and your imagination can really see it. And that creates more risk perception, versus something that you know is a problem but you tend not to see in the news and you don't think about too much. So that's an example where your biases can really skew your risk perception. And when it comes to making a decision about, say, a treatment or getting a vaccine, the same thing happens. People might have read a story in the news where somebody was injured by a vaccine, and they can't conceptualize that that's not common and that most people just get the vaccine and are fine. So that's a really big problem, and that's just one of the biases that affect risk perception. Another area is confirmation bias.
So you've probably heard a lot about this, but this is a little bit what I was talking about earlier when I said we can't change our minds. [00:29:00] And there's actually neuroscience to back this up: the fear centers of our brains are activated when we're faced with information that goes against what we believe, and the pleasure centers of our brains are activated when we see information that goes with what we believe. So there's a really strong incentive to push away what you don't believe in, which is basically what you do with confirmation bias. You just look for information that confirms what you already believe. And I would also say it's important to know that the fear centers of your brain actively suppress your prefrontal cortex, which is where you can do complex reasoning. So you can't even take in new information when you're having that fear reaction and actually try to adjust your viewpoint, because your brain doesn't even let you do that. So that's confirmation bias. And then I would say there are a few really strong social factors as well, because this stuff doesn't happen in a vacuum. [00:30:00] So there are leaders out there, some of the same people who kind of peddle the disinformation, who create groups of followers around certain health beliefs. Andrew Wakefield would be one of them, actually, who said that the MMR vaccine causes autism. He's somebody who's become a very charismatic leader and has a following. And whenever people join a group, they are very incentivized to stay in the group, and so they'll push away any kind of data or evidence that threatens the group membership. If the group membership is predicated on belief in a non-evidence-based idea, then that's going to take precedence over everything else. So I would say: social factors like groups and charismatic leaders; risk perception, which is at the heart of everything; confirmation bias; and then the other factors I mentioned earlier, around the way that our brains don't really gel with the way science proceeds.

Nicole: I feel like my brain's just [00:31:00] trying to digest this. Like, I love neuroscience stuff, and, you know, the fact that it suppresses your prefrontal cortex, so your brain doesn't even allow you to do that. I just...

Sara Gorman: Crazy.

Stephanie: That's why arguing with people on Twitter doesn't work.

Sara Gorman: Yeah, no, that's literally why. Like, they can't even process what you're saying.

Stephanie: Right? Yeah.

Sara Gorman: Yeah.

Nicole: Think about if we had this conversation a long time ago, you know, maybe two years ago-ish, and to be able to be like, it doesn't work, and how much energy we could have saved if we just knew that it doesn't work.

Sara Gorman: You gotta look elsewhere. Yeah.

Nicole: Oh, amazing. Okay, so I'm sure many of our listeners can easily discuss misinformation that they have heard in regard to the COVID-19 pandemic. But what are some areas within women's health where you have seen misinformation be particularly [00:32:00] detrimental?

Sara Gorman: So there are a lot of areas around pregnancy where people just have, I would say, weird ideas. They're not necessarily harmful, but people believe certain things would help their fertility where there's really no evidence base for that. And it's a very anxiety-provoking time, so it's a time when women are prone to falling victim to some of the misinformation.
Among the more harmful examples I've seen, there is sort of a movement that I've noticed against inducing childbirth. And I think part of it comes from the fact that that method is probably used too much, but there are obviously indications for doing it. And there are many tragic stories I've seen in the newspaper about somebody whose baby died because they refused to be induced when they were, like, 43 weeks pregnant. [00:33:00] And so in those cases, I would say that that is harmful. There's misinformation circulating about what induction really is and the risks associated with it that causes people to make those kinds of harmful decisions. I would say the other area that's very harmful right now is around abortion, and a lot of that is intentional, by people who have ideologies that are against abortion. There are a lot of different myths about abortion being bad for your health and dangerous in some way, and potentially causing things like cancer down the line. And then there's also a lot of misinformation around being able to reverse abortions, which I think can be dangerous because it can introduce people to methods that might not be medically safe. So I think those are some of the biggest areas I've seen. There are definitely other myths. There are a ton of [00:34:00] myths around, like I said, pregnancy, around menstruation, even around menopause, anything you can think of. There's a lot of misinformation. There's also just general confusion in the reproductive health area, because it's very complicated and there is a lot we don't understand. For example, there's a lot we don't understand about fertility, so it's not that surprising that people have basically made up ideas about how to improve your fertility, which probably don't harm very many people, but most of them are not really evidence-based from what we know either. So I would say reproductive health is kind of a special area because, A, it's complicated; B, there are political ideologies around parts of it; and C, it involves people who can become pregnant, and again, that can be a marginalized group in [00:35:00] society. There are societal issues that people have around women. And so I think it becomes a very vulnerable area for those reasons as well.

Nicole: Well, I know part of why Stephanie and I have this podcast, where we talk really about communication, is because so much, and maybe in some ways all, of what we know about sexual and reproductive health is communicated to us, or needs to be communicated to us, because we don't wake up knowing these things. We don't wake up knowing, this is my period and it's gonna come every 28 days, and this is how I would prevent a pregnancy. All this stuff is learned; it all needs to be learned and communicated. And so I could see where this would be an area ripe for misinformation and disinformation, because it needs to be communicated. And when you get in all these layers of politics, ideologies, you know, shame, stigma, all these things surrounding it, yeah, it definitely is an [00:36:00] area where misinformation could be really detrimental. That makes sense.

Sara Gorman: I would say we also have probably done a disservice to people by not teaching more about it in school. And that's another place where political ideology has taken over the scientific realm.
I write a lot about this in Denying to the Grave, and I'm actually doing some work with various education departments around science education and reforming science education, because there are a lot of ways in which we don't teach children how science really works, the uncertainties involved in it, what scientific consensus is, and really how to interpret scientific information, because everyone needs to know how to do that. And sometimes when we focus on teaching adults how to do that, I think that's harder. I think we're sometimes missing the mark a little bit, and we need to start when people are young. I think it's the same thing, like you said, with sexual and reproductive health. If people are learning about this from a young age, they may even approach some of the [00:37:00] political issues differently, because they have a better understanding of what this really is and how it really works. But right now, people are kind of in the dark, which I think is contributing to the situation we're in.

Stephanie: Yeah. And I think with that political piece, we have this issue of shame with sexual and reproductive health that we don't necessarily have as much of with, like, infectious diseases. I mean, I think the sexual and reproductive health infectious diseases do have some more shame with them. But there's the shame, like, we're not gonna talk about it, and then it really doesn't lend itself well when you do talk about it, with, you know, people's motives. And like Nicole said, you have to learn about sexual and reproductive health from someone. And if we're sort of all going around not knowing what we're doing and we're [00:38:00] shameful about it, it's just like a recipe for nobody's talking about it and nobody knows what's going on.

Stephanie: So, as you know, our podcast is all about communication, and we love to give our listeners some tips and tricks related to communication. So from your research and from your perspective, how can clinicians manage misinformation when speaking with patients?

Sara Gorman: We make the argument that clinicians should actually screen people about their information environments, the way they do for other social determinants of health. So understanding where people get their information from and who they tend to trust and not trust. I think that means having more of a system around asking those questions, even formal questions that clinicians could ask in a regular checkup, and then actually having some sources that are good for different communities, or that are good depending on whether the person's answer is "I tend to trust government sources" or "I tend not to trust government [00:39:00] sources." And then you might give them something like Dear Pandemic. So really tailoring the information you give them based on their comfort level with different kinds of sources and understanding how they get their information. I think that will also just help us collect more information in general about how people are absorbing what they're learning about health. In terms of actually having a conversation with somebody about misinformation, or about something they say that is not evidence-based, that's harder, because it usually takes time, and time is one thing that clinicians don't have.
But I do think that there can be more training around how to have at least a short conversation with somebody that's not just spitting the facts back at them, but asking a few questions: okay, where did you come up with this? How did you decide that you believe this? And is there any other information, or sources, that I could give you that might at least reopen your mind to thinking about this issue? [00:40:00] I also think that there's an opportunity to intervene in waiting rooms. People are just sitting there checking their phones, playing games on their phones. And if we could actually engage people in those settings to start thinking about, what do I think about this? What is my information environment like? How did I decide, you know, what are my feelings about getting X, Y, and Z vaccine? Rather than just walking in and the doctor saying, okay, you're gonna get this vaccine, and, you know, the person has a split-second reaction to that, yes or no. So I think there are opportunities in the clinical areas to make more inroads. I do think we probably need to ultimately reimburse clinicians to spend more time having longer conversations with people when there are misinformed beliefs. But the ideology behind what I advocate for those conversations is definitely empathy, meeting people where they are, and trying to [00:41:00] establish common ground. So as much as you can, understand their motivations behind what they believe, what they are trying to accomplish with their health in general or their family's health, and try to bring them back to those values when you ultimately make a recommendation about what they should do, because chances are you're somewhat aligned on what the value is: I wanna keep my children healthy, et cetera. The issue is that you're disagreeing about the way to get there. So bringing people to that, drawing out whether they have some ambivalence about believing misinformation, which most people actually do have and are not totally sure, drawing that out and helping them, you know, just come out of being in that mindset a little bit. I would say one other thing you should do is evaluate how entrenched the person is, because sometimes when the person is totally entrenched, it might not really go anywhere, and that's unfortunate, [00:42:00] but you can't expect people, especially a busy clinician, to always intervene in those situations. So some idea about where people are on the spectrum is probably helpful too.

Nicole: So there's a few things I'd like to unpack there. One, I'm gonna go back to the waiting room. So are you thinking, like, there would be posters? And if so, what kinds of things would they say?

Sara Gorman: So this is a relatively new idea. I actually saw a study that did something like this in emergency department waiting areas, which usually have much longer waits than waiting for the pediatrician or something, where people actually watched short videos with a little bit of a dialogue about, are you feeling a little bit unsure about a COVID vaccine? You might have heard this, you might have heard that. And then sort of talking about what some of the responses to that are. That primes the person to usually be a little bit more [00:43:00] open-minded when they go in and have a conversation with the doctor.
Or when the doctor asks them, would you like your COVID vaccine? And that was actually somewhat effective. They also matched, sort of, background, so if it was a Black person, they would see a Black clinician on the screen. And so I think it's something along those lines. It can't be too tailored to individual people when it's just posters; it would probably be posters in the waiting room, or handouts that people would see when they sit down, designed to get people to just take a step back and consider what they think, and here are some other ways to think about this. Just creating a reflective space. Because when you get people in that reflective space, it is easier to change minds, versus what tends to happen, which is rapid-fire facts back at them, which doesn't usually open up their minds at all.

Nicole: So what are some specific questions you could ask to [00:44:00] get that reflective space, or to gauge where someone is maybe on their hesitancy scale? Like, what are some actual questions you would recommend a clinician ask?

Sara Gorman: Well, first you might wanna actually ask, on a scale from, you know, one to five or something, how likely do you think you would be to wanna get the vaccine today? Let them answer. That gives you some sense, and it's better than asking yes, no, or unsure, because it just gives you more information. And then what you can do is ask the person, how did you first decide that you didn't think you wanted to get the vaccine? Because a lot of times people breeze right through how they came up with their ideas and never realize, oh, I was listening to this thing but I was really distracted, and now I'm not so sure. That would be the ideal-case scenario. It doesn't always happen, but in some cases it does happen where people say, oh, wait a minute. You know, they realize they're more unsure than they thought, in part [00:45:00] because the origin of their belief is a little shaky. So you get them to think a little bit about, to trace, the path that they're on. And you might ask them, you know, what are your goals with your health or your family's health? Why did you come in today? What are your goals with their health today? And then that's where there could be a little bit more talking on the part of the clinician, to say, you know, X, Y, and Z about the vaccine. It is okay to reinforce some facts about the safety and efficacy. I wouldn't open with that, and I definitely don't want that to be dominant in the conversation, but it can be part of the conversation for sure. So those are some of the things that you might want to do. Ask them, and then get a gauge at the end of having that short conversation: how are you feeling about it now? Do you feel any closer to wanting to do it? Do you want to do it? What else would it take? If people say, I'm waiting, ask: what would it take for you to feel comfortable? What kind of study would you need to see? That's another way into people realizing their [00:46:00] ambivalence, because they realize, like, oh, I don't know, maybe this is all the evidence we're gonna get, so I should probably think about what I want to do.

Stephanie: I was also thinking, and I know you know this, Sara, but Nicole and I were asked to give a talk to other nurses about how to combat misinformation within a clinical appointment.
And one of the things that we kind of found in different resources, I wouldn't say in evidence, was, like, not repeating the misinformation. I feel like that had a term, but I'm blanking on it.

Nicole: The backfire effect.

Stephanie: Thank you, the backfire effect. Could you talk at all about that, Sara?

Sara Gorman: Mm-hmm. The backfire effect actually refers to some research that was done by a political scientist, Brendan Nyhan, who found [00:47:00] that when you corrected people, they would kind of get more entrenched in their false beliefs. So you had to be careful about how you corrected people, or whether you corrected people. Everyone was very nervous after this finding. It turns out that it wasn't so easy to replicate that finding. So most people are aware of the backfire effect, which is also called psychological reactance, and they're careful about it, but in the field it's not something that people are so nervous about, because it doesn't seem to happen that consistently. The jury's still out about the conditions under which it actually happens, but it is reasonable to expect that if people are threatened, they'll become more believing in the opposite viewpoint. I think the other issue that you're bringing up is, what do you do if you need to repeat misinformation? [00:48:00] And I think there isn't a clear evidence-based answer about how much repetition is safe or dangerous. It's definitely true that repetition reinforces ideas. So part of the reason misinformation is so much more powerful than the correct information is that the people who wanna spread misinformation repeat it over and over and over and over again, whereas the health authorities don't do that with the correct information, and so it kind of gets lost. So you do have to be careful. I often tell people you can repeat some of it once; you wanna do it in a very nondescript, unexciting manner so that you don't add to the drama of it. It's better if you don't, but if you need to repeat it to make a point or to refute it, you can mention it once. Just try not to start or end with it, and, [00:49:00] like I said, try to make it sort of unexciting.

Nicole: I don't know if we wanna go down this rabbit hole, but I'm gonna ask it anyways. I think a lot of us got caught in this, especially thinking pandemic-wise, on Twitter, on Facebook: getting in arguments with people and trying to say, no, this isn't the case. Have you found, with all of your research, an effective way to engage with people in the online space, or do you just not do it? I know we talked before about the prefrontal cortex and all that, but is there a way, like Dear Pandemic does, or even personally, if I wanted to put some information out and be like, hey, just want you folks to know this, is there a way to craft that so that people are receptive to it?

Sara Gorman: That's a good question. [00:50:00] I would mention that most of our work at Critica is focused on the online environment and utilizing interventions to combat misinformation.
So we do actually go into comment feeds on popular articles where there is misinformation, and we train people in our protocol, which I can describe a little bit, to respond to people. And we've had success. We have actually had people tell us that they had a change in attitude, or even behavioral intentions have changed in some cases, and we've definitely defused people who were getting worked up. I definitely count that as a success as well. And we're still analyzing the data from some of our interventions, but we train people in a lot of what I talked about earlier when I mentioned the clinician conversation, with empathy being sort of the key factor. We train people in a strategy called motivational interviewing, which was developed in the substance abuse [00:51:00] area and has been adapted for many different behavior change techniques in health. And basically what it dictates is that you be very empathic, that you try to establish common ground with the person, which can sometimes come from telling them, if it's true, you know, I was a little unsure about the vaccine to begin with, or I had questions too, and this is what I learned, that sort of thing, and that you draw out their ambivalence. So where are there places where they might feel like, I'm not sure what I believe, or maybe there's part of me that wants to get the vaccine? And also focusing on those values again: what are you actually trying to do with these beliefs? What purpose is this serving in your life? And through that process, people will ask a lot of questions to get a sense of where this belief came from. And part of what we're doing is not just interacting with the person [00:52:00] who said something that was misinformed; all the people who might be reading that chain are also being impacted by what they're seeing us say. And we do provide facts, we do provide the correct information, like we were just talking about, but the focus is really not just on righting the wrongs, but on understanding the psychology and the motivations that are going into these beliefs and how we might be able to move them.

Nicole: Oh, thank you for that.

Stephanie: That sounds super interesting.

Nicole: All right. So coming back out of the rabbit hole, what teasers can you give us about what you are finding as you research your current book, examining the relationship between healthcare access and mistrust and the rise of conspiracy theories in the medical sphere?

Sara Gorman: The main argument of the book is that the inadequacies of access to healthcare and the social safety net in this country have pushed people toward mistrust [00:53:00] in health and public health and toward conspiracy theories. So there is an actual underlying structural issue, not just individual psychology or group dynamics, creating this crisis of trust and conspiracy theories that we have. I also did a ton of in-depth interviews with people who self-identified as having family members with high levels of mistrust and/or conspiracy theories about health. And there are a number of themes that came out of that. I won't spoil them all, but one thing I will say, because what I'm describing in the book can at times be a little bit depressing, is that people are doing interventions with their family members that are really quite remarkable. It doesn't always work with the people that we're talking about.
There were some tragic stories I heard about family members dying, largely in part because they wouldn't accept medical treatment [00:54:00] or the evidence around certain health interventions. But on the upside, there are people who are telling me that they're constantly trying new things, and some of them have had success at times. And one of the other things that I've noticed is that even among some of these people who their family members would describe as entrenched in their viewpoints and very extreme, when I got more into questioning, there were always moments that people would talk about where their family member would soften on their ideas, or they would start to come around a little bit to what the other person was saying. So there's actually a lot of hope there. I think that we need to make these stories more visible, which is in part what I'm trying to do, and we need to systematically test some of these things that people are trying in informal ways. But there is definitely a message of hope there, which is that people have good instincts about how they can intervene on some of these [00:55:00] problems, and there are moments where it works, even if it doesn't totally change the outcome.

Nicole: Stephanie, I don't know if you've read this book, but I feel like there's a lot in what you've been saying through this whole conversation that reminds me of the book Think Again by Adam Grant. Have you read that one, Sara?

Sara Gorman: I probably...

Nicole: You probably have, cuz it seems like it would be very much in your wheelhouse. And one of the theories he talks about too, and maybe you saw this in your research, is contact, like, you know, having that repeated contact and exposure and how that can make a difference. So I wonder if that's kind of what you're talking about with these people who had these moments of softness. Was it because of that repeated contact, repeated exposure?

Sara Gorman: I think part of it. I mean, people sometimes would give up, but I would also encourage them, you know, I'd say, it sounds like you made some progress at times. And there's a tremendous amount of resilience here, where people do keep trying because it's their family member and they care. And I [00:56:00] think what happens is, part of it is the messenger is your family member, so that's easier for people to swallow. And some of it is just, I think it's exhausting to hold on to some of these beliefs for so long. And all of these people had risk factors: a lot of them were very isolated, a lot of them had lost their jobs, which is actually a key factor in this. But I would say that they would have these persistent family members, and like I said, the outcome wasn't always great, but there's definitely evidence there that these things are doing something. And I think that's important, because most people I talk to about this are very frustrated, just in general, and unsure of their ability to make a difference with anyone ever. And so they just stop trying, and that I think is what we need to try to...

Stephanie: Yeah. It's really sad when you think about the devastating consequences that misinformation or [00:57:00] disinformation can lead to. I mean, that's exactly what you're saying in your first book: you're denying to the grave.
Sara Gorman: Yeah.

Stephanie: I'm just shocked, too, at the people who leave their jobs because they believe some sort of misinformation. So, yeah, to have, like, a near-death or, you know, death experience is really shocking to me, and sad. So we always ask this of our guests: what is one thing that you would like all of our listeners to know about your topic, in this case misinformation?

Sara Gorman: Can I say two things?

Nicole: I was gonna say, or top three. I feel like it always is more than one, so that's totally okay.

Sara Gorman: I think one thing I would say for sure is that misinformation has real consequences, in any area, but especially in health. And I think this is hard for people to actually conceptualize sometimes, even though it might sound obvious, but there are [00:58:00] studies that show that people who share or attend to misinformation more are less likely to do things like mask, get vaccines, or social distance for COVID. There are real behavioral outcomes of engaging with misinformation. So it's not just something that happens in conversation or online; it has real consequences in terms of people's actual livelihood, and that's very important. The other thing I would say is that it's important to recognize that this is a multifaceted problem. So it's not just about individual people needing to get better news literacy or have a better sense of whether a source is credible. It's an extremely complex, systemic issue that involves regulation of social media by the government; the social safety net, which is in many, many ways inadequate in our country; ensuring better access to healthcare; dealing with some of the economic problems we have in our country; [00:59:00] and understanding how doctors can deal with, or actually help, people who are very isolated. And like I said before, it's not just, what am I consuming at this moment; it's, what is my information environment? So what kinds of sources do I have access to? Some things you have to pay for, so that's something to think about; not everyone has access to those sources. And do I have access to local media? There are many places in the country where there are media deserts, where people just have nothing, and they're really just relying on online communications to get information. And then of course there are people who are victimized by disinformation more than others. So I would say just understanding this entire environment, the systemic issues, and how complex misinformation really is. It's not just about the incorrect information that gets spread. It's about all of the factors that allow it to be taken up by people, allow it to be [01:00:00] spread, and allow it to have an influence on people's behaviors and their health.

Nicole: All right, so Sara, where can folks go to learn more about misinformation and how to combat misinformation?

Sara Gorman: There are a number of good sources that I like that either publish things on their blogs about misinformation or just have good research going on about misinformation that you could read about. One is called MediaWell, which is a product of the Social Science Research Council. It's about a number of different things around media issues, but there's a heavy focus on misinformation, and they have expert commentaries. And so it allows you to see what people in the field are thinking about and dealing with. There's also the Shorenstein Center; you could just check out their website.
They also have helpful information about misinformation and technology. There's actually another podcast, called Big Tech, that isn't always about information or misinformation; it's about [01:01:00] technology. But because so many of these issues are embedded in technology issues, the person who makes that podcast just happens to talk about misinformation a lot. So it's a good place to check out to see if there's anything on the topic. And then I would also say there's a scholar called Claire Wardle, W-A-R-D-L-E, who has a lab at Brown called the Information Futures Lab. There are all kinds of exciting projects going on there that you could read about, on disinformation and misinformation and what I've been talking about with the information environment, and her team is really on the cutting edge of some of those things as well.

Nicole: And can you share with our listeners where they can find Critica, maybe how to spell that, and how they can find you and the work you're doing?

Sara Gorman: Our website is criticascience.org, so that's C-R-I-T-I-C-A science dot org. And you can also find [01:02:00] us on Twitter at critica underscore life. We're also on Facebook, and you can sign up on our website to receive our commentaries and our newsletter right in your inbox.

Stephanie: Sara, I would personally like to thank you so much for your time and commitment to advancing sexual and reproductive healthcare through communication. Do you have any last thoughts that you would like to add before we end?

Sara Gorman: I think the last thing I would say is that if you are someone out there who's been dealing with misinformation on your own social media channels, there are a couple of things I would consider. One is, don't give up, because you never know. Not only are you potentially impacting the person you're interacting with, but there might be other people reading what you write who might be affected by it and might make a better health decision because of it. So don't give up. And by the same token, take care of yourself. Make sure you're not getting burned out or too upset; some of these interactions can be [01:03:00] difficult. Look after yourself first, but your time and your work on this is always valuable. So keep going.

Stephanie: That's perfect.

Nicole: Yeah, I appreciate you looping that in, cuz usually we do ask a question about, you know, if you're not a clinician, but I feel like we kinda covered a little bit of both. So thank you for adding that extra piece there. Thank you so much, Sara.

Stephanie: Yeah. Thank you.