Research by psychologists and others has already pinned down several strategies that effectively counter health misinformation, including debunking false information by providing a correction and explaining why the information is false, or inoculating people against misleading claims by teaching strong critical-thinking skills. Another effective strategy is leveraging trusted sources such as religious leaders or athletes to communicate with the public about health.
Those insights are being increasingly applied, including by the CDC, as it aims to regain widespread public trust following the Covid-19 pandemic. In 2022, one study found that 54% of U.S. adults said they trust their doctors a great deal, but only 37% said the same about recommendations from the CDC (Pollard, M. S., & Davis, L. M., Rand Health Quarterly, Vol. 9, No. 3, 2022).
The CDC’s new approach involves using AI to monitor viral online health misinformation on topics such as vaccines, emergencies, and drug overdoses. Then, the agency aims to debunk or prebunk incorrect content as quickly as possible, offering alternative explanations when available.
In one very public example, when rapper Nicki Minaj tweeted that her cousin’s friend became impotent after getting a Covid-19 vaccine, triggering protests outside the CDC, the agency responded quickly on social media with a fact-check about the vaccine and fertility. During respiratory virus season, the CDC now releases weekly social media videos where CDC Director Mandy K. Cohen, MD, MPH, addresses the public’s top concerns in an approachable way.

Preliminary findings from several research groups point to other strategies that may help people resist health misinformation. Learning-based interventions seek to arm people with skills or knowledge that make them less likely to believe false information. Swire-Thompson reports testing whether discrediting a source, such as by highlighting a conflict of interest or a track record of inaccuracy, could help correct cancer and vaccine misinformation; her findings have not yet been peer-reviewed (PsyArXiv Preprints, 2024).
“We found that discrediting a source helped people resist misinformation, but track-record discreditations were less effective when the source was a media outlet, compared to a person,” she said.
Sampsel’s team at PEN America applies some of the same insights at the community level. They fund counter-disinformation efforts in Miami, Dallas–Fort Worth, and Phoenix. They also run media literacy webinars on topics such as how cognitive biases affect the way we process information online, and they provide public guidance on how to have productive conversations with family members, friends, and acquaintances about divisive topics such as public health and immigration.
Other psychological interventions aim to change online behavior—for example, reducing the amount of misinformation people share. Research led by psychologists Gordon Pennycook, PhD, of Cornell University, and David Rand, PhD, of the Massachusetts Institute of Technology, suggests that people share less misinformation when platforms nudge them to consider the accuracy of their posts, but the group’s latest findings have not yet been peer-reviewed (working paper, 2024).
Effect sizes are generally small, and some psychologists remain cautious about how applicable such findings will be. “I still haven’t seen work that reliably and over time changes behavior in exactly the intended manner—people sharing less unwanted content, more desired content, or both,” Roozenbeek said.
One intervention, the Spot the Troll Quiz, developed by psychologist Jeffrey Lees, PhD, of Princeton University, and his colleagues, did reduce participants’ total retweets on all topics for about a week and helped people learn to identify fake accounts (PNAS Nexus, Vol. 2, No. 4, 2023).
Some research even suggests that correcting health misinformation won’t change health behavior. Psychologist Dolores Albarracín, PhD, a Penn Integrates Knowledge Professor and director of the communication science division at the University of Pennsylvania’s Annenberg Public Policy Center, suggests focusing instead on interventions that do reliably change behavior, such as increasing access to health care.
“Misinformation is very salient, so people think it matters a great deal, but the impact of interventions in the information space is negligible,” said Albarracín, also a contributing author on APA’s consensus statement. In other words, time spent correcting misinformation might be better spent on those other interventions.
In a review of meta-analyses on behavior change interventions, including those targeting health behaviors, Albarracín and her colleagues found that interventions that provide general knowledge or skills training—a category that includes many psychological efforts to correct misinformation—had little to no effect on behavior.
On the other hand, approaches that improved access to care (for example, by providing free vaccines or walk-in appointments) had the largest effects on behavior. Interventions that helped people build habits (performing and automating a behavior, such as dental cleanings) or provided social support (such as pairing people up to get vaccinated together) had medium effects (Nature Reviews Psychology, 2024).
“At this point, we have enough information to say: Yes, correcting misinformation is important, because people are entitled to the truth. But in order to change behavior, do something else,” Albarracín said.