The government agency responsible for tracking down contaminated peanut butter and defective pacemakers is taking on a new health hazard: online misinformation.
It’s an unlikely role for the Food and Drug Administration, a sprawling, century-old bureaucracy that for decades directed most of its communications toward doctors and corporations.
But FDA Commissioner Dr. Robert Califf has spent the last year warning that growing “distortions and half-truths” surrounding vaccines and other medical products are now “a leading cause of death in America.”
“Almost no one should be dying of COVID in the U.S. today,” Califf told The Associated Press, noting the government’s distribution of free vaccines and antiviral medications. “People who are denying themselves that opportunity are dying because they’re misinformed.”
Califf, who first led the agency under President Barack Obama, said the FDA could once rely on a few communication channels to reach Americans.
“We’re now in a 24/7 sea of information without a user guide for people out there in society,” Califf said. “So this requires us to change the way we communicate.”
The FDA’s answer? Short YouTube videos, long Twitter threads and other online postings debunking medical misinformation, including bogus COVID-19 remedies like ivermectin, the anti-parasite drug intended for farm animals. “Hold your horses y’all. Ivermectin may be trending, but it still isn’t authorized or approved to treat COVID-19,” the FDA told its 500,000 Twitter followers in April.
On Instagram, FDA memes referencing Scooby-Doo and SpongeBob urge Americans to get boosted and ignore misinformation, alongside staid agency postings about the arrival of National Handwashing Awareness Week.
The AP asked more than a half-dozen health communication experts about the FDA’s fledgling effort. They said it mostly reflects the latest science on combating misinformation, but they also questioned whether it’s reaching enough people to have an impact—and whether separate FDA controversies are undercutting the agency’s credibility.
“The question I start with is, ‘Are you a trusted messenger or not?’” said Dr. Seema Yasmin, a Stanford University professor who studies medical misinformation and trains health officials in responding to it. “In the context of FDA, we can highlight multiple incidents which have damaged the credibility of the agency and deepened distrust of its scientific decisions.”
In the last two years, the FDA has come under fire for its controversial approval of an unproven Alzheimer’s drug, as well as its delayed response to contamination at a baby formula plant, which contributed to a national supply shortage.
Meanwhile, the agency’s approach to booster vaccinations has been criticized by some of its top vaccine scientists and advisers.
“It’s not fair, but it doesn’t take too many negative stories to unravel the public’s trust,” said Georgetown University’s Leticia Bode, who studies political communication and misinformation.
About a quarter of Americans said they have “a lot” of trust in the FDA’s handling of COVID-19, according to a survey conducted last year by University of Pennsylvania researchers, while less than half said they have “some trust.”
“The FDA’s word is still one of the most highly regarded pieces of information people want to see,” said Califf, who was confirmed to his second stint leading the FDA last February.
As commissioner he is trying to tackle a host of issues, including restructuring the agency’s food safety program and more aggressively deploying FDA scientists to explain vaccine decisions in the media.
The array of challenges before the FDA raises questions about the new focus on misinformation. And Califf acknowledges the limits of what his agency can accomplish.
“Anyone who thinks the government’s going to solve this problem alone is deluding themselves,” he said. “We need a vast network of knowledgeable people who devote part of their day to combating misinformation.”
Georgetown’s Bode said the agency is “moving in the right direction” on misinformation, particularly with its “Just a Minute” series of fact-checking videos, which feature the FDA’s vaccine chief, Dr. Peter Marks, succinctly addressing a single COVID-19 myth or topic.
But how many people are seeing them?
“FDA’s YouTube videos have a minuscule audience,” said Brendan Nyhan, who studies medical misinformation at Dartmouth College. The people watching FDA videos “are not the people we typically think about when we think about misinformation.”
Research by Nyhan and his colleagues suggests that fact-checking COVID-19 myths briefly dispels false beliefs, but the effects are “ephemeral.” Nyhan and other researchers noted the most trusted medical information source for most Americans is their doctor, not the government.
Even if the audience for the FDA’s work is small, experts in online analytics say it may be having a bigger impact than its viewership suggests.
An FDA page dubbed “Rumor Control” debunks a long list of false claims about vaccines, such as that they contain pesticides. A Google search for “vaccines” and “pesticides” brings up the FDA’s debunking as a top result, because the search engine prioritizes credible websites.
“Because the FDA puts that information on its website, it will actually crowd out the misinformation from the top 10 or 20 Google results,” said David Lazer, a political and computer scientist at Northeastern University.
Perhaps the most promising approach to fighting misinformation is also the toughest to execute: introduce people to emerging misinformation and explain why it’s false before they encounter it elsewhere.
That technique, called “pre-bunking,” presents challenges for large government agencies.
“Is the FDA nimble enough to have a detection system for misinformation and then quickly put out pre-bunking information within hours or days?” Lazer asked.
Califf said the FDA tracks new misinformation trends online and quickly decides whether—and when—to intervene.
“Sometimes calling attention to an issue can make it worse,” he noted.
Other communication challenges are baked into how the FDA operates. For instance, the agency consults an independent panel of vaccine specialists on major decisions about COVID-19 shots, considered a key step in fostering trust in the process.
But some of those experts have disagreed on who should receive COVID-19 vaccine boosters or how strong the evidence is for their use, particularly among younger people.
The FDA then largely relies on news media to translate those debates and its final decisions, which are often laden with scientific jargon.
The result has been “utter confusion” about the latest round of COVID-19 boosters, said Lawrence Gostin, a public health specialist at Georgetown.
“If you’re trying to counteract misinformation on social media, your first job is to clarify, simplify and explain things in an understandable way to the lay public,” said Gostin. “I don’t think anyone could say that FDA has done a good job with that.”
© 2023 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.