Meta Moves To Back Off Removing Covid Misinformation From Platforms

Meta took a step Tuesday toward abandoning its policy of removing misinformation about Covid from its platforms.

The company, which owns Facebook and Instagram, is asking its Oversight Board for an advisory opinion on whether measures taken to squash dangerous Covid-19 misinformation should continue or be modified.

In an online posting, Meta’s president for global affairs, Nick Clegg, explained that the company’s harmful information policies were expanded at the beginning of the pandemic in 2020 to remove entire categories of false claims on a worldwide scale. Prior to that time, content was removed from Meta’s platforms only if it contributed to a risk of imminent physical harm.

“As a result,” Clegg wrote, “Meta has removed Covid-19 misinformation on an unprecedented scale. Globally, more than 25 million pieces of content have been removed since the start of the pandemic.”

However, Meta is suggesting it may be time for a change in its Covid misinformation policy.

“We are requesting an advisory opinion from the Oversight Board on whether Meta’s current measures to address Covid-19 misinformation under our harmful health misinformation policy continue to be appropriate or whether we should address this misinformation through other means, like labeling or demoting it either directly or through our third-party fact-checking program,” Clegg noted.

Fading Emergency

Meta’s Covid misinformation policies were adopted during a state of emergency that demanded drastic measures, explained Will Duffield, a policy analyst with the Cato Institute, a Washington, D.C., think tank whose vice president, John Samples, sits on the Oversight Board. “Now, three years later, the sense of emergency has faded,” he told TechNewsWorld.

“There’s a lot more health information out there,” he said. “If people believe ridiculous things about vaccines or the efficacy of certain cures, that’s more on them now and less a result of a mixed-up information environment where people don’t know what’s true yet.”

“It was an unprecedented step to hand the policy over to global health organizations and local health authorities,” he added. “At some point, some of that had to be clawed back. You can’t have a state of emergency that lasts forever, so this is an attempt to begin unwinding the process.”

Global Repercussions

Is the unwinding process beginning too soon?

“In the developed world, vaccinations are almost universal. As a result, while caseloads remain high, the number of serious illnesses and deaths is quite low,” noted Dan Kennedy, a professor of journalism at Northeastern University in Boston.

“But in parts of the world where Facebook is a bigger deal than it is in the U.S., the emergency isn’t close to being over,” he told TechNewsWorld.

“While many countries are taking steps to return to a more normal life, that doesn’t mean the pandemic is over,” added Beth Hoffman, a postdoctoral researcher in the Department of Behavioral and Community Health Sciences at the University of Pittsburgh’s School of Public Health.

“A big concern is that removing the current policy will particularly harm areas of the globe with lower vaccination rates and fewer resources to respond to a surge in cases or new variants,” she told TechNewsWorld.

Clegg acknowledged the global ramifications of any policy changes Meta might make. “It is important that any policy Meta implements be appropriate for the full range of circumstances countries find themselves in,” he wrote.

Line in the Sand

Meta wants to draw a line in the sand, maintained Karen Kovacs North, director of the Annenberg Program on Online Communities at the University of Southern California. “Their point is that there is no imminent physical harm in the same way there was at the beginning of the pandemic,” she told TechNewsWorld.

“They don’t want to set a precedent for taking stringent action if there is no imminent physical harm,” she added.

Clegg noted in his posting that Meta is fundamentally committed to free expression and believes its apps are an important way for people to make their voices heard.

“But resolving the inherent tensions between free expression and safety isn’t easy, especially when confronted with unprecedented and fast-moving challenges, as we have been in the pandemic,” he continued.

“That’s why we are seeking the advice of the Oversight Board in this case,” he wrote. “Its guidance will also help us respond to future public health emergencies.”

Meta says it wants to balance free speech with curbing the spread of misinformation, so it makes sense that it would revisit its Covid policy, asserted Mike Horning, an associate professor of multimedia journalism at Virginia Tech.

“While they seem to remain concerned about misinformation, it’s also good to see that they are concerned with how the policy might impact free speech,” he told TechNewsWorld.

Backlash From Content Removal

Pulling back on removing Covid misinformation could improve Meta’s image among some of its users, noted Horning. “The removal policy can be effective in slowing the spread of misinformation, but it also can create new problems,” he said.

“When people have their posts taken down, more conspiracy-minded individuals see that as confirmation that Meta is trying to suppress certain information,” he continued. “So while removing content can limit the number of people who see misinformation, it also leads some to see the company as unfair or biased.”

Removing Covid misinformation may also be past its expiration date. “One study found that when the Covid misinformation controls were first implemented, distribution of misinformation was reduced by 30%,” Duffield said.

“Over time, misinformation peddlers shifted to talking about other conspiracy theories or found coded ways to talk about Covid and Covid skepticism,” he continued. “So initially, it had an impact, but that impact waned over time.”

North noted that some methods for controlling misinformation may appear weak but can be more effective than removing content. “Removing content can be like whack-a-mole. Content gets removed, so people try to post it in a different way to trick the algorithm,” she explained.

“When you de-index it or reduce its exposure,” she continued, “it’s much harder for a poster to know how much exposure it’s getting, so it can be very effective.”

Profiting Off Misinformation

While Meta declares the noblest of motives for changing its Covid misinformation policy, there could be some bottom-line concerns influencing the move, too.

“Content moderation is a burden for these companies,” observed Vincent Raynauld, an associate professor in the department of communication studies at Emerson College in Boston.

“Whenever you remove content from your platform, there’s a cost associated with that,” he told TechNewsWorld. “When you leave the content up, you’re likely to get more content creation and engagement with that content.”

“There are lots of studies that show misinformation tends to generate a lot of engagement, and for these companies, user engagement is money,” he said.

John P. Mello Jr.

John P. Mello Jr. has been an ECT News Network reporter since 2003. His areas of focus include cybersecurity, IT issues, privacy, e-commerce, social media, artificial intelligence, big data and consumer electronics. He has written and edited for numerous publications, including the Boston Business Journal, the Boston Phoenix, Megapixel.Net and Government Security News.
