Meta is allowing users to essentially opt out of part of its fact-checking program, which aims to “fight the spread of misinformation and provide people with more reliable information.” Previously, Meta automatically limited the visibility of fact-checked content in feeds, but users can now opt out of that misinformation-reduction measure.
This is the latest in a series of moderation-policy backslides from Meta and other major social media companies ahead of the 2024 election.
As noted in an August 25 report by The Washington Post, which highlighted how social media platforms are “surrendering to disinformation,” “Meta has quietly begun offering users new controls to opt out of the fact-checking program, allowing debunked posts … to spread in participants’ news-feeds with a warning label.” By default, Meta reduces the visibility of content that has been fact-checked as false in users' feeds, but users can now change that default and select a “don't reduce” option for false content.
These recent changes in the company’s approach to content moderation indicate that Meta is taking a less aggressive approach to securing its platforms against misinformation ahead of the 2024 U.S. elections, even though Meta spokeswoman Erin McPike told The Washington Post that “protecting the U.S. 2024 elections is one of our top priorities, and our integrity efforts continue to lead the industry.”
Since June 2022, Meta has proposed rolling back its COVID-19 misinformation policies; ended former President Donald Trump’s suspension from its platforms, which he soon after used to post misinformation about his Manhattan indictment; and laid off employees focused on security, privacy, and integrity.
Meta Global Affairs President Nick Clegg has also said that he hopes that “over time we’ll have less of a discussion about what our big crude algorithmic choices are and more about whether you guys feel that the individual controls we’re giving you on Threads feel meaningful to you.”
Meta’s failure to prevent election lies from spreading on its platforms around the 2020 presidential election, and the role the company played in allowing its platforms to be used to help organize the January 6 insurrection, have been well-documented, as has its inability to control election misinformation in the years since.
After years of promises and claims of success around controlling the spread of misinformation, Meta is now seemingly capitulating to conservative complaints of censorship, a narrative that has been debunked numerous times.