As Meta’s Oversight Board considers the company’s content moderation policies and enforcement practices around election-related misinformation, Media Matters has submitted a public comment asking Meta to bolster its content moderation process and enforce it more robustly during the 2024 election cycle.
Meta has a responsibility to protect users from false information about elections and voter fraud, but the company has repeatedly failed to stop election misinformation from proliferating on its platforms. Notably, Meta failed to stop election misinformation before, during, and after the 2020 election. Because its ban on “Stop the Steal” content was narrow in scope, the company also failed to adequately enforce it following the deadly insurrection on January 6, 2021. Meta likewise failed to stop the spread of election-related misinformation during Brazil’s 2022 presidential election, which led to an attempted coup by far-right extremists.
As the board considers two cases involving Meta’s content decisions about Facebook posts that misrepresented information from the Australian Electoral Commission ahead of a national referendum, as well as the broader harms of election misinformation on Meta’s platforms, it should note that threats to democratic institutions and false information about the integrity of elections are growing international problems.
New data from Media Matters shows that election misinformation on Facebook remains a dangerous problem. From January 1, 2022, through February 12, 2024, right-leaning Facebook pages that regularly post about U.S. news and politics used election fraud-related keywords in at least 6,507 posts, collectively earning at least 6,626,042 total interactions.
Among right-leaning pages’ posts containing election fraud-related keywords, the one that earned the most total interactions came from Daily Wire Editor Emeritus Ben Shapiro. In the post, Shapiro promoted election denier Dinesh D’Souza’s debunked election fraud “documentary” 2000 Mules. The post earned at least 124,000 likes and 10,000 comments, and it was shared at least 17,000 times. Facebook labeled the post as containing “partly false information.”
Read Media Matters' full public comment here: