A new report from The New York Times reveals that Facebook wants to limit journalists' and researchers' access to CrowdTangle, a data analytics tool that allows users to search public social media posts. Facebook's concern over the tool -- and the data it provides researchers -- is just the latest example of the company prioritizing its public image over data transparency or solutions to the platform's rampant misinformation problem.
In a July 14 article, New York Times technology columnist Kevin Roose detailed an “internal battle over data transparency” at Facebook, centering on CrowdTangle -- a search tool acquired by Facebook in 2016 that lets researchers examine engagement on public Facebook and Instagram posts. (The tool also offers Twitter and Reddit data.) CrowdTangle is one of the few tools available that let researchers peer behind the curtain and see what's gaining traction on Facebook. But much to Facebook's likely chagrin, what those researchers have found is that right-leaning pages (and misinformation) consistently dominate engagement on the platform. These repeated findings have not been good for Facebook's public image.
> The question of what to do about CrowdTangle has vexed some of Facebook’s top executives for months, according to interviews with more than a dozen current and former Facebook employees, as well as internal emails and posts.
>
> …
>
> [Facebook Vice President Brian] Boland, who oversaw CrowdTangle as well as other Facebook transparency efforts, said the tool fell out of favor with influential Facebook executives around the time of last year’s presidential election, when journalists and researchers used it to show that pro-Trump commentators were spreading misinformation and hyperpartisan commentary with stunning success.
>
> “People were enthusiastic about the transparency CrowdTangle provided until it became a problem and created press cycles Facebook didn’t like,” he said. “Then, the tone at the executive level changed.”
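The kind of analysis Boland describes was possible because CrowdTangle exposes engagement data not only through a dashboard but also through an API. As a rough illustration only -- the endpoint, parameter names, and response fields below follow CrowdTangle's published API documentation rather than anything in the Times report, and a dashboard-issued token is required -- a researcher's daily “top posts” query might look like this minimal Python sketch:

```python
# A rough sketch (not drawn from the Times report) of the kind of query
# journalists and researchers ran against CrowdTangle. Assumptions:
# you hold a CrowdTangle dashboard API token, and the public /posts
# endpoint and its documented parameters (token, sortBy, startDate,
# count) behave as described in CrowdTangle's API documentation.
import requests

API_TOKEN = "YOUR_CROWDTANGLE_TOKEN"  # placeholder; issued per dashboard


def top_posts(sort_by="overperforming", start_date="2021-07-01", count=10):
    """Return public posts from the lists tracked in the dashboard,
    ranked by how far they exceed each page's typical engagement."""
    resp = requests.get(
        "https://api.crowdtangle.com/posts",
        params={
            "token": API_TOKEN,
            "sortBy": sort_by,        # or "total_interactions" for raw volume
            "startDate": start_date,  # YYYY-MM-DD
            "count": count,           # the API caps results per request
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["result"]["posts"]


if __name__ == "__main__":
    for post in top_posts():
        actual = post["statistics"]["actual"]  # per-post engagement counts
        total = sum(actual.get(k, 0)
                    for k in ("likeCount", "shareCount", "commentCount"))
        print(f"{total:>9,}  {post['account']['name']}  {post.get('postUrl', '')}")
```

The daily “top 10” engagement rankings that journalists circulated during the election were built from exactly this kind of query, which is why any move to restrict the tool would be felt immediately in newsrooms.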
Though the Times frames the discussions about CrowdTangle as an internal debate, it seems likely that any action will be motivated primarily by a desire to protect the company's public image -- particularly given Facebook's track record of prioritizing public relations over solutions.
As Media Matters and others have documented, Facebook has repeatedly prioritized PR over transparency and avoided making real changes to address significant problems on the platform:
- Facebook’s algorithms are so opaque that their impact -- and the impact of subsequent changes to them -- can’t be quantitatively evaluated by outside researchers. In fact, after Facebook touted a tweak to its News Feed algorithm in 2018 to prioritize “time well spent” and “meaningful interactions,” a 2019 study showed that sharing of divisive articles actually increased.
- Facebook often does not provide details about changes to its algorithms. After the 2020 election, CEO Mark Zuckerberg reportedly authorized a change to the News Feed algorithm to increase the visibility of so-called “mainstream publishers” and reduce the visibility of ideologically aligned pages. The change, which was reported only weeks later, didn't have the stated effect and was quickly reversed.
- Facebook ignored its own internal data showing that its News Feed algorithm was fomenting political polarization, while claiming to “help communities connect.”
- Facebook's opaque content moderation practices, such as its strike system for accounts that publish misinformation, have reportedly allowed the platform to give preferential treatment to right-wing accounts that frequently create and share this type of content.
- Facebook withholds data from researchers. In April, a leaked internal memo revealed that Facebook had a trove of data indicating that the January 6 insurrection was being coordinated on its platform. The company internally tracks signals such as overlapping group memberships and frequent contributors, but it has repeatedly refused to expand researchers' access to that data.
- While Facebook regularly publicizes minor policy updates as a way of suggesting progress, it also hides behind intentionally vague and ineffective content moderation policies to shield itself from accountability.
- The company has lied about the extent of misinformation on the platform and, instead of meaningfully addressing misinformation, spent $130 million to create an oversight board that is an ineffective distraction.
- Facebook has also released questionable metrics, allegedly inflating video engagement statistics in ways that cost hundreds of journalists their jobs, and it requires reporters to jump through hoops to get access to information.
CrowdTangle is not a panacea, but it is one of very few tools researchers have for studying Facebook and Instagram posts. Reporting on Facebook trends would be nearly impossible without it, and restricting access to CrowdTangle would be a major blow to misinformation researchers.
It is unsurprising that Facebook -- a company that has spent years denying responsibility for spreading misinformation, despite all evidence -- would be hostile toward a tool that shows just how bad things are on the platform. Facebook's priority is preserving its public image and continuing to attract advertisers, and CrowdTangle no longer serves that purpose -- showing that the company once again chooses PR spin over transparency and accountability.