This morning, the Senate intelligence committee questioned Facebook COO Sheryl Sandberg and Twitter CEO Jack Dorsey on Russian interference in the 2016 election. The hearing was the culmination of the committee’s two-year investigation into that interference and Congress’ best opportunity to publicly hold Facebook and Twitter accountable for allowing Russian operatives to game their platforms and target Americans with propaganda. As Angelo Carusone said earlier: “The tech industry’s failure to grapple with its roles in allowing -- and sometimes even enabling -- the fake news crisis and foreign interference in American elections is a national security crisis.” Today, Americans had the opportunity to hear directly from Sandberg and Dorsey what Facebook and Twitter have done to protect them since 2016.
The first time tech executives from Facebook, Twitter, and Google testified before the Senate intelligence committee last year, committee members took a hostile posture. Committee chair Richard Burr (R-NC) and vice chair Mark Warner (D-VA) both scolded the companies for not taking seriously election interference or the fact that their platforms had been weaponized by foreign propagandists. At one point, Warner, frustrated by how little the tech companies claimed to know about what was happening on their own platforms, said, “Candidly, your companies know more about Americans, in many ways, than the United States government does. The idea that you had no idea any of this was happening strains my credibility.”
Ten months later, as I watched Dorsey and Sandberg testify before the committee, it felt like relations had thawed -- perhaps not with Google, which refused to send its CEO and was instead represented by an empty chair, but certainly with Facebook and Twitter. Members of the committee continued to ask tough questions and press Dorsey and Sandberg when they weren’t forthcoming, but the atmosphere had changed. I got the sense that after nearly a year of conversations and hearings, the working relationship is perhaps in a better place.
Of course, the tech companies have taken a beating in the press since that first hearing. We’ve since learned that Russian trolls got tens of thousands of Americans to RSVP for actual local events via Facebook. Americans have now seen the thousands of ads and pieces of organic content that Russian propagandists deployed on Facebook. Conspiracy theories about the Parkland shooting survivors, most of whom were still minors, spread like wildfire on social media. News broke that Cambridge Analytica had harvested the data of at least 50 million Facebook users. Russia is still interfering in our political conversation, and Iran is now gaming the platforms as well.
This morning’s hearing was probably the last time we’ll hear from the tech companies or the committee before the midterm election. Here’s what we’ve learned (and what we still don’t know):
Promises made, promises kept?
Facebook and Twitter made a lot of promises to the committee at the 2017 hearing. Both companies promised to change their ad policies, enhance user safety, build better teams and tools to curb malicious activity, collaborate better with law enforcement and with one another, and communicate more transparently with the public.
How’d they do?
- Updated ads policy. Both Facebook and Twitter have announced new political and issue ad policies, and both have announced their support for the Honest Ads Act. During the hearing, Sen. Ron Wyden (D-OR) asked Facebook specifically about voter suppression ads, which both Russia and the Trump campaign used in 2016. Sandberg said that in the future this kind of targeting would not be allowed, though she didn’t specify whether she meant just foreign actors or American political campaigns as well.
- User safety. Perhaps the most telling moment of the hearing came when Sen. Martin Heinrich (D-NM) asked Sandberg about the real harm done when real people (not just fake accounts) intentionally spread conspiracy theories. Sandberg’s solution, rather than removing the incendiary content, was to have third-party fact-checkers review potentially false content (because, according to her, Facebook isn’t the arbiter of truth), mark it as false, warn users before they share it, and present them with “alternative facts.”
- Build better teams and tools to curb malicious activity. In her opening statement, Sandberg said: “We’re investing heavily in people and technology to keep our community safe and keep our service secure. This includes using artificial intelligence to help find bad content and locate bad actors. We’re shutting down fake accounts and reducing the spread of false news. We’ve put in place new ad transparency policies, ad content restrictions, and documentation requirements for political ad buyers. We’re getting better at anticipating risks and taking a broader view of our responsibilities. And we’re working closely with law enforcement and our industry peers to share information and make progress together.” Dorsey also highlighted Twitter’s progress in his opening statement, saying: “We’ve made significant progress recently on tactical solutions like identification of many forms of manipulation intending to artificially amplify information, more transparency around who buys ads and how they are targeted, and challenging suspicious logins and account creation.”
- Better collaboration with law enforcement and with one another. Committee members asked Dorsey and Sandberg about this multiple times during the hearing. Both agreed that when it came to American security, Twitter and Facebook weren’t in competition and collaborated frequently. Both also said they have good relationships with law enforcement agencies, though Dorsey complained more than once about having too many points of contact.
- Communicate more transparently with the public. Committee members pressed both Dorsey and Sandberg to be more transparent. Warner asked Dorsey if Twitter users have a right to know whether the account they’re interacting with is a bot. Dorsey agreed, adding the caveat “as far as we can detect them.” Warner suggested to Sandberg that most of Facebook’s users don’t know what data Facebook has on them or how that data is used. Further, Warner pressed Sandberg on whether users have a right to know how much their data is worth to Facebook. Wyden pointed out that data privacy is a national security issue because Russians used our own data to target us, saying, “Personal data is now the weapon of choice for political influence campaigns.” Sen. Susan Collins (R-ME) asked Dorsey if Twitter had done enough to disclose to users that they were exposed to IRA propaganda; Dorsey admitted it had not.
Questions still outstanding
For every question Sandberg and Dorsey answered during the hearing, there were plenty they couldn’t or wouldn’t answer. Most of the time, they promised to follow up with the committee, but here’s what we still don’t know and likely won’t get an answer to before the 2018 elections:
- What are the tech companies doing to prepare for “deepfake” video and audio? Sen. Angus King (I-ME) asked if the companies were prepared to combat “deepfake” video and audio, content that is digitally manipulated to look and sound convincingly real. Neither Sandberg nor Dorsey had a good answer, which is worrisome given that “deepfake” audio and video are just around the corner.
- Are the tech companies keeping an archive of suspended and removed accounts, and will they make it available to researchers and/or the general public? Both Sens. Roy Blunt (R-MO) and James Lankford (R-OK) asked about this, an important question especially for academic researchers. Neither Sandberg nor Dorsey had a clear answer.
- Can anything be done about the sale of opioids online? This question came from Sen. Joe Manchin (D-WV), who also asked Sandberg and Dorsey if their companies bore any moral responsibility for deaths caused by opioid sales on social media.
- How much did the tech companies profit from Russian propaganda? Sen. Kamala Harris (D-CA) has asked Facebook this question repeatedly during both intelligence and judiciary committee hearings. The most follow-up she’s received from Facebook is that the number is “immaterial.”
What happens next?
Burr and Warner generally close these hearings by previewing what happens next. This time there was no such preview. Given that the election is almost two months away, that’s a bit unsettling. But the reality is that with the current makeup in Congress (and the executive branch), the government isn’t going to do anything else to protect Americans. No legislation will be passed, and if social media companies are called to testify before the House again anytime soon, it will likely be another circus hearing devoted to the right’s pet issue of social media censorship. On the Senate’s part, however, holding tech companies accountable and producing reports is about as much as the intelligence committee can do right now.
Facebook, Twitter, and the absentee Google left today’s hearing with questions unresolved and problems nowhere near fixed. Beyond the Senate intelligence committee asking pertinent questions, Congress has shown no interest in holding social media companies to account for those issues that remain outstanding.