On November 1st, Facebook’s general counsel, Colin Stretch, will testify before the House and Senate intelligence committees as part of the congressional investigations into Russia’s use of social media and internet platforms to interfere with the 2016 presidential election. His testimony will likely shed more light on Russian information warfare tactics on Facebook beyond the isolated question of elections.
But the reality is that Facebook failed to fully comprehend the scale of the attack and has refused to provide a full public account of what happened. Given Facebook’s half a trillion dollars in market capitalization and its monolithic role in shaping the consumption of news, information and opinion, now is the time to demand that the company answer to the American people. Facebook users have the right to know exactly how they individually interacted with Russian propaganda in Moscow’s effort to alter their beliefs and motivate their actions. If Facebook cares about its users, it should recognize its moral responsibility to act without waiting for threats of boycotts, litigation, or legislation.
The Scale of the Problem
From the reporting of journalists and independent researchers, we know that Russian operatives made extensive use of Facebook and employed several of the tools available on the platform. For instance:
- Jonathan Albright, a researcher at Columbia University’s Tow Center for Digital Journalism, analyzed data on six of the Russian pages that have been made public (Blacktivist, United Muslims of America, Being Patriotic, Heart of Texas, Secured Borders and LGBT United) and found that their content had been “shared” 340 million times. Given that as many as 470 Russian pages have so far been identified, impressions were likely in the many billions; a rough extrapolation is sketched after this list. (Facebook later scrubbed the posts and data that made Albright’s work possible.)
- Independent journalists in Russia reporting on the operations of the St. Petersburg troll farm, the Internet Research Agency, talked to a source who claimed the agency achieved as many as 70 million page views in a single week in October 2016. The Internet Research Agency is only the best-known such operation; some estimate there are hundreds more in Russia and in former Soviet states.
- Trolls went so far as to pay American activists to use Facebook to plan dozens of events across the country intended to foment division. Activists subverted in this campaign reported being entirely unaware that they were working on behalf of foreign propagandists.
- In addition to posting pages, events and ads, Russian propagandists messaged Americans directly using Facebook Messenger.
- Facebook disclosed that known Russian agents purchased at least $100,000 in advertisements, using tools such as Custom Audiences to target specific audiences and geographies, including swing states such as Michigan and Wisconsin.
- The above summarizes only what is known to date, and more revelations seem likely. For instance, in the run-up to the French election, Facebook deleted 30,000 fake accounts; that number was later quietly revised upward to 70,000. How many fake accounts operated in the United States in 2016?
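The “many billions” figure follows from simple extrapolation. Here is a rough sketch of that arithmetic; the discount factor and the impressions-per-share multiplier are assumptions chosen for illustration, not numbers reported by Facebook or by the researchers cited above.

```python
# Back-of-the-envelope arithmetic behind "impressions were likely in the many billions".
# All multipliers below are illustrative assumptions, not reported figures.

known_shares = 340_000_000      # shares Albright measured across six known pages
known_pages = 6
identified_pages = 470          # Russian pages identified so far

# Assume the remaining pages performed far worse than the six analyzed ones,
# say only 5% as well on average (a deliberately conservative guess).
avg_shares_per_known_page = known_shares / known_pages
discount = 0.05
est_total_shares = known_shares + (identified_pages - known_pages) * avg_shares_per_known_page * discount

# Each share is typically seen by more than one person, so impressions exceed shares;
# assume a conservative two impressions per share.
impressions_per_share = 2
est_impressions = est_total_shares * impressions_per_share

print(f"{est_impressions:,.0f} impressions")  # roughly 3.3 billion under these low assumptions
```

Even with deliberately low multipliers, the estimate lands in the billions, which is the point of the claim above.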
The Right to Know What Happened to You
Facebook users deserve to know exactly how they were exposed to the tactics described above. While regulating speech on Facebook is fraught, and policing bad actors on a platform of more than 2 billion users will never be perfect, the company can prove its commitment to transparency by creating a mechanism to inform users who interacted with this propaganda.
One way to think about enshrining such a requirement is to look to “Right to Know” laws developed to inform citizens of exposure to otherwise hidden phenomena such as chemicals, pathogens and various forms of pollution. These principles are incorporated into laws at the federal, state and local levels, and there are many examples in other sectors of the economy and daily life:
- A medical facility, even when it is not to blame, may be obliged to notify you if you were exposed to a communicable disease while passing through.
- Consumers have a right to know when they have been misled by false advertising, and the proper remedy is often to notify them directly after the fact.
- If your car has a defective part, the manufacturer is obliged to notify you directly and issue a recall.
- Companies have obligations to directly notify individuals in the event of data security breaches compromising their private information.
Facebook owes its users a similar standard of care, especially since it is the only responsible party with the needed information.
It is important to recognize that the problem extends beyond people who were directly duped by Russian propaganda. Even more people were likely exposed to content shared with them by Facebook friends and through their social networks; in other words, they were all the more likely to be influenced because the propaganda reached them through personal validators. Nor was Facebook simply a passive platform in this respect: it facilitated Russia’s targeting of specific audiences and most likely channeled users toward propaganda through its automated recommendation engine.
What can Facebook do now? The company should provide a feature, similar to existing tools on the platform that let users see past interactions with friends and advertisers, giving users the ability to understand their exposure to known Russian propaganda before, during, and after the 2016 election. Alternatively, Facebook could simply inform people through a notification delivered in the course of their usual activity on the platform. Whatever the vehicle, users should be able to see how organic content and advertising from identified Russian sources appeared in their timelines, whether they interacted with that material, whether they engaged with events or received messages from propagandists, and basic metrics that quantify their overall exposure. This is not hard.
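To make the proposal concrete, here is a minimal sketch of the kind of per-user exposure summary described above. The interaction log, the flagged page identifiers, and every field name are hypothetical assumptions for illustration; they do not describe Facebook’s actual data model or any existing API.

```python
# Hypothetical sketch of a per-user "exposure report". The Interaction record,
# the FLAGGED_PAGES set, and all field names are illustrative assumptions.
from collections import Counter
from dataclasses import dataclass
from typing import List

@dataclass
class Interaction:
    user_id: str
    source_page_id: str   # page, advertiser, or account behind the content
    kind: str             # e.g. "post_view", "share", "ad_impression", "event_rsvp", "message"
    timestamp: str        # ISO 8601 date string

# Pages identified as Russian-operated (hypothetical identifiers).
FLAGGED_PAGES = {"blacktivist", "heart_of_texas", "secured_borders"}

def exposure_report(user_id: str, log: List[Interaction]) -> dict:
    """Summarize one user's recorded interactions with flagged sources."""
    hits = [i for i in log if i.user_id == user_id and i.source_page_id in FLAGGED_PAGES]
    return {
        "total_interactions": len(hits),
        "by_kind": dict(Counter(i.kind for i in hits)),          # e.g. {"ad_impression": 4, "share": 1}
        "sources_seen": sorted({i.source_page_id for i in hits}),
        "first_seen": min((i.timestamp for i in hits), default=None),
        "last_seen": max((i.timestamp for i in hits), default=None),
    }
```

As noted above, Facebook is the only party holding the needed information, so a summary along these lines would be generated from interaction data the company already retains, whether surfaced in a standalone tool or delivered as a notification.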
Tawanda Jones, the sister of an African American man who died in police custody, participated in an event instigated on Facebook by Russian propagandists using a page called Blacktivist. Jones told the Guardian that Facebook should inform users who followed fake accounts like Blacktivist.
The harm to people on Facebook is not a one-off experience confined to the presidential election. First, Russian active measures on Facebook began long before the 2016 primary season and continued long after November 8th; the issue is broader than election interference and whether Russia’s actions affected public trust in the outcome. Second, individuals’ exposure is not something left in the past: what they were told can easily shape how they think about the world they live in today. Facebook holds the keys to correcting for its own admitted mistakes in facilitating Russia’s ability to target these Americans.
The Right to Know to Prepare for the Future
It is important that we understand what happened in 2016, but it is perhaps even more important that we prepare for future attacks, whether from Russia or from other nation states that conspire to influence American voters and citizens in general. Technology platforms will evolve in response to these threats, but so will the tactics of our enemies. We know that the troll farms that drove the Russian campaign in 2016 were powered by people; new technologies will drive automation and could drastically increase the volume and effectiveness of propaganda. The worst-case scenario is a future in which massive amounts of personal data are combined with sophisticated technology to produce and deliver unaccountable, hyper-personalized media and messages at enormous scale.
The right to know is about ensuring Americans understand the nature of these attacks and how they are affected by them, so that we can craft appropriate responses. It’s about creating feedback for users of social media platforms so that they are aware when they are targeted by propagandists, raising general literacy about 21st-century information warfare tactics. It’s about giving Americans an opportunity to rethink beliefs that were shaped by Kremlin propaganda. It’s about changing the incentives of malicious actors by denying them the effects they desire. And it’s about ensuring a transparent relationship between the technology platforms that shape how we see the world and the citizens of the democracy that created the conditions for their success.
Let's get Facebook's attention with this. We almost have enough signatures. https://t.co/OM0EsERAB8
— George Takei (@GeorgeTakei) October 15, 2017
The innocent, early promise that social media and internet platforms are an unqualified positive for democratic society is clearly dead. But Facebook, the largest social media platform in the world, can lead the way toward restoring trust by championing the right to know. We hope the company will start on November 1st.