After a suspense worthy of Netflix’s best scripts, Elon Musk, CEO of Tesla and SpaceX, finalized his takeover of Twitter, thus becoming its self-described “Chief Twit.” What could have been a simple commercial operation has fueled hopes and fears for freedom of expression online ever since the deal was announced in April 2022.
Elon Musk has never hidden his ambition to save free speech online, which he says has been undermined by years of arbitrary decisions about what users may or may not express publicly. His other big promise when the deal was announced: restoring high-profile accounts banned from the platform. Much speculation surrounded the return of former President Donald Trump to Twitter. Another, equally emblematic, return also occupied minds: that of the artist, producer and creator Ye, formerly known as Kanye West, whose account had been restricted for violating the platform’s hateful conduct policy.
Far from his earlier thunderous announcements, it was a more realistic and humble Musk who promised on October 28, 2022 that the company would set up a content moderation council, responsible in particular for decisions on account reinstatements. As law professor and leading tech governance scholar Kate Klonick highlighted, Musk’s language strongly resembled that of the Oversight Board’s Trump decision. Was this a manifestation of a new Lex Platformia?
Lex Platformia is what I call the emerging set of rules that govern public expression on platforms hosting content. These rules determine the limits of that freedom (for example, what you may not publish, largely grounded in respect for other rights such as privacy and safety); they describe the moderation process; and they detail the penalties that follow from violations.
Can this set of rules end a long period of democratic anomaly, during which platforms created and applied their own rules in near-total opacity? If so, the days when social media CEOs and their teams could alone decide the fate of the expression of billions of users are over. Citizens of the “public squares,” as social media platforms are sometimes called, are demanding accountability.
One manifestation of this Lex is the very adoption of community rules or standards by platforms hosting content. From Reddit to Facebook, from Cloudflare to the App Store, each of these platforms has a set of rules that users must follow.
Another is the idea of a separation of powers between those who write the rules, those who apply them, and those who oversee the actions of the other two.
Decisions of the Meta Oversight Board, of which I am a member, play an essential role in clarifying the fundamental principles and values that must guide content moderation processes that are respectful of individual freedoms, operating as “case law” for the Lex Platformia.
The Oversight Board is an independent body of 23 members that rules on content moderation decisions made by Facebook and Instagram. Since October 2020, the Board has accepted appeals from users harmed by moderation on those platforms, as well as referrals from Meta itself. It decides individual appeals and also issues recommendations and opinions on the content policies of Meta’s two main social networks. Its decisions carry added legitimacy because they are reasoned and published transparently.
To date, the Board has issued 29 decisions and one policy advisory opinion. For some, this relatively small number reflects the limited scope of our work. Focusing instead on the quality and content of the decisions gives a better measure of their impact. Oversight Board opinions have publicly recommended significant reforms, many of which Meta has followed with meaningful steps:
- Translation of Meta’s community standards into several languages, for greater readability and clarity of the rules.
- Notification to users of the reasons their posts are removed: Meta is making profound changes to the way the platform interacts with users who break the rules.
- Transparency on moderation requests received from governments: following a Board recommendation, Meta agreed to detail in its transparency reports the moderation requests it receives from agencies directly attached to governments.
- Analysis of the risks of restoring former U.S. President Trump’s access to his Facebook and Instagram accounts: on the Board’s recommendation, Meta committed to carrying out a risk analysis before any potential restoration.
- Analysis of the impact of bias in Facebook’s and Instagram’s moderation systems in Hebrew and Arabic: in an independent review requested by the Board, Meta acknowledged in September 2022 serious malfunctions in its moderation systems that particularly affect Arabic-speaking users.
These are just a fraction of the more than 100 recommendations made to the company by the Board. Our latest transparency report provides a more detailed account.
The Board does not do this work alone. For each decision we make and each opinion we publish, we meet with users, civil society organizations, academic experts, international organizations, and other stakeholders to better understand the contours of a content moderation approach that respects our freedoms while protecting us from harm.
The Board’s impact can be measured beyond Meta’s borders. When Google announced in May 2022 that users could request the removal of their contact information from search results, it was difficult not to recall the Board’s policy advisory opinion of February 2022 on the publication of private residential information on Facebook.
When Spotify announced its Safety Advisory Council to provide third-party input on harmful content, it was an encouraging sign that the Board’s model is bearing fruit.
Building a Lex Platformia can benefit both industry and citizens: it provides the predictability and stability needed in the face of rapid, major changes in the technology industry, so that no matter the CEO or the technology (Web 3, Web 5, Web 1000, or the Metaverse), the necessary democratic safeguards remain securely in place.
With a robust Lex Platformia in place across major platforms, we would no longer have to worry, for example, about the dismantling of human rights and Trust and Safety teams just days before a major election in one of the world’s largest and oldest democracies.