Editor’s note: This article is part of a new series from leading experts with practical solutions to democratic backsliding, polarization, and political violence.

Elon Musk, the world’s richest person, is actively using X (formerly Twitter), the social media platform he bought in 2022, to promote his far-right political interests. Vittoria Elliott at WIRED writes that Musk has made the platform “his own personal political bullhorn.” Tech journalist Casey Newton says that X is now “just a political project.” 

Even before his endorsement of former President Donald Trump in July, Musk had already changed X in substantial ways that favor MAGA politics and personalities. He restored the accounts of far-right figures such as white nationalist Nick Fuentes and conspiracy theorist Alex Jones, relaxed content moderation policies, and dismantled much of the company's trust and safety and election integrity teams.

But after the endorsement, Musk's efforts on behalf of the Trump campaign appeared to surge. He launched a political action committee dedicated to Trump's reelection. He hosted a livestream with the former president, whom Musk coaxed back onto the platform after he had been banned for incitement to violence following the January 6, 2021 attack on the U.S. Capitol. According to The New York Times, Musk "hired a Republican operative with expertise in field organizing" to help direct his political activities. And according to Trump, Musk has agreed to head a "government efficiency" task force that aims to make "recommendations for drastic reforms" across the entire federal government should Trump win a second term. All the while, Musk continues to promote his preferred brand of politics through posts on his own account, which has close to 200 million followers (and frequently appears in users' feeds as recommended content whether they follow him or not).

While it may be jarring for users and others who hoped the platform might not succumb so completely to the interests of its singular owner, it is Musk's prerogative to shape X to his tastes. No social media platform is truly politically neutral. Nor is it novel for a platform's sole owner to wield such extraordinary influence over its operations, policies, and political decisions. But Musk's use of X to try to influence the 2024 U.S. election is the most extreme example to date of a person in his position using a major social media platform to pursue a specific political outcome. It is so blatant and so far outside the norm that it belongs in a class of its own.

This unique scenario offers an opportunity to consider the various ways that social media platforms could be used to provide advantages for one political campaign over another, and where the ethical and legal lines might currently be drawn. The exercise may also be useful for X staff and board members, who should think carefully in advance about the potential legal and public policy questions arising from Musk's politicking, including which actions could cross into possible violations of campaign finance and election law.

Where are the red lines? 

The operators of social media platforms regularly make design, technical, product, and policy decisions that may favor certain parties and their politics. In the past, though, they have typically done so incidentally or to protect themselves from accusations, good faith or otherwise, of political bias. For example, there are many past cases of platforms shielding conservative users from content moderation. However, we are not aware of a clear example in which the owners of another major social media platform acted to intentionally boost a specific candidate or party. 

Musk is different, so his employees would do well to be on the lookout for evidence that they are being directed to engage in political manipulation. But not all acts of political manipulation are equal. Some may violate the law, while others are likely legal even if unethical. And some may not be clearly unethical either.

Below, we present possible ways that various aspects of the platform, its policies, and its advertising products could be used to specifically advance the interests of a candidate for office. To assess where legal and ethical lines might be drawn, we asked experts on privacy, campaign finance, and election law about different forms of potential manipulation of such a social media platform. Based on their insights, we sorted these possibilities into three categories: those that are potentially illegal, those that fall into a gray zone, and those that raise serious policy questions but pose few legal risks for X and Musk.

Potentially illegal use of social media tools

Platform operators have broad First Amendment rights to determine which speech they allow, how they moderate content, and with whom they do business. Many of their decisions are also shielded from liability by Section 230 of the Communications Decency Act.

But platform operators are also restricted by Federal Election Commission (FEC) regulations, which stipulate that certain actions might be considered in-kind campaign contributions. And, of course, actions taken by corporations or individuals that encourage violence or interfere with electoral processes are also subject to laws governing voting rights and incitement.

Even so, the bar for proving a violation of these laws is often high. For FEC violations, in particular, it is typically necessary to show, with concrete evidence, that an actor coordinated directly with a campaign. Though this high bar might benefit X in its current incarnation, it also protected Twitter from conservative critics in the past. For instance, an August 2021 FEC decision found the company did not make a material contribution to the Biden campaign by briefly blocking links to a New York Post article about Hunter Biden's laptop.

There are additional considerations as well. For example, political action committees (PACs) and other non-campaign entities are not bound by the same rules as campaigns. FEC rules also carry a “press exemption,” which might apply to X. The FEC is politically deadlocked, making enforcement through it even more unlikely. And the Justice Department appears focused on less controversial conduct. Violations can be difficult to detect given the opacity of social media platforms and their internal operations. Legal proceedings – FEC or otherwise – can take months or years. By the time a case is resolved, the election could be long past. 

But even if laws and regulations go unenforced, potential violations are still a matter of public concern. Were alleged violations to come to light, they could inspire future congressional hearings, investigations, or legislation. They might also lead to pressure from advertisers or others on Musk to change. And while enforcement may take time, a well-founded criminal indictment or the opening of a formal investigation of a company could also help propel actors toward legal compliance. 

What follows are three areas that may raise legal concerns:

1. Advertising and monetization. Consider if Musk decided to offer discounted advertising rates to favored campaigns or PACs, or if he implemented stricter approval processes for disfavored political actors. Indeed, Musk's live-streamed interview with former President Trump on X is already under scrutiny, as some observers have wondered whether the Trump campaign paid to promote the event and, if so, how much it paid. X also heavily promoted the interview with a designated hashtag and in its "trending" section for users. Neither X nor the Trump campaign has provided the public with the details of their arrangement.

According to Adav Noti, executive director of the nonpartisan Campaign Legal Center (CLC), charging differential rates for campaign advertising would likely be illegal. 

2. Incitement to violence. Another scenario to contemplate is Musk, through his account or platform support for other accounts, encouraging political violence during or after the election. He could, for example, encourage militia members to stake out polling places, encourage violent demonstrations, or invoke ideas of a "civil war" (as he recently did in the United Kingdom). In terms of technical tools, one should consider scenarios in which Musk could manipulate the content recommendation algorithm or interfere with internal enforcement against such content. Of these possibilities, posts by Musk directly inciting violence or issuing what Noti called "legally cognizable threats" are the most likely to violate the law; but the bar for proving incitement to violence in court can be quite high.

[Editor’s note: Readers may be interested in Berke Gursoy, “True Threats” and the Difficulties of Prosecuting Threats Against Election Workers]

3. Interference with election processes or voting rights. Another scenario to consider is Musk making statements or using X to amplify statements violating state and federal election laws about voting rights and election interference. He has already publicly embraced conspiracy theories about election fraud by, for example, claiming that non-citizens are preparing to vote in the upcoming election. Building on these conspiracy theories, Musk could encourage protests and election "observers" (including armed militia members) to monitor polling places. Alternatively, Musk could propagate conspiracies that encourage rogue poll workers and other potential saboteurs to take actions that threaten the election's integrity. These scenarios might violate both recent and longstanding civil rights laws, such as Section 11 of the Voting Rights Act or the KKK Act, but precedent for their enforcement is evolving. For example, the pending case of United States v. Mackey concerns a man charged under the Enforcement Act of 1870 with using his social media presence to spread false information about voting procedures in a way intended to suppress voter turnout.

That case may be an exception in that social media was the mechanism; more broadly, existing laws need to be updated not just for the artificial intelligence ("AI") age, but for the internet age in general. As a case in point, prosecutors charged the man allegedly responsible for generating a deepfake robocall of President Joe Biden in New Hampshire under a law that specifically applies to robocalls. A social media post containing the same type of deepfake might not have triggered prosecution under the same law, because social media firms are not regulated in the same way as telecoms (see, for example, the Telephone Consumer Protection Act of 1991, codified at 47 U.S.C. § 227, and its definition of "automatic telephone dialing systems"; but see also the federal law prohibiting fraudulent misrepresentation of campaign authority, 52 U.S.C. § 30124, and analysis of its application to AI/deepfakes).

Legally questionable and ethically dubious

Other actions Musk might take fall into a legal gray zone, in which an argument could be made for illegality but existing precedent is unclear. Though ethically dubious, these acts might not violate existing law (or might violate it in ways that are not enforceable). Some might even argue (perhaps cynically) that these kinds of manipulations are common on both sides of the aisle in contemporary politics.

4. Recommendation algorithms and visibility of content. One potential scenario to consider is if Musk chose to alter how X delivers content to users in order to promote certain accounts or narratives over others. For example, consider if he were to use recommendation algorithms to show content promoting one candidate to more users, or conversely to show supportive messages about the candidate’s opponent to fewer users. He could also manipulate features such as trending topics or hashtags to bury content unfavorable to a candidate, boost content friendly to their campaign messaging, or do the opposite for the candidate’s opponent. The CLC’s Noti said it is possible a judge could regard manipulation of recommendation algorithms as a material contribution to a campaign, but that it is “at minimum a gray area.” It is also possible courts would find such activity to be protected speech.

5. Disinformation and synthetic content. Another scenario is if Musk chose to allow certain types of behavior to go unchecked, such as coordinated disinformation campaigns, inauthentic behavior benefiting political allies, or the spread of misleading content that would otherwise violate content policies. He has already shared a video featuring a clone of Vice President and Democratic presidential nominee Kamala Harris's voice. There is currently no federal law prohibiting misrepresentations of this sort (the federal law prohibiting fraudulent misrepresentation of campaign authority is limited in scope and applies only to conduct perpetrated by a candidate or their agents).

Senators Amy Klobuchar (D-MN) and Lisa Murkowski (R-AK) have introduced a bill requiring transparency disclosures on ads with "AI-generated content." Even if such a bill became law, though, perpetrators could argue that such deepfakes are protected as satire. The Harris video contained messages so sarcastic that it would be hard for a relatively informed voter to believe it was authentic. A number of state laws do exist, but they are new and largely untested.

6. Other forms of support. Other scenarios to contemplate include if Musk offered other valuable support to a favored political candidate, such as assigning more staff to help one campaign use the platform than its opponent receives, providing early access to new features or tools, using company resources to conduct voter outreach, or even providing direct monetary support. There are many years of precedent governing corporate contributions such as get-out-the-vote campaigns, which must be "nonpartisan." If it could be proven that X intended to benefit one campaign over another, such activity might be illegal. Other forms of support have proven controversial, but do not appear to be illegal unless they are offered on preferential terms. This was a subject of debate following revelations about the embedding of Facebook and Google staffers in the 2016 Trump campaign.

Probably legal, whether ethically right or wrong

Finally, under a number of federal laws (or the lack thereof), there are ways Musk could use X to tip the electoral scales that are almost certainly legal, even if they raise serious ethical questions. These include:

7. Interfering in content moderation. Musk could influence the visibility of certain viewpoints through selective enforcement of content moderation policies. This might also include overruling moderation decisions on high-profile posts, including those of leading candidates, or handling certain accounts differently from others, such as political influencers. Asymmetries in how moderation resources are allocated, as well as policy adjustments that favor one viewpoint over another, could result in favoring certain candidates and parties. Imagine if, for example, the United States were to see another round of protests against police violence, and X were to take a light touch approach toward hate speech and violent incitement by MAGA supporters against protesters. Recent rulings by the Supreme Court make it more likely this would be considered protected First Amendment expression, and such conduct presumably would also be protected under Section 230 of the Communications Decency Act. 

8. Targeting and user data. Manipulative behavior in this domain might include providing privileged access to user data for a preferred candidate or campaign’s targeting, unique lead generation, data sharing (including through custom APIs), data collection mechanisms, or even the selective enforcement of privacy policies to advantage or disadvantage parties. As with preferential advertising rates, one could argue this is a material contribution to a campaign. However, legal experts we spoke to said it would be easy for X to argue in court that this was a business decision to help deliver content of interest to its user base. It would take exceptionally compelling evidence to prove X had coordinated with the Trump campaign to provide a material contribution in violation of FEC rules.

9. Marketing and messaging channels. Musk could choose to utilize X’s own communications mechanisms, such as emails or in-app notifications, to issue announcements or selectively communicate to certain targeted parties. Or, Musk might privilege certain accounts, as he appears to have done for himself after firing an engineer because he felt engagement with his account was too low. Craig Holman, an expert on campaign finance issues at the watchdog group Public Citizen, told us this would probably not run afoul of federal election law unless coordination with a campaign could be proven.

10. Search and AI-generated content. Musk could manipulate other platform functions to favor one partisan side over another. These functions range from standard search tools to newer generative AI tools, and include automatically generated snippets as well as information in knowledge panels, the boxes that appear when a user searches for people, places, organizations, or other things that are in Google's Knowledge Graph. Musk's AI chatbot, Grok, has already been criticized for falsely claiming Harris is ineligible for the ballot in many states — leading five secretaries of state to send Musk a letter asking him to implement better guardrails. But this is an emerging area with little federal regulation at present.

After the election

The above scenarios consider potential actions Musk might take before the election. Efforts to manipulate communications after an election present unique threats. In the event of Trump’s loss, all of the above techniques could be employed to boost claims of electoral improprieties—damaging public confidence, pressuring key decision makers (such as boards of election or judges), or stoking offline unrest. 

Imagine the online chaos that would follow if enough election boards refused to certify an election, leading to court challenges and recounts disrupted by violent protest. (It has happened before.) Now imagine not only a total failure to put out the fire, but a decision at the top of X to fan the flames. Given Musk's brazen willingness to spread false claims (and even manipulated media), many worry about what he will be willing to say or do if Trump loses the election and pursues extra-legal means to overturn the results.

There is a very real risk that Musk would aid Trump's efforts to subvert the election outcome. Potential whistleblowers inside Musk's X may be the best defense against such anti-democratic behavior – especially if they act in real time. Taking action as a whistleblower carries considerable risks. But even if there are few legal consequences for Musk or X, public controversy might alter their behavior in important ways – or encourage policymakers to pursue reforms.

Congress should act—but barring that, whistleblowers may be the best hope

If legislators worry about the political influence of billionaires — as both sides of the aisle now seem to — the above scenarios offer a guide to curtailing it. In the era of Citizens United and a gridlocked, anemic FEC, campaign finance reform is an obvious place to start. That reform should include penalties capable of meaningfully deterring deep-pocketed actors like Musk from unlawful behavior. Clearer laws governing incitement to violence, voter intimidation, and interference in election proceedings would be another meaningful step. Congress also has yet to weigh in on the ongoing legal fights over Section 230, content moderation, and the editorial discretion of social media companies, preferring to let the courts sort it out.

Another area where Congress could act, but has not, is transparency legislation, which would require platforms to share data with researchers about their content recommendation and advertising systems. Such legislation would empower watchdogs in both civil society and government. State legislatures are to be commended for acting on synthetic media in political campaign contexts, but that is not a substitute for federal regulation. State bills also pose a variety of First Amendment questions about satire and other protected speech that Congress should consider more carefully. 

In the meantime, concerned parties within X should not hold their breath waiting for legislative action. If X engages in clandestine actions to the benefit of a political candidate, whistleblowers may be the best hope for accountability.

IMAGE: Elon Musk, left center, and Wendell P. Weeks, right center, listen to President Donald Trump, right, as he meets with business leaders at the White House on Monday, January 23, 2017 in Washington, DC. (Photo by Matt McClain/The Washington Post via Getty Images)