On one level, the Supreme Court’s recent decisions in the NetChoice social media cases produced an anticlimactic do-over. The justices unanimously set aside conflicting lower court rulings on laws enacted by Texas and Florida that restrict how social media companies moderate content on their platforms – and told the lower courts to go back to the drawing board.

The New York Times described the consolidated ruling as “the most recent instance of the Supreme Court considering — and then dodging — a major decision on the parameters of speech on social media platforms.”

Granted, the high court did not provide any final answers on the constitutionality of the controversial Texas and Florida laws. But in a majority opinion written by Justice Elena Kagan, six members of the Supreme Court, including three of its conservatives – Chief Justice John Roberts and Justices Brett Kavanaugh and Amy Coney Barrett – agreed that the First Amendment protects content choices made by social media companies in much the same fashion that it protects editorial choices made by traditional newspapers. That in itself constitutes an important clarification of how free speech operates online.

Specifically, the Kagan majority said the U.S. Court of Appeals for the Fifth Circuit was flatly “wrong” when it upheld the Texas law based on the notion that social media companies are not covered by the First Amendment when they rank, remove, or label content. The Fifth Circuit’s decision showed “a serious misunderstanding of First Amendment precedent and principle,” Kagan wrote. 

By contrast, she found that the U.S. Court of Appeals for the Eleventh Circuit “saw the First Amendment issues much as we do” when it struck down the relevant Florida restrictions on social media companies. Republican majorities in both the Texas and Florida legislatures passed the laws to rein in Facebook, YouTube, and other platforms, which they accused of marginalizing or removing conservative content for partisan reasons – an allegation unsupported by systematic or empirical evidence. Texas prohibited removing material based on its “viewpoint,” while Florida barred restrictions on content by or about political candidates.

The Supreme Court vacated both lower court rulings because neither had determined with sufficient specificity which platforms and which platform features are covered by the state laws, a detailed inquiry that the Justices declined to undertake themselves. 

But the Court did place social media companies squarely within the circle of private organizations that the government generally may not order to carry messages they wish to avoid. Kagan’s opinion thus freed the companies to determine for themselves what speech they wish to host and amplify. It is difficult to see how lower courts applying the ruling in good faith could uphold the main provisions of either the Texas or Florida laws – or similar statutes under consideration in a number of other states.

At the same time, however, Kagan’s majority opinion left state and federal lawmakers with leeway to approve more narrowly tailored legislation requiring social media companies merely to disclose information about their operations, as opposed to second-guessing their content moderation decisions.

In a concurring opinion, Justice Ketanji Brown Jackson said she agreed with the principles announced by Kagan, but wrote that the majority should have avoided previewing its skepticism of the Texas and Florida laws to the degree that it did. Concurring in the decision to set aside the lower court rulings but agreeing with none of the majority’s reasoning were Justices Samuel Alito, Clarence Thomas, and Neil Gorsuch.

Three Core Principles

Kagan distilled three core principles from a line of Supreme Court precedents beginning with the seminal 1974 decision in Miami Herald Publishing Co. v. Tornillo. In that case, the high court struck down a Florida law requiring that a newspaper give a political candidate a right to reply when it published criticism of his record. 

The first principle is that the First Amendment applies to any government mandate that a private organization publish or disseminate messages that the organization would prefer to exclude. The application of the First Amendment to this conduct means that the government would have to justify such a requirement with a “compelling” interest of the sort that courts almost never accept as adequate. (In an exceptional 1997 case, for instance, the Supreme Court upheld federal “must carry” rules for cable television systems because of the importance of the government interest in preserving local broadcast stations.) 

The second principle, Kagan said, is that the First Amendment applies even if the private organization in question isn’t terribly choosy and includes most messages while excluding only a few – a description that roughly fits social media platforms. In other words, Kagan wrote, First Amendment protections still apply even when an organization excludes only the handful of messages it most disfavors.

Third, the government cannot regulate speech because it claims to seek a balance of viewpoints or a diversity of expression, which is what elected officials in Texas and Florida said they sought. It isn’t up to the government to achieve its preferred mixture of views in the marketplace of ideas, according to the majority opinion.

A Different Standard for Disclosure

But when it comes to disclosure requirements – in this context, mandates that social media companies reveal how and why they operate the way they do – Kagan indicated that courts should take a more deferential stance, giving lawmakers leeway to mandate transparency, as long as the requirements pertain to factual information and are not “unduly burdensome.” On this point, the majority opinion repeatedly referred to Zauderer v. Office of Disciplinary Counsel of Supreme Court of Ohio, a 1985 Supreme Court case that established the more deferential standard for reviewing disclosure requirements. Some advocates for the social media industry had argued that Zauderer did not apply in this context – a view that the majority implicitly rejected.

Preserving Zauderer is important for maintaining the possibility for more carefully designed government oversight of social media companies. In that case, the Court upheld an Ohio state rule of professional responsibility, which required that attorneys whose fees depend on winning a judgment must disclose in their advertisements that clients might be liable for litigation costs, even if their lawsuits are unsuccessful. The high court applied, in essence, a “rational basis” review, asking whether the required disclosures were factual, uncontroversial, “reasonably related to the state’s interest in preventing deception of consumers,” and not “unduly burdensome.” 

Since its decision in Zauderer, the Court has not directly addressed whether the standard should be applied outside the commercial advertising context, but the federal appellate courts that have considered the question have recognized that the precedent’s reasoning applies more broadly. The core of that reasoning, in the high court’s words, is that “disclosure requirements trench much more narrowly on an advertiser’s interests than do flat prohibitions on speech.”

Under Kagan’s approach, federal and state lawmakers likely remain free to consider imposing carefully drafted disclosure requirements on social media companies – and if evaluated under the deferential Zauderer standard, such mandates should have a reasonable chance of surviving constitutional review by the courts. This is not an academic issue; bills such as the bipartisan Platform Accountability and Transparency Act (PATA) have been introduced in Congress, although so far they have not made much progress. PATA would “require social media companies to provide vetted, independent researchers and the public with access to certain currently secret platform data,” according to a press release.

A Path Toward Consumer Protection Enforcement 

This kind of transparency, which the European Union is trying to achieve with its new Digital Services Act, could do more than improve public understanding of how platforms operate; it could also help regulators at the Federal Trade Commission (FTC) determine whether social media companies are fulfilling the promises they make to users in their terms of service about moderating content. If those promises are being broken – by, for example, the companies failing to devote adequate resources to screening content – those failures could become the basis for FTC consumer protection enforcement.

Some critics of the Court’s ruling in NetChoice, including Tim Wu, a law professor at Columbia University and former aide to President Joe Biden, warn that it illustrates how “the First Amendment is out of control.” Wu argues that by protecting social media companies’ ability to cull content under the First Amendment, the Kagan majority is inhibiting legitimate government attempts to regulate powerful business interests. 

But NetChoice does not insulate social media from all government oversight and enforcement. To the contrary, it provides lawmakers with latitude to regulate in ways that do not dictate a single approach to how platforms grapple with specific types of content.

The upshot of the NetChoice ruling is that the most promising path for improved oversight of social media – one that respects First Amendment principles – points toward thoughtfully drafted disclosure laws that shed light on how social media companies operate.    

Barrett is the deputy director of the NYU Stern Center for Business and Human Rights. The Center filed an amicus brief in support of neither party in the NetChoice cases.

IMAGE: Collage image using speech bubbles and smartphones to communicate an online conversation.