After more than two hours of argument Friday morning in TikTok v. Garland, the Supreme Court appears likely to allow the U.S. government to force divestiture or shuttering of the platform on January 19. The justices may grant TikTok, and themselves, a temporary reprieve while they resolve the dispute. But the argument suggested that the Court will affirm the law.
In my view, that’s the wrong result. Never before has the government singled out a media outlet and forced its closure, much less one used by half of Americans. Even if the possibility that China may gain access to TikTok’s customer data raises legitimate national security concerns, there are plainly less restrictive means of protecting those interests than shuttering a platform for expression. And the government’s only other justification for its law – the possibility that China might covertly influence the content that appears on the platform – is per se impermissible under the First Amendment, which does not allow the government to regulate the content of our public debate, even if it believes the ideas expressed are against its interest, and even if they come from abroad.
All that said, how the Court reaches its result may now be more important than the bottom line.
In particular, the Court should reject the government’s principal argument, namely, that the TikTok law is “content-neutral” because it is concerned only with who controls the platform, not the content the platform features. Accepting that rationale would not only harm TikTok, but would weaken First Amendment law across the board. It is vital that the Court recognize that this type of law is content-based. If it doesn’t, the Court will erase important lines that are designed to guarantee freedom of speech for all, while granting the government a powerful censorship tool.
If the Court were to accept the Solicitor General’s rationale … the government would be free to force the removal of owners of any media outlet whose fealty it did not trust.
The difference between laws that discriminate on the basis of content and those that are content-neutral is the most fundamental distinction in First Amendment law. That’s because the First Amendment requires the government to remain neutral as to the ideas and messages that make up public debate. Where a law singles out speech because of its content or viewpoint, the risk that the government is impermissibly skewing the debate is at its greatest.
By contrast, when the government enacts laws that are truly agnostic as to content, but simply have some incidental effect on speech, the risk of government manipulation of the marketplace of ideas is much lower. Accordingly, the Court has long insisted that where laws on their face draw lines based on content, or where the interest underlying the law is related to the content of speech, strict scrutiny, the Constitution’s most demanding standard of review, applies. Under that standard, the government can prevail only if the law is the least restrictive means to further a compelling interest. By contrast, where laws are content-neutral, a weaker interest can suffice and a looser “fit” between the law’s means and its end is permitted.
The TikTok law is clearly content-based.
The Protecting Americans from Foreign Adversary Controlled Applications Act singles out a particular communication platform out of concern about its content, and exempts other platforms based on their e-commerce content, even though they pose similar data security concerns. The government justifies the law as a response to the risk of “covert content manipulation” by China. The United States argues that even though TikTok is a U.S. company, its parent company, ByteDance, is headquartered in Beijing, and therefore subject to Chinese influence. Although China has apparently not done so yet, the government argues that China could coerce ByteDance to configure its algorithm to favor messages that further China’s interests over those of the United States. That justification is inescapably related to the content on the platform, and under longstanding First Amendment law, should trigger strict scrutiny. As Justice Neil Gorsuch said, referring to the government’s explicit interest in content manipulation, “I mean, the word — it’s kind of hard to avoid the word ‘content.’”
But that’s exactly what the United States argues, and a number of justices seemed open to accepting. Solicitor General Elizabeth Prelogar characterized the law as directed “with laser-like focus” at China’s potential control of TikTok, and noted that it does not proscribe any particular message or view. But if the Court were to accept the Solicitor General’s rationale as sufficient to make a law content-neutral, the government would be free to force the removal of owners of any media outlet whose fealty it did not trust.
Justice Samuel Alito made the point by asking the Solicitor General whether her analysis would be different if an American corporation, rather than a foreign corporation, owned TikTok and Congress had mandated its divestiture for the same reasons Congress acted here—namely content manipulation and data security. Prelogar acknowledged such a law would be content-based, but insisted that the TikTok law is different because:
all of the same speech that is happening on TikTok could happen post-divestiture. The Act doesn’t regulate that at all. So it’s not saying you can’t have pro-China speech, you can’t have anti-American speech. It’s not regulating the algorithm. TikTok, if it were able to do so, could use precisely the same algorithm to display the same content by the same users.
But Prelogar could say the same thing about Alito’s hypothetical. As long as the disfavored American corporation divested, the platform could carry all the same content and use the same algorithm. The fact that the government targets an owner rather than any specific message does not make the law any less content-based, where, as the government concedes here, its concern relates to the content that owner might promote. Controlling ownership is at least as potent a censorship tool as prohibiting particular messages, and maybe even more effective.
The fact that the owner in this case is a foreign company, ByteDance, ought not change whether the law is content-based. It’s the content TikTok’s American users might see, after all, that animates the law. In extending speech protection to corporations, even though they are not “natural persons,” the Court long ago ruled that the First Amendment protects “speech,” regardless of its source, in the interest of protecting the rights of listeners as well as speakers. It also ruled that American citizens have a First Amendment right to receive literature created by a foreign government, even if our government considered it communist propaganda, again for the same reason. Otherwise the government could freely censor the BBC, the Guardian, Oxford University Press, all based in the United Kingdom, or Politico, owned by the German publisher Axel Springer.
The Court should also reaffirm, as it stated as recently as last year’s decision in Moody v. NetChoice, that an effort to control the “mix of content” on social media platforms is, at its core, “related to the suppression of free expression.” The Solicitor General sought to distinguish the TikTok law on the ground that the government’s concern is not merely with potential content manipulation, but potential “covert” content manipulation. But that makes no sense. Essentially all editorial decisions are covert (including the ones Just Security made in editing this essay), in the sense that the reader of a finished article or the viewer of a newscast is not privy to the countless decisions made about what to cover, what not to cover, and what views to include, promote, or exclude in any expressive product. All content published by any media outlet is the result of “covert manipulation.” That fact hardly makes the government’s interest in controlling ownership of a media platform out of concern about its content somehow content-neutral.
If the Court rules that strict scrutiny applies, but this law satisfies it, that will be a huge loss for TikTok and its 170 million American users. But if the Court accepts the government’s contention that laws targeting owners of media companies are somehow content-neutral because they don’t literally specify particular messages to prohibit, the First Amendment and press freedoms more broadly would be the losers.