To Censor or to Host? Supreme Court Hears Social Media Platforms’ Free Speech Challenge to State Laws

Jack Fitzhenry
Commentary

The Supreme Court on Monday heard hours of argument in two free-speech cases, Moody v. NetChoice and NetChoice v. Paxton.

NetChoice, an industry group representing large tech companies, argued that laws enacted by Texas and Florida restricting the companies’ ability to demote or remove user content violated the First Amendment rights of social media platforms.

Listening to NetChoice’s arguments, one gets the impression that the states are forcing social media to broadcast pro-terrorist, pro-suicide messages.

They are not. And the hysterics of NetChoice’s hypotheticals should make a fair-minded person skeptical.

Rather, the cases are, to paraphrase the late historian and scholar Christopher Dawson, an effort by the nation’s de facto social powers to exempt themselves from all interference by lawful political authority.

The dominant social media platforms—Facebook, X (formerly Twitter), et al.—attained their dominance by marketing themselves as an open digital marketplace for ideas. The question now is whether states have any authority to regulate these private businesses to keep the digital “public square” available to speech and speakers that the platforms disfavor.

Whether states have such authority depends on how much of social media platforms’ operations are protected by the First Amendment: All? Some? None?

Do platforms such as Facebook merely host the speech of others, like providers of cellphone service? Is algorithmic content curation inherently expressive, like the decisions of a newspaper’s editorial board? Is the decision to ban certain users expressive in its own right, or is it mere censorship?

Texas and Florida contended that little of what social media platforms do to user content qualifies as expressive. Throughout oral arguments, the solicitors general of Florida and Texas maintained that when the platforms demoted or promoted, hosted or banned, they were engaged not in speech but in conduct unprotected by the First Amendment.

Thus, it was perfectly constitutional for Florida to prevent social media sites from deplatforming candidates for public office or for Texas to prevent platforms from deleting posts based on viewpoints.

Advocating for NetChoice, former U.S. Solicitor General Paul Clement argued that virtually everything platforms do is expressive because their very business is to disseminate speech.

Justices Ketanji Brown Jackson and Neil Gorsuch took a more nuanced view. Rather than looking at the nature of social media’s business, both justices insisted that the matter needed to be assessed at the level of function.

Facebook, for instance, offers users the ability to post public comments or to send someone a direct message. Contrary to Clement’s assertions, the justices indicated that the First Amendment concerns would be different when the platform was simply transmitting a message versus when it was curating or promoting content.

Even the word for what the platforms do with user content became a matter of semantic dispute.

Justice Clarence Thomas pressed Clement for a case in which the court had declared that “censorship” enjoyed First Amendment protection. Justice Samuel Alito described as “Orwellian” the platforms’ insistence that deleting users and posts was merely “content moderation.”

Justice Brett Kavanaugh, in a plaintive echo of the Reagan era, rejoined that “When I think of Orwellian, I think of the state,” implying as he did throughout the argument that censorship cannot be an issue when a private company is the censor.

Other questions were raised but left unanswered.

Were social media platforms common carriers like telephone companies and thus duty-bound to welcome all patrons? Were the states’ laws making pernicious content-based distinctions favoring conservative speech or were they simply preventing the platforms from discriminating based on speech content?

Even Solicitor General Elizabeth Prelogar made a guest appearance on behalf of the federal government (not a party to either case) to argue that, while not every business transmitting speech enjoys First Amendment protection, there is, contrary to the states’ position, no common-carrier exception to the First Amendment.

Yet these conceptual concerns, so central to the case, are unlikely to be the basis of the court’s decision this term.

That’s because the dominant theme preoccupying the justices during oral argument was the cases’ peculiar procedural posture. NetChoice, in its haste to stop the states’ laws from going into effect, brought what is known as a pre-enforcement facial challenge. To sustain that challenge before either law is applied or interpreted, NetChoice has the burden to show that the laws have no constitutionally legitimate applications. That’s a tall order, especially when the record in both cases left it ambiguous which platforms were even covered.

The cases’ posture concerned justices across the ideological spectrum. Some, like Justices Elena Kagan and Amy Coney Barrett, seemed open to the argument that the platforms engaged in expressive conduct when they curated content for user consumption.

But they were concerned that both state laws could also serve lawful ends, such as preventing Gmail from blocking the email accounts of controversial public figures like Tucker Carlson and Rachel Maddow, a hypothetical posed by Alito. As Jackson put it: “I think that’s a problem in this case. We’re not aware of all the facts.”

Why should a facial challenge cause the justices to hesitate? In a word, federalism. Our system of vertically bifurcated sovereignty entails a presumption that state laws are valid when properly enacted. As judgments of the people’s representatives, state laws must be given effect unless and until they are shown to be patently unconstitutional.

As Texas Solicitor General Aaron Nielson put it, “Texas has a right to protect Texans.”

So, what outcome should one expect when the court rules on these cases by June? The answer is likely a remand and more litigation in the lower courts over the particulars of when and to what extent the respective state laws apply.

But the core concerns will remain alive. As today’s arguments make clear, America is at a crossroads when it comes to limiting the influence of the major tech platforms, which hold tremendous sway over everything from the content of our debates to the socialization of our children.

Under the status quo, social media platforms enjoy near-limitless discretion to suppress whatever views they wish, for any reason whatsoever.

If the Supreme Court eventually invalidates state bans on viewpoint-based discrimination, states would have far fewer options to address the most egregious censorship practices.

Because these platforms are the modern public square, their censorial conduct undermines the free speech of everyday Americans. Even worse, a full embrace of the tech companies’ radical First Amendment theory runs the risk of entrenching censorship well beyond social media.

NetChoice seems to argue—or at least imply—that the mere act of allowing or disallowing content on a social media service is a communicative act and, therefore, is the platforms’ own protected speech.

On this theory, by merely hosting a user’s content (e.g., posts, websites, videos, electronic files), a digital service provider “participates” in that speech and is thus free to refuse service to anyone it doesn’t like.

Such a holding risks foreclosing reasonable legislative measures to prevent viewpoint discrimination against users on numerous other digital services.

For instance, Amazon has removed books from its e-commerce site or reduced their visibility. Apple has also removed apps from its App Store, web hosting services such as GoDaddy have been known to deplatform websites, and email services such as Gmail may have disadvantaged certain messages based, in part, on their political content.

Applying the First Amendment in the way NetChoice proposes would undermine important protections against these and other forms of censorship.

When asked how the state laws might affect his clients once applied, Clement told the court, “We’d have to fundamentally change our business models.” To this, most Americans might say “the sooner, the better.”

Reprinted by permission from The Daily Signal, a publication of The Heritage Foundation.
Views expressed in this article are opinions of the author and do not necessarily reflect the views of The Epoch Times.
Jack Fitzhenry is a senior legal policy analyst in the Meese Center for Legal and Judicial Studies of The Heritage Foundation.