Mark Changizi, Daniel Kotzin, and Michael Senger were frequent Twitter users who often wrote skeptically or critically about the government’s approach to COVID-19, questioning, for example, the effectiveness of masks or vaccines. At various times, they had posts removed by Twitter for allegedly violating its misinformation policy. Eventually, each was banned or suspended from Twitter altogether.
All three have sued, but they aren’t suing Twitter to be reinstated. Instead, they’re suing various federal government agencies and officials. They allege that Twitter was coerced by multiple formal government actions, such as the Department of Health and Human Services sending Twitter a “Request for Information” on COVID-19 disinformation. They also allege that Twitter was illegally influenced by federal officials’ various public statements, such as President Biden’s remark that social media companies are “killing people” by allowing misinformation. In their lawsuit, the plaintiffs are asking that the federal officials be prevented from making similar statements in the future and that the formal actions be revoked.
The district court rejected the plaintiffs’ argument and dismissed the case for lack of standing. The court found there was no evidence that the federal actions and statements had caused Twitter to deplatform the plaintiffs, given that Twitter had removed others for disinformation before any of the government actions at issue. The court also suggested that the plaintiffs’ injury could not be remedied by a court order, since they are not suing Twitter and would not necessarily be reinstated even if they won. The plaintiffs have now appealed to the Sixth Circuit, where NetChoice, joined by the Cato Institute and the Pelican Institute, has filed an amicus brief supporting neither party.
In our brief, we focus not on who should win this particular case, but rather on the larger principles at stake in how the Sixth Circuit decides it. First, we urge the court to acknowledge that government pressure against speakers or platforms, known as “jawboning,” raises serious First Amendment concerns. All too often, government agencies and officials attempt to suppress speech informally when they could not legally do so formally. At a certain point, as the Supreme Court held in Bantam Books v. Sullivan (1963), such pressure becomes so coercive as to violate the First Amendment.
Whether applied to old media like book publishers or new media like online platforms, such pressure should be discouraged, and courts should vigorously police the boundaries of what is permissible. And contrary to the district court’s suggestion, a request to stop government pressure asks for a sufficient remedy, whether or not the absence of pressure would change the platform’s behavior. Jawboning lawsuits vindicate not the right to be platformed, but rather the right to have platforming decisions made in the absence of government coercion.
At the same time, courts should also make clear that such pressure does not transform platforms like social media companies into “state actors.” Publishers and social media networks do not become the government even when they fall under intense pressure from the government. They are unlike private “company towns” that function, for all intents and purposes, like local governments. Checks upon government discretion, like the First Amendment, do not apply to private entities like social media platforms. Courts should thus make clear, when reviewing jawboning claims, that the government itself is the proper defendant, not the private platforms. If courts instead erroneously allowed suits against private platforms to enforce public First Amendment standards, they would ironically erode those platforms’ own speech rights to choose what they publish.
Jawboning cases like this one are likely to reach the courts with increasing frequency in the social media age. When reviewing these cases, courts should be careful to strike a balance that protects the speech rights of both social media platforms and their users.