Murthy v. Missouri: SCOTUS case plays pivotal role in election integrity and national security
By Jamie Neikrie
On Monday, March 18, the Supreme Court heard oral arguments in Murthy v. Missouri. This case addresses the role that government officials can play in communicating with social media companies in the development and implementation of content moderation policies. It follows a sweeping decision by the Fifth Circuit Court of Appeals, which ruled that the Biden Administration violated the First Amendment rights of the plaintiffs (the states of Missouri and Louisiana and five individual plaintiffs) by encouraging social media companies to remove content and users that shared false information about the election and COVID-19. In the oral arguments, a majority of the justices seemed skeptical of this position and expressed concerns about its broad implications for a wide range of government speech, actions, and partnerships — a sign that SCOTUS recognizes the importance of these partnerships for public safety, national security, and election integrity.
In today’s world, the health of democracies relies heavily on the state of the information ecosystem in which they operate. In the last decade, false information about elections, public officials, and democratic systems has proliferated on social media platforms in the U.S., driven in part by influence operations from foreign adversaries like Russia, China, Iran, and North Korea. Pervasive and unchecked false information, conspiracy theories, and foreign propaganda are already eroding confidence in American democracy. Distrust, in turn, can lead to apathy, polarization, or even violence, as we witnessed on January 6, 2021.
The challenge of maintaining the integrity of our information ecosystem, especially around elections, does not fall on any one actor. Social media platforms, as the primary conduits of false information, have a key role to play. But the burden also falls on the American intelligence community, independent researchers, and election officials who are working to conduct safe and secure elections in the face of emerging threats. The integrity of American democracy depends on open lines of communication between these stakeholders to readily share information, identify emerging threats and cybersecurity vulnerabilities, and work together to disseminate accurate information.
This coordination — and where it crosses the line into coercion or intimidation of private actors by the federal government — is the core question at stake in Murthy v. Missouri. In October, when SCOTUS stayed the Fifth Circuit’s modified injunction, Justices Alito, Thomas, and Gorsuch signaled agreement with the lower courts. Justice Alito, joined by Justices Thomas and Gorsuch, dissented from that order, citing the lower courts’ “extensive findings of fact.” In oral arguments, Justice Alito expressed concern with the volume of communications between the Biden Administration and the platforms, as well as the tone of some of these messages. “I cannot imagine federal officials talking like that to the print media,” Justice Alito said. “It is treating Facebook and these other platforms like they’re subordinates.”
The remaining justices, however, expressed skepticism about the far-reaching implications of the arguments made by the plaintiffs. They raised numerous examples of government actions in an attempt to understand exactly where the line is between acceptable communication and intimidation. Some examples were hypothetical, drawing on areas of national security, disaster preparedness, or public health. Justice Jackson mentioned an administration official flagging a dangerous, viral online challenge, for example. Others were very real, such as interference by foreign adversaries in American elections, or even the justices’ own interactions with reporters.
Multiple justices expressed the worry that regular government information sharing and guidance would be prohibited by the plaintiffs’ “extremely expansive argument,” in the words of Justice Kagan. Or as Justice Barrett put it, the plaintiffs’ interpretation of intimidation would “sweep in an awful lot.”
The oral arguments surfaced two other crucial points, which help us understand whether the government’s action amounted to coercion or intimidation.
Did the platforms retain the ultimate decision about whether and how to act on information that officials provided, even information identified as false? “What do you do with the fact that the platforms say no all the time to the government?” Justice Kavanaugh asked the plaintiffs. Justices Kagan and Sotomayor, in particular, questioned the plaintiffs’ ability to prove that a content moderation decision by any platform was really induced by the government, rather than social media companies enforcing their own policies.
Did the platforms initiate partnerships with government officials and law enforcement? While Justice Alito presented the Biden Administration’s repeated exhortations of “a partnership” as crossing the line, the administration’s defense pointed out that, in many cases, the platforms themselves initiated these partnerships. Ahead of the 2018 election, for example, Facebook (now Meta) convened a meeting with representatives from the biggest players in the technology industry along with Federal Bureau of Investigation (FBI) and Department of Homeland Security (DHS) officials. The purpose of the meeting was to develop closer ties between the social media platforms and law enforcement to prevent abuse of social platforms, including warding off meddling by malicious actors like Russia. Then, ahead of the 2020 and 2022 elections, Facebook launched its Voting Information Center. As part of this nonpartisan effort, Meta worked with state election officials and nonpartisan civic organizations to direct users to the center and ensure it was updated with the latest, accurate election information in each state.
Issue One operates the Faces of Democracy program, a nonpartisan effort that brings together election officials from across the country to help strengthen U.S. elections and critical election infrastructure. These workers operate on the front lines of democracy, and they have consistently expressed how important active communication and partnerships with the platforms are for providing accurate voting information, correcting false content, and addressing violent threats. This collaboration is essential for the integrity of elections and the safety of election workers and voters, especially given the decentralized nature of U.S. elections and the inevitability of unforeseen contingencies like severe weather, ballot shortages, and technological issues (or an unprecedented pandemic).
Yesterday’s oral arguments signal that SCOTUS recognizes the importance of these partnerships for public and civic health. But the court could still redefine the parameters of government coercion, which would have lasting ramifications for election integrity and online content moderation. Regardless of how the court rules over the summer, this case — and its ability to reach the Supreme Court in the first place — points to a much broader effort to undermine the infrastructure and private partnerships that were stood up after 2016 to address the spread of false information online.
The impact of these efforts is already being felt in 2024, a historic year for elections around the world. The FBI, the Cybersecurity and Infrastructure Security Agency (CISA), and DHS have all limited their communications with technology companies and election administrators about foreign interference threats. On the other side, many of the social media companies have dramatically downsized the trust and safety teams responsible for election integrity and rolled back crucial policies around election misinformation.
These decisions have left a vacuum that foreign adversaries like Russia, China, and Iran will flood with false information that seeks to confuse, manipulate, and divide American voters, as they have in previous elections. Murthy v. Missouri makes it clear — for the future of our democracy, we need to invest more, not less, in the partnerships and infrastructure that help create healthy information ecosystems.