In a significant case at the intersection of technology and constitutional law, NetChoice, LLC sued Florida and Texas, challenging their social media content-moderation laws. Both states had enacted statutes regulating how platforms such as Facebook and YouTube moderate, organize, and display user-generated content. NetChoice argued that the laws violated the First Amendment by interfering with the platforms’ editorial discretion, and it asked the courts to strike down both laws on their face.
The Supreme Court reviewed conflicting rulings from two lower courts. The Eleventh Circuit had upheld a preliminary injunction against Florida’s law, finding that it likely violated the First Amendment. The Fifth Circuit, by contrast, had reversed an injunction against the Texas law, reasoning that content moderation did not qualify as protected speech. The Supreme Court vacated both decisions and remanded, directing the lower courts to reconsider the challenges under a more complete analysis.
The Court explained that content moderation (decisions about which posts to display, prioritize, or suppress) constitutes expressive activity akin to the editorial judgments of newspapers. The Texas and Florida laws, by restricting this activity, directly implicated First Amendment protections. Additionally, the Court noted that these cases involved facial challenges, which require asking whether a substantial number of a law’s applications are unconstitutional, judged in relation to its plainly legitimate sweep. Neither lower court had analyzed the laws in this manner.
The Court also addressed a key feature of the Texas law: its prohibition on platforms “censoring” content based on viewpoint. Texas defended the law as ensuring “viewpoint neutrality,” but the Court rejected that rationale, explaining that a state’s interest in better balancing the marketplace of ideas cannot justify interfering with private parties’ expression. Forcing platforms to carry speech they deem objectionable, such as hate speech or misinformation, would alter their expressive choices and violate their First Amendment rights.
Three reasons why this case matters:
- Clarifies Free Speech Rights in the Digital Age: The case reinforces that social media platforms have editorial rights similar to traditional media, influencing how future laws may regulate online speech.
- Impacts State-Level Regulation: The Court’s reasoning constrains states’ ability to impose viewpoint-neutrality mandates on private platforms, shaping the balance of power between governments and tech companies.
- Sets a Standard for Facial Challenges: By requiring courts to measure a law’s unconstitutional applications against its plainly legitimate sweep, the decision provides guidance for courts evaluating similar cases.
Moody v. NetChoice, LLC, 144 S. Ct. 2383 (July 1, 2024)