X gets Ninth Circuit win in case over California’s content moderation law


X sued the California attorney general, challenging Assembly Bill 587 (AB 587) – a law that required large social media companies to submit semiannual reports detailing their terms of service and content moderation policies, as well as their practices for handling specific types of content such as hate speech and misinformation. X claimed that this law violated the First Amendment, was preempted by the federal Communications Decency Act, and infringed upon the Dormant Commerce Clause.

Plaintiff sought a preliminary injunction to prevent the government from enforcing AB 587 while the case was pending. Specifically, it argued that being forced to comply with the reporting requirements would compel speech in violation of the First Amendment. Plaintiff asserted that AB 587’s requirement to disclose how it defined and regulated certain categories of content compelled speech about contentious issues, infringing on its First Amendment rights.

The district court denied plaintiff’s motion for a preliminary injunction. It found that the reporting requirements were commercial in nature and that they survived under a lower level of scrutiny applied to commercial speech regulations. Plaintiff sought review with the Ninth Circuit.

On review, the Ninth Circuit reversed the district court’s denial and granted the preliminary injunction. The court found that the reporting requirements compelled non-commercial speech and were thus subject to strict scrutiny under the First Amendment—a much higher standard. Under strict scrutiny, a law is presumed unconstitutional unless the government can show it is narrowly tailored to serve a compelling state interest. The court reasoned that plaintiff was likely to succeed on its claim that AB 587 violated the First Amendment because the law was not narrowly tailored. Less restrictive alternatives could have achieved the government’s goal of promoting transparency in social media content moderation without compelling companies to disclose their opinions on sensitive and contentious categories of speech.

The appellate court held that plaintiff would likely suffer irreparable harm if the law was enforced, as the compelled speech would infringe upon the platform’s First Amendment rights. Furthermore, the court found that the balance of equities and public interest supported granting the preliminary injunction because preventing potential constitutional violations was deemed more important than the government’s interest in transparency. Therefore, the court reversed and remanded the case, instructing the district court to enter a preliminary injunction consistent with its opinion.

X Corp. v. Bonta, 2024 WL 4033063 (9th Cir. September 4, 2024)

Supreme Court weighs in on Texas and Florida social media laws


In a significant case involving the intersection of technology and constitutional law, NetChoice LLC sued Florida and Texas, challenging their social media content-moderation laws. Both states had enacted statutes regulating how platforms such as Facebook and YouTube moderate, organize, and display user-generated content. NetChoice argued that the laws violated the First Amendment by interfering with the platforms’ editorial discretion. It asked the Court to invalidate these laws as unconstitutional.

The Supreme Court reviewed conflicting rulings from two lower courts. The Eleventh Circuit had upheld a preliminary injunction against Florida’s law, finding it likely violated the First Amendment. And the Fifth Circuit had reversed an injunction against the Texas law, reasoning that content moderation did not qualify as protected speech. However, the Supreme Court vacated both decisions, directing the lower courts to reconsider the challenges with a more comprehensive analysis.

The Court explained that content moderation—decisions about which posts to display, prioritize, or suppress—constitutes expressive activity akin to editorial decisions made by newspapers. The Texas and Florida laws, by restricting this activity, directly implicated First Amendment protections. Additionally, the Court noted that these cases involved facial challenges, requiring an evaluation of whether a law’s unconstitutional applications outweigh its constitutional ones. Neither lower court had sufficiently analyzed the laws in this manner.

The Court also addressed a key issue in the Texas law: its prohibition against platforms censoring content based on viewpoint. Texas justified the law as ensuring “viewpoint neutrality,” but the Court found this rationale problematic. Forcing platforms to carry speech they deem objectionable—such as hate speech or misinformation—would alter their expressive choices and violate their First Amendment rights.

Three reasons why this case matters:

  • Clarifies Free Speech Rights in the Digital Age: The case reinforces that social media platforms have editorial rights similar to traditional media, influencing how future laws may regulate online speech.
  • Impacts State-Level Regulation: The ruling limits states’ ability to impose viewpoint neutrality mandates on private platforms, shaping the balance of power between governments and tech companies.
  • Sets a Standard for Facial Challenges: By emphasizing the need to weigh a law’s unconstitutional and constitutional applications, the decision provides guidance for courts evaluating similar cases.

Moody v. NetChoice, LLC, 144 S. Ct. 2383 (July 1, 2024)

TikTok v. Garland: A full rundown of the Constitutional issues

As anticipated, TikTok and ByteDance have initiated legal action against the U.S. government, challenging a recently enacted law that would ban TikTok unless ByteDance sells off the company in the next nine months to an entity not controlled by a foreign adversary. Petitioners argue that the law violates the Constitution in several ways: it infringes the First Amendment, runs afoul of the prohibition against bills of attainder, and violates the equal protection component and the Takings Clause of the Fifth Amendment. They are seeking a declaration from the court that the law is unconstitutional and an injunction to prevent the Attorney General from enforcing the law.

TikTok tries to make itself look good

The allegations of the complaint contest any characterization of the law as a mere regulatory measure on ownership. The companies assert that compliance with the ban — especially within the 270-day timeframe — is not feasible due to commercial, technical, and legal constraints. Moreover, petitioners argue that the law represents an unconstitutional overreach, setting a dangerous standard that could allow Congress to bypass First Amendment protections under a thin guise of national security.

The complaint discusses how TikTok enjoys over 170 million monthly users in the U.S. and more than 1 billion globally. The platform is known for its powerful recommendation engine, which enhances user engagement by presenting curated content on its “For You” page. Although developed by the China-based ByteDance, TikTok operates internationally, including through a significant U.S. presence operating under American law.

TikTok claims to be a repeat victim of overreach

Petitioners continue by describing how the U.S. government has previously attempted to ban TikTok while citing national security concerns. These efforts began in earnest with President Trump’s 2020 executive order, which the courts blocked for exceeding the scope of the International Emergency Economic Powers Act (IEEPA) and for constitutional issues. Although discussions aimed at resolving these security concerns led to a draft National Security Agreement under President Biden, these talks have faltered, and a resolution remains elusive.

Selling isn’t easy

Petitioners claim the requirement for TikTok to divest its U.S. operations from its global network is impractical for technological, commercial and legal reasons. A U.S.-only version of TikTok would lose access to global content, severely diminishing its appeal and commercial viability. Technologically, transferring the sophisticated source code within the law’s tight timeline is unachievable. And legal constraints, particularly China’s stringent export controls, prevent the divestiture of essential technologies like the recommendation engine.

Why TikTok thinks the law is unconstitutional

Petitioners provide four grounds on which they believe the law is unconstitutional: (1) the First Amendment, (2) Article I’s prohibition of bills of attainder, (3) the equal protection component of the Fifth Amendment, and (4) the Takings Clause of the Fifth Amendment.

First Amendment:

Petitioners assert that the law significantly limits their First Amendment rights, impacting both the company and the free speech rights of some 170 million users in the U.S. Petitioners claim that TikTok is recognized for its editorial activities in selecting and presenting content, and argue that these activities are protected forms of expression under the First Amendment. TikTok not only curates third-party content but also creates and shares its own, especially on topics such as support for small businesses and educational initiatives, which it considers core speech. Additionally, the law restricts other ByteDance subsidiaries from reaching U.S. audiences, further stifling protected speech activities.

Petitioners argue that the court should apply strict scrutiny to the law for three reasons. First, the law imposes content- and viewpoint-based restrictions, favoring certain types of speech, such as business and travel reviews, over others, such as political and religious speech, and appears motivated by the viewpoints expressed on TikTok. Second, the law discriminates between speakers, specifically targeting TikTok and other ByteDance subsidiaries by automatically deeming them foreign adversary controlled, while other companies face less stringent criteria. Third, the law constitutes an unlawful prior restraint, suppressing speech in advance by prohibiting TikTok and its users from expressing themselves on the platform, a severe infringement on First Amendment rights.

Petitioners assert that the law fails the strict scrutiny test because it neither serves a compelling government interest nor is narrowly tailored. Although Congress cited national security, it has not provided concrete evidence that TikTok poses a specific threat or that the law effectively addresses such threats. The speculative nature of these risks and the continued use of TikTok by officials such as President Biden and members of Congress undermine the credibility of these security concerns. Furthermore, the law lacks a fair process or sufficient evidence to justify its restrictive measures. Additionally, the law is not narrowly tailored — less restrictive measures, such as an agreement involving data security protocols with TikTok, were already under negotiation. These alternatives, including more targeted regulations or industry-wide data protection laws, suggest that the law’s broad prohibitions and lack of procedural fairness for TikTok are unjustified, failing to meet the precision required by strict scrutiny.

Moreover, Petitioners argue that the law independently fails strict scrutiny because it is both under- and over-inclusive. It is under-inclusive because it ignores how other foreign and domestic companies could pose similar data security risks and spread misinformation, suggesting selective enforcement against certain speakers or viewpoints. It is over-inclusive because it targets all ByteDance-owned applications without evidence that they pose any significant risk, covering applications irrespective of their data collection practices and only if they display content. This flawed scope suggests no direct link between the law’s restrictions and the stated security concerns, weakening its justification under strict scrutiny.

Much as a federal court in Montana concluded about that state’s TikTok ban last year, TikTok argues that the law would not survive even under the less demanding “intermediate scrutiny” standard. This is the standard applied to content-neutral time, place, and manner restrictions, which must be narrowly tailored to a significant government interest and must not restrict substantially more speech than necessary. Petitioners assert that the law completely prohibits TikTok’s speech activities across all settings in the U.S., and that the government cannot show its concerns about data security and propaganda to be anything beyond speculative. Furthermore, the law does not leave open adequate alternative channels for communication, as it significantly prevents TikTok from reaching its audience. For these reasons, along with the availability of less restrictive measures, TikTok asserts the law fails intermediate scrutiny as well.

The last part of the First Amendment argument is that the law closes off an entire medium of expression, which Supreme Court precedent generally deems unreasonable. And the law is constitutionally overbroad, as it prohibits all speech on ByteDance-owned applications, regardless of content. This broad suppression encompasses substantial unconstitutional applications, far outweighing its legitimate scope, making it a clear example of overbreadth as defined in U.S. law.

Bill of Attainder:

TikTok also challenges the law as an unconstitutional bill of attainder, which is prohibited by Article I of the U.S. Constitution. This part of the Constitution forbids Congress from enacting legislation that imposes legislative punishment on specific individuals or groups without a judicial trial. Petitioners say that the law specifically targets them, imposing severe restrictions by forcing the divestment of their U.S. businesses and barring them from operating in their chosen fields, akin to punitive measures historically associated with bills of attainder. Unlike other entities that can avoid similar prohibitions through less restrictive measures, petitioners face unique, punitive burdens without meaningful opportunities for corrective action, thus violating the separation of powers by allowing legislative encroachment on judicial functions. Additionally, petitioners assert that the law disproportionately impacts them by not applying the same standards to similarly situated companies, making it effectively a punitive measure against a specific corporate group, thereby rendering it a bill of attainder.

Equal Protection:

Petitioners’ third constitutional argument is that the law violates their rights under the equal protection component of the Fifth Amendment’s Due Process Clause by discriminatorily targeting them without justification. Unlike other companies deemed “controlled by a foreign adversary,” petitioners are automatically classified as such without the due process of notice and a presidential determination supported by evidence, which other companies receive. This classification imposes undue burdens on their free speech rights by bypassing the necessary procedural safeguards that other entities are afforded, such as detailed justifications for national security concerns that enable judicial review. Additionally, the law exempts other similarly situated companies from certain restrictions if they offer applications for posting reviews, unjustifiably leaving petitioners without similar exemptions. This differential treatment lacks a rational basis, undermining the equal protection principles by imposing arbitrary and discriminatory restrictions on petitioners.

Takings Clause:

And petitioners’ fourth constitutional argument is that the law effects an unlawful taking of private property without just compensation, in violation of the Fifth Amendment’s Takings Clause. The law mandates the shutdown of ByteDance’s U.S. operations or forces the sale of these assets under conditions that do not assure fair market value, severely undercutting their worth. This compulsion to sell or close down constitutes a per se taking, as it strips ByteDance of all economically beneficial uses of its property without just compensation. Furthermore, the law also represents a regulatory taking by significantly impacting the economic value of ByteDance’s investments and interfering with reasonable investment-backed expectations. The legislative action here goes beyond permissible bounds, triggering a need for regulatory scrutiny under established criteria such as economic impact, disruption of investment expectations, and the nature of government action. As such, petitioners argue that the law unjustly deprives them of their property rights without adequate compensation, necessitating prospective injunctive relief.

Petitioners seek a judicial declaration that the law is unconstitutional and an injunction against its enforcement, arguing that the government’s measures are excessively punitive and not grounded in adequately demonstrated national security risks.

TikTok’s constitutional arguments against the ban: a first look


As expected, TikTok has sued the federal government over the law enacted last month that requires ByteDance to sell off the app or be banned. It seeks a declaratory judgment that the law is unconstitutional and asks for an injunction barring the law’s enforcement. Here’s a first look at the constitutional issues TikTok is raising:

  • First Amendment: TikTok contends the Act restricts its right to free speech more severely than other media entities without sufficient justification, failing to consider less restrictive alternatives. The ban also violates the free speech rights of the app’s 170 million American users.
  • Bill of Attainder: TikTok asserts that the Act singles out TikTok for punitive measures typically reserved for judicial processes, without due process.
  • Equal Protection: Under the Fifth Amendment, TikTok argues the Act wrongfully applies stricter conditions on it than on other similar entities.
  • Takings Clause: TikTok claims the Act effects an unlawful taking of its property without just compensation, as it forces a sale or shutdown of its U.S. operations at undervalued prices.

More analysis to come.

TikTok and the First Amendment: Previewing some of the free speech issues

TikTok is on the verge of a potential federal ban in the United States. This development echoes a previous situation in Montana, where a 2023 state law attempted to ban TikTok but faced legal challenges. TikTok and its users filed a lawsuit against the state, claiming the ban violated their First Amendment rights. The federal court sided with TikTok and the users, blocking the Montana law from being enforced on the grounds that it infringed on free speech.

The court’s decision highlighted that the law restricted TikTok users’ ability to communicate and impacted the company’s content decisions, thus failing to meet the intermediate scrutiny standard applicable to content-neutral speech restrictions. The ruling criticized the state’s attempt to regulate national security, deeming it outside the state’s jurisdiction and excessively restrictive compared to other available measures such as data privacy laws. Furthermore, the court noted that the ban left other similar apps unaffected and failed to provide alternative communication channels for TikTok users reliant on the app’s unique features.

TikTok is now officially among the walking dead

It’s now the law of the land that come nine months from now, if any of the app stores make TikTok available or if any hosting provider lends services enabling TikTok, those companies will face substantial penalties. That is, unless TikTok’s owner ByteDance sells off the company to an entity that is not located in or controlled by anyone from Russia, Iran, North Korea or China. The version of the law that the President signed on April 24, 2024 is pretty much the same as the one the House of Representatives passed in March 2024.

The only difference is that if in nine months there is a transaction underway to sell off TikTok, the President can grant one 90-day extension for the sale to be completed.
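The timeline works out to concrete dates. A minimal sketch, assuming the enacted law’s 270-day window (the "nine months" referenced above) runs from the April 24, 2024 signing date, with one possible 90-day extension:

```python
from datetime import date, timedelta

signed = date(2024, 4, 24)                 # date the President signed the law
deadline = signed + timedelta(days=270)    # divestiture deadline ("nine months")
extended = deadline + timedelta(days=90)   # with the one-time 90-day extension

print(deadline)   # 2025-01-19
print(extended)   # 2025-04-19
```

So absent a qualifying sale, the ban would bite in mid-January 2025, or mid-April 2025 at the latest if the President grants the extension.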

No doubt we’re going to see some serious free speech litigation over this. Stay tuned.

What does the “bill that could ban TikTok” actually say?

In addition to causing free speech concerns, the bill is troubling in the way it gives unchecked power to the Executive Branch.

Earlier this week the United States House of Representatives passed a bill that is being characterized as one that could ban TikTok. Styled as the Protecting Americans from Foreign Adversary Controlled Applications Act, the text of the bill calls TikTok and its owner ByteDance Ltd. by name and seeks to “protect the national security of the United States from the threat posed by foreign adversary controlled applications.”

What conduct would be prohibited?

The Act would make it unlawful for anyone to “distribute, maintain, or update” a “foreign adversary controlled application” within the United States. The Act specifically prohibits anyone from “carrying out” any such distribution, maintenance or updating via a “marketplace” (e.g., any app store) or by providing hosting services that would enable distribution, maintenance or updating of such an app. Interestingly, the ban does not directly prohibit ByteDance from making TikTok available; rather, it would make entities such as Apple and Google liable for making the app available for others to access, maintain and update.

What apps would be banned?

There are two ways an app could find itself being a “foreign adversary controlled application” and thereby prohibited.

  • The first is simply by being TikTok or any app provided by ByteDance or its successors.
  • The second way – and perhaps the more concerning way because of its grant of great power to one person – is by being a “foreign adversary controlled application” that is “determined by the President to present a significant threat to the national security of the United States.” Though the President must first provide the public with notice of such determination and make a report to Congress on the specific national security concerns, there is ultimately no check on the President’s power to make this determination. For example, there is no provision in the statute saying that Congress could override the President’s determination.

Relatively insignificant apps, or apps with no social media component, would not be covered by the ban. For example, to be a “covered company” under the statute, the app has to have more than one million monthly users in two of the three months prior to the time the President determines the app should be banned. And the statute specifically says that any site having a “primary purpose” of allowing users to post reviews is exempt from the ban.
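The user-count threshold is mechanical enough to express as a simple predicate. The one-million-user figure and the "two of the three months" test come from the bill as described above; the function name and input format are illustrative assumptions, not anything in the statute:

```python
def meets_user_threshold(last_three_months_mau):
    """Return True if the app had more than one million monthly users
    in at least two of the three months prior to the President's
    determination (the bill's "covered company" user test)."""
    return sum(1 for users in last_three_months_mau if users > 1_000_000) >= 2

print(meets_user_threshold([900_000, 1_200_000, 1_500_000]))  # True
print(meets_user_threshold([900_000, 950_000, 1_500_000]))    # False
```

Note that exactly 1,000,000 users in a month would not count, since the bill requires *more than* one million.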

When would the ban take effect?

TikTok would be banned 180 days after the date the President signs the bill. For any other app that the President would later decide to be a “foreign adversary controlled application,” it would be banned 180 days after the date the President makes that determination. The date of that determination would be after the public notice period and report to Congress discussed above.

What could TikTok do to avoid being banned?

It could undertake a “qualified divestiture” before the ban takes effect, i.e., within 180 days after the President signs the bill. Here is another point where one may be concerned about the great power given to the Executive Branch. A “qualified divestiture” would be a situation in which the owner of the app sells off that portion of the business *and* the President determines two things: (1) that the app is no longer being controlled by a foreign adversary, and (2) there is no “operational relationship” between the United States operations of the company and the old company located in the foreign adversary country. In other words, the app could not avoid the ban by being owned by a United States entity but still sharing data with the foreign company and having the foreign company handle the algorithm.

What about users who would lose all their data?

The Act provides that an app subject to the prohibition must provide users with “all the available data related to the account of such user,” if the user requests it, prior to the time the app becomes prohibited. That data would include all posts, photos and videos.

What penalties apply for violating the law?

The Attorney General is responsible for enforcing the law. (An individual could not sue and recover damages.) Anyone (most likely an app store) that violates the ban on distributing, maintaining or updating the app would face penalties of $5,000 x the number of users determined to access, maintain or update the app. Those damages could be astronomical – TikTok currently has 170 million users, so the damages would be $850,000,000,000. An app’s failure to provide data portability prior to being banned would cause it to be liable for $500 x the number of affected users.
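The penalty arithmetic above can be checked directly. The $5,000 and $500 per-user figures and the 170 million user count all come from the statute and the paragraph above:

```python
users = 170_000_000  # TikTok's current U.S. user count, per the Act's sponsors

# Penalty for distributing, maintaining or updating a banned app
distribution_penalty = 5_000 * users

# Penalty for failing to provide data portability before the ban
portability_penalty = 500 * users

print(f"${distribution_penalty:,}")  # $850,000,000,000
print(f"${portability_penalty:,}")   # $85,000,000,000
```

Even the lesser data-portability exposure of $85 billion would dwarf most companies' market capitalization, which is presumably the point: compliance, not collection, is the goal.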

How did Ohio’s efforts to regulate children’s access to social media violate the constitution?


Ohio passed a law called the Parental Notification by Social Media Operators Act which sought to require certain categories of online services to obtain parental consent before allowing any unemancipated child under the age of sixteen to register or create accounts with the service.

Plaintiff internet trade association – representing platforms including Google, Meta, X, Nextdoor, and Pinterest – sought a preliminary injunction that would prohibit the State’s attorney general from enforcing the law. Finding the law to be unconstitutional, the court granted the preliminary injunction.

Likelihood of success on the merits: First Amendment Free Speech

The court found that plaintiff was likely to succeed on its constitutional claims. Rejecting the State’s argument that the law sought only to regulate commerce (i.e., the contracts governing use of social media platforms) and not speech, it held that the statute was a restriction on speech, implicating the First Amendment. It held that the law was a content-based restriction because the social media features the statute singled out in defining which platforms were subject to the law – e.g., the ability to interact socially with others – were “inextricable from the content produced by those features.” And the law violated the rights of minors living in Ohio because it infringed on minors’ rights to both access and produce First Amendment protected speech.

Given these attributes of the law, the court applied strict scrutiny to the statute. The court held that the statute failed to pass strict scrutiny for several reasons. First, the Act was not narrowly tailored to address the specific harms identified by the State, such as protecting minors from oppressive contract terms with social media platforms. Instead of targeting the contract terms directly, the Act broadly regulated access to and dissemination of speech, making it under-inclusive in addressing the specific issue of contract terms and over-inclusive by imposing sweeping restrictions on speech. Second, while the State aimed to protect minors from mental health issues and sexual predation related to social media use, the Act’s approach of requiring parental consent for minors under sixteen to access all covered websites was an untargeted and blunt instrument, failing to directly address the nuanced risks posed by specific features of social media platforms. Finally, in attempting to bolster parental authority, the Act mirrored previously rejected arguments that imposing speech restrictions, subject to parental veto, was a legitimate means of aiding parental control, making it over-inclusive by enforcing broad speech restrictions rather than focusing on the interests of genuinely concerned parents.

Likelihood of success on the merits: Fourteenth Amendment Due Process

The statute violated the Due Process Clause of the Fourteenth Amendment because its vague language failed to provide clear notice to operators of online services about the conduct that was forbidden or required. The Act’s broad and undefined criteria for determining applicable websites, such as targeting children or being reasonably anticipated to be accessed by children, left operators uncertain about their legal obligations. The inclusion of an eleven-factor list intended to clarify applicability, which contained vague and subjective elements like “design elements” and “language,” further contributed to the lack of precise guidance. The Act’s exception for “established” and “widely recognized” media outlets without clear definitions for these terms introduced additional ambiguity, risking arbitrary enforcement. Despite the State highlighting less vague aspects of the Act and drawing parallels with the federal Children’s Online Privacy Protection Act of 1998 (COPPA), these did not alleviate the overall vagueness, particularly with the Act’s broad and subjective exceptions.

Irreparable harm and balancing of the equities

The court found that plaintiff’s members would face irreparable harm through non-recoverable compliance costs and the potential for civil liability if the Act were enforced, as these monetary harms could not be fully compensated. Moreover, the Act’s infringement on constitutional rights, including those protected under the First Amendment, constituted irreparable harm since the loss of such freedoms, even for short durations, is considered significant.

The balance of equities and the public interest did not favor enforcing a statute that potentially violated constitutional principles, as the enforcement of unconstitutional laws serves no legitimate public interest. The argument that the Act aimed to protect minors did not outweigh the importance of upholding constitutional rights, especially when the statute’s measures were not narrowly tailored to address specific harms. Therefore, the potential harm to plaintiff’s members and the broader implications for constitutional rights underscored the lack of public interest in enforcing this statute.

NetChoice, LLC v. Yost, 2024 WL 55904 (S.D. Ohio, February 12, 2024)


California court decision strengthens Facebook’s ability to deplatform its users


Plaintiff used Facebook to advertise his business. Facebook kicked him off and would not let him advertise, based on alleged violations of Facebook’s Terms of Service. Plaintiff sued for breach of contract. The lower court dismissed the case so plaintiff sought review with the California appellate court. That court affirmed the dismissal.

The Terms of Service authorized the company to unilaterally “suspend or permanently disable access” to a user’s account if the company determined the user “clearly, seriously, or repeatedly breached” the company’s terms, policies, or community standards.

An ordinary reading of such a provision would lead one to think that Facebook would not be able to terminate an account unless certain conditions were met, namely, that there had been a clear, serious or repeated breach by the user. In other words, Facebook would be required to make such a finding before terminating the account.

But the court applied the provision much more broadly. So broadly, in fact, that one could say the notion of clear, serious, or repeated breach was irrelevant, superfluous language in the terms.

The court said: “Courts have held these terms impose no ‘affirmative obligations’ on the company.” Discussing a similar case involving Twitter’s terms of service, the court observed that platform was authorized to suspend or terminate accounts “for any or no reason.” Then the court noted that “[t]he same is true here.”

So, the court arrived at the conclusion that despite Facebook’s own terms – which would lead users to think that they wouldn’t be suspended unless there was a clear, serious or repeated breach – one can get deplatformed for any reason or no reason. The decision pretty much gives Facebook unmitigated free speech police powers.

Strachan v. Facebook, Inc., 2023 WL 8589937 (Cal. App. December 12, 2023)

Can YouTube be sued for censorship? A court weighs in.

Prager University sued Google LLC and YouTube, LLC, alleging that defendants discriminated against plaintiff’s conservative political viewpoints by restricting its videos on YouTube. Plaintiff asked the court to issue a preliminary injunction to prevent defendants from continuing these practices and to allow plaintiff’s videos unrestricted access on the platform. Plaintiff also sought damages for alleged violations of free speech rights and other claims.

The court decided in favor of defendants. It dismissed plaintiff’s federal claims under the First Amendment and the Lanham Act and declined to exercise jurisdiction over the state law claims. Additionally, the court denied plaintiff’s motion for a preliminary injunction.

The court ruled that defendants, as private entities, were not state actors and therefore not bound by the First Amendment. It found that YouTube’s platform, even if widely used for public discourse, does not transform it into a public forum subject to constitutional free speech protections. Regarding the Lanham Act, the court concluded that statements about YouTube being a platform for free expression were non-actionable “puffery” and not specific enough to be considered false advertising.

In dismissing plaintiff’s state law claims, the court noted that they raised complex issues of California law better suited for state courts. This decision left open the possibility for plaintiff to amend its complaint or pursue claims in state court.

Three reasons why this case matters:

  • Clarification of First Amendment Limits: The ruling reinforces that constitutional free speech protections apply only to government actors, not private companies.
  • Role of Platforms in Content Moderation: The case highlights ongoing debates about the responsibilities of tech companies in regulating content and their impact on public discourse.
  • Defining Puffery vs. Advertising: The court’s finding that statements about neutrality were mere puffery provides insight into how courts assess claims of false advertising.

Prager University v. Google LLC, 2018 WL 1471939 (N.D. Cal. March 26, 2018)
