Claims against porn sites dismissed because of Section 230 immunity

Plaintiffs sued several internet pornography companies after they discovered that videos secretly recorded of them while changing in a college locker room had been uploaded online.
Plaintiffs asked the court to hold the defendants liable under several theories, including civil conspiracy, negligent monitoring, and violations of the Trafficking Victims Protection Reauthorization Act (TVPRA).
The court granted summary judgment in favor of defendants.
The court held that Section 230 of the Communications Decency Act shielded the defendants from liability for user-generated content, and plaintiffs failed to show that any of the defendants materially contributed to the illegal aspects of the videos. The court also found no evidence of a conspiracy or that defendants met the requirements to be considered beneficiaries of a sex trafficking venture under the TVPRA. Claims against defendants who merely licensed trademarks or placed ads were also rejected for lack of personal jurisdiction or insufficient evidence of wrongdoing.
Jane Does 1–9 v. Collins Murphy, et al., No. 7:20-cv-00947-DCC, 2025 WL 2533961 (D.S.C. Sept. 3, 2025).
Court gives X opportunity to raise Section 230 claim in deepfake case

X sued Minnesota Attorney General Keith Ellison over a state law that prohibits the dissemination of AI-generated political deepfakes, arguing the statute violates the First and Fourteenth Amendments and is preempted by the Communications Decency Act at 47 U.S.C. 230. A related case challenging the same law is already on appeal in Kohls v. Ellison, leading the court to stay X’s constitutional claims while allowing its Section 230 claim to move forward. The court invited both parties to file motions for judgment on the pleadings within 30 days. If neither does so, the entire case will be stayed pending resolution of the Kohls appeal.
X Corp. v. Ellison, 2025 WL 1833455 (D. Minn. July 3, 2025)
Content moderation lapses did not make hookup app liable for misrepresentation

App’s general statement that it would provide a “safe and secure environment” did not amount to a promise for which plaintiff could assert Barnes-style misrepresentation and thereby avoid the app’s Section 230 immunity.
Plaintiff – an underage user – sued Grindr based on injuries he suffered from meeting up with four different men with whom he had connected on the platform. One of the claims plaintiff brought was for negligent misrepresentation. Defendant stated on the app that it was “designed to create a safe and secure environment for its users,” and plaintiff alleged that defendant failed to do so.
Defendant moved to dismiss this claim under 47 U.S.C. §230, which provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” 47 U.S.C. § 230(c)(1). The district court granted the motion and plaintiff sought review with the Ninth Circuit.
On appeal, the Ninth Circuit affirmed the dismissal under Section 230. In certain situations, a promise by an online platform to do something can form the basis of a claim against the platform that will not be barred by Section 230 immunity. For example, in Barnes v. Yahoo!, Inc., 570 F.3d 1096 (9th Cir. 2009), Section 230 did not protect Yahoo against a claim that it failed – despite its promise to do so – to take down indecent profiles impersonating the plaintiff in that case. And in Estate of Bride v. Yolo Technologies, Inc., 112 F.4th 1168 (9th Cir. 2024), plaintiffs’ negligent misrepresentation claims were not subject to Section 230 immunity where the platform promised to unmask anonymous harassing users but failed to do so.
In this case, however, the Court saw the situation differently than it did in either Barnes or Estate of Bride. In those cases, plaintiffs were seeking to hold defendants liable for specific promises or representations. In this case, by contrast, Grindr’s general statement that its app was designed to create a safe and secure environment was a description of its moderation policy and thus protected from liability under Section 230.
Doe v. Grindr, Inc., 128 F.4th 1148 (9th Cir. February 18, 2025)
Section 230 protected Meta from Huckabee cannabis lawsuit

Mike Huckabee, the former governor of Arkansas, sued Meta Platforms, Inc., the parent company of Facebook, for using his name and likeness without his permission in advertisements for CBD products. Huckabee argued that these ads falsely claimed he endorsed the products and made misleading statements about his personal health. He asked the court to hold Meta accountable under various legal theories, including violations of his rights of publicity and privacy.
Plaintiff alleged that defendant approved and maintained advertisements that misappropriated plaintiff’s name, image, and likeness. Plaintiff further claimed that the ads placed plaintiff in a false light by attributing statements and endorsements to him that he never made. Additionally, plaintiff argued that defendant had been unjustly enriched by profiting from these misleading ads. Defendant, however, sought to dismiss the claims, relying on the Communications Decency Act at 47 U.S.C. 230, which grants immunity to platforms for third-party content.
The court granted Meta’s motion to dismiss. It determined that Section 230 shielded defendant from liability for the third-party content at issue. The court also noted that plaintiff’s allegations lacked the specificity needed to overcome the protections provided by Section 230. Furthermore, the court emphasized that federal law, such as Section 230, preempts conflicting state laws, such as Arkansas’s Frank Broyles Publicity Protection Act.
Three reasons why this case matters:
- Defines Section 230 Protections: It reaffirms the broad immunity tech companies enjoy under Section 230, even in cases involving misuse of publicity rights.
- Digital Rights and Privacy: The case highlights the tension between protecting individual rights and maintaining the free flow of online content.
- Challenges for State Laws: It shows how federal law can preempt state-specific protections, leaving individuals with limited recourse.
Mike Huckabee v. Meta Platforms, Inc., 2024 WL 4817657 (D. Del. Nov. 18, 2024)
Disabled veteran’s $77 billion lawsuit against Amazon dismissed

A disabled Army veteran sued Amazon alleging “cyberstalking” and “cyberbullying” on its gaming platform, New World. Plaintiff claimed Amazon allowed other players and employees to engage in harassment, culminating in his being banned from the platform after he had invested over 10,000 hours and $1,700 in it. Plaintiff sought $7 billion in compensatory damages and $70 billion in punitive damages, asserting claims for intentional infliction of emotional distress, gross negligence, and unfair business practices. Plaintiff also filed motions for a preliminary injunction to reinstate his gaming account and to remand the case to state court.
The court, however, dismissed the case. It granted plaintiff in forma pauperis status, allowing him to proceed without paying court fees, but ruled that his complaint failed to state any claim upon which relief could be granted. The court found no grounds for allowing plaintiff to amend the complaint, as any amendment would be futile.
The court’s dismissal rested on several legal principles. First, it found that Amazon was immune from liability under the Communications Decency Act at 47 U.S.C. §230 for any content posted by third-party users on the New World platform. Section 230 protects providers of interactive computer services from being treated as publishers or speakers of user-generated content, even if they moderate or fail to moderate that content.
Second, plaintiff’s claims about Amazon employees’ conduct were legally insufficient. His allegations, such as complaints about bad customer service and being banned from the platform, failed to meet the standard for intentional infliction of emotional distress, which requires conduct so outrageous it exceeds all bounds tolerated in a civilized society. Similarly, plaintiff’s gross negligence claims did not demonstrate any extreme departure from reasonable conduct.
Finally, in the court’s view, plaintiff’s claim under California’s Unfair Competition Law (UCL) lacked the necessary specificity. The court found that poor customer service and banning a user from a platform did not constitute unlawful, unfair, or fraudulent business practices under the UCL.
Three Reasons Why This Case Matters
- Clarifies Section 230 Protections: The case reinforces the broad immunity granted to online platforms for third-party content under Section 230, even when moderation decisions are involved.
- Defines the Limits of Tort Law in Online Interactions: It highlights the high bar plaintiffs must meet to succeed on claims such as intentional infliction of emotional distress and gross negligence in digital contexts.
- Sets Guidance for Gaming Platform Disputes: The decision underscores the limited liability of companies for banning users or providing subpar customer support, offering guidance for similar lawsuits.
Haymore v. Amazon.com, Inc., 2024 WL 4825253 (E.D. Cal., Nov. 19, 2024)
Section 230 saves eBay from liability for violation of environmental laws

The United States government sued eBay for alleged violations of environmental regulations, claiming the online marketplace facilitated the sale of prohibited products in violation of the Clean Air Act (CAA), the Toxic Substances Control Act (TSCA), and the Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA). According to the government’s complaint, eBay allowed third-party sellers to list and distribute items that violated these statutes, including devices that tamper with vehicle emissions controls, products containing methylene chloride used in paint removal, and unregistered pesticides.
eBay moved to dismiss, arguing that the government had failed to adequately state a claim under the CAA, TSCA, and FIFRA, and further contended that eBay was shielded from liability under Section 230 of the Communications Decency Act (CDA), 47 U.S.C. 230(c).
The court granted eBay’s motion to dismiss. It held that eBay was immune from liability because of Section 230, which protects online platforms in most situations from being held liable as publishers of third-party content. The court determined that, as a marketplace, eBay did not “sell” or “offer for sale” the products in question in the sense required by the environmental statutes, since it did not possess, own, or transfer title of the items listed by third-party sellers.
The court found that Section 230 provided broad immunity for eBay’s role as an online platform, preventing it from being treated as the “publisher or speaker” of content provided by its users. As the government sought to impose liability based on eBay’s role in hosting third-party listings, the court concluded that the claims were barred under the CDA.
United States of America v. eBay Inc., 2024 WL 4350523 (E.D.N.Y. September 30, 2024)
No Section 230 immunity for Facebook on contract-related claims

Plaintiffs sued Meta, claiming that they were harmed by fraudulent third-party ads posted on Facebook. Plaintiffs argued that these ads violated Meta’s own terms of service, which prohibit deceptive advertisements. They accused Meta of allowing scammers to run ads that targeted vulnerable users and of prioritizing revenue over user safety. Meta moved to dismiss, claiming that it was immune from liability under 47 U.S.C. § 230(c)(1) (a portion of the Communications Decency Act (CDA)), which generally protects internet platforms from being held responsible for third-party content.
Plaintiffs asked the district court to hold Meta accountable for five claims: negligence, breach of contract, breach of the covenant of good faith and fair dealing, violation of California’s Unfair Competition Law (UCL), and unjust enrichment. They alleged that Meta not only failed to remove scam ads but actively solicited them, particularly from advertisers based in China, who accounted for a large portion of the fraudulent activity on the platform.
The district court held that § 230(c)(1) protected Meta from all claims, even the contract claims. Plaintiffs sought review with the Ninth Circuit.
On appeal, the Ninth Circuit affirmed that § 230(c)(1) provided Meta with immunity for the non-contract claims, such as negligence and UCL violations, because these claims treated Meta as a publisher of third-party ads. But the Ninth Circuit disagreed with the district court’s ruling on the contract-related claims. It held that the lower court had applied the wrong legal standard when deciding whether § 230(c)(1) barred those claims. So the court vacated the dismissal of the contract claims, explaining that contract claims were different because they arose from Meta’s promises to users, not from its role as a publisher. The case was remanded back to the district court to apply the correct standard for the contract claims.
Three reasons why this case matters:
- It clarifies that § 230(c)(1) of the CDA does not provide blanket immunity for all types of claims, especially contract-related claims.
- The case underscores the importance of holding internet companies accountable for their contractual promises to users, even when they enjoy broad protections for third-party content.
- It shows that courts continue to wrestle with the boundaries of platform immunity under the CDA, which could shape future rulings about online platforms’ responsibilities.
Calise v. Meta Platforms, Inc., 103 F.4th 732 (9th Cir., June 4, 2024)
Section 230 protected President Trump from defamation liability

Plaintiff sued the Trump campaign, some of the President’s advisors, and several conservative media outlets, asserting claims for defamation. Plaintiff – an employee of voting systems maker Dominion – claimed defendants defamed him by asserting that he had said he was going to make sure Trump would not win the 2020 election.
The Trump campaign had argued that two retweets – one by Donald Trump and another by his son Eric – could not form the basis for liability because Section 230 shielded the two from liability. The lower court rejected the Section 230 argument. But on review, the Colorado Court of Appeals held that Section 230 immunity should apply to these retweets.
Section 230 shields users of interactive computer services from liability arising from information provided by third parties. The facts of the case showed that both President Trump and Eric Trump simply retweeted a Gateway Pundit article and a One America Network article without adding any new defamatory content.
The court specifically rejected plaintiff’s argument that Section 230 immunity should not apply because the Trump defendants knew the retweeted information was defamatory. The court looked to the broader consensus of courts holding that no such knowledge exception is woven into Section 230 immunity.
The case supports the proposition that defendants could repost verbatim content that someone else generated – even with knowledge that the content is defamatory – and not face liability.
Coomer v. Donald J. Trump for President, Inc., — P.3d —, 2024 WL 1560462 (Colo. Ct. App. April 11, 2024)
Section 230 and … Environmental Law?

Here is a recent case that is interesting because the court applied Section 230 to a situation (as far as this author knows) in which Section 230 has not been applied before – the Clean Air Act.
The Clean Air Act makes it illegal for a person, including a company, “to manufacture or sell” a “part or component intended for use with … any motor vehicle” if “a principal effect” of the part or component is to “defeat” emissions controls “and where the person knows or should know” that it is “put to such use.” 42 U.S.C. § 7522(a)(3)(B).
And we know that our old friend Section 230 – a part of the Communications Decency Act (47 U.S.C. § 230(c)(1)) – commands that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” This works to establish broad federal immunity to any cause of action that would make service providers liable for information originating with a third-party user of the service.
Defendants’ product was used to defeat emissions testing
In United States v. EzLynk Sezc, 2024 WL 1349224 (S.D.N.Y., March 28, 2024), the federal government filed suit over the sale of the “EZ Lynk System.” The system comprised three parts: hardware that would connect to a car to reprogram the software used in emissions testing; a cloud-based service where users could upload “delete tunes” – software used to defeat the emissions control software; and a mobile app to coordinate the hardware and the cloud-based software.
Defendants moved to dismiss, arguing that they were immune under Section 230. The court granted the motion.
Section 230 immunity
The court noted that to satisfy the test for immunity: (1) the defendant must be a provider or user of an interactive computer service; (2) the claim must be based on information provided by another information content provider; and (3) “the claim would treat the defendant as the publisher or speaker of that information.” It found that all three of these elements were met.
The system was an interactive computer service
On the question of whether defendants provided an interactive computer service, the court rejected the government’s suggestion that Section 230’s immunity was limited to social media platforms. “Software is information, albeit presented in code. The Complaint alleges the EZ Lynk Cloud is a platform on which people exchange information in the form of software. . . . Thus, according to the government’s own account of the nature of an interactive computer service, the Complaint alleges that the EZ Lynk Defendants provide an interactive computer service.”
Claim based on information provided by third parties, of which defendants were not the speaker
Seeking to avoid Section 230 immunity, the government sought to hold defendants liable for their own conduct, claiming that defendants were themselves information content providers who bore responsibility for the creation and installation of the delete tunes. But the court looked to the language of the complaint itself, which expressly alleged that the delete tunes were created by third-party companies and individuals. Nor could the court infer from the allegations in the complaint that defendants collaborated with the third-party software providers who uploaded the delete tunes. The court likewise rejected the government’s assertion that defendants’ online technical support communications and social media activity made defendants themselves responsible for any misconduct.
United States v. EzLynk Sezc, 2024 WL 1349224 (S.D.N.Y., March 28, 2024)