Section 230 protected Meta from Huckabee cannabis lawsuit

Mike Huckabee, the former governor of Arkansas, sued Meta Platforms, Inc., the parent company of Facebook, for using his name and likeness without his permission in advertisements for CBD products. Huckabee argued that these ads falsely claimed he endorsed the products and made misleading statements about his personal health. He asked the court to hold Meta accountable under various legal theories, including violation of his publicity rights and privacy.

Plaintiff alleged that defendant approved and maintained advertisements that misappropriated plaintiff’s name, image, and likeness. Plaintiff further claimed that the ads placed plaintiff in a false light by attributing statements and endorsements to him that he never made. Additionally, plaintiff argued that defendant had been unjustly enriched by profiting from these misleading ads. Defendant, however, sought to dismiss the claims, relying on the Communications Decency Act at 47 U.S.C. § 230, which grants immunity to platforms for third-party content.

The court granted Meta’s motion to dismiss. It determined that Section 230 shielded defendant from liability for the third-party content at issue. The court also noted that plaintiff’s allegations lacked the specificity needed to overcome the protections provided by Section 230. Furthermore, the court emphasized that federal law such as Section 230 preempts conflicting state laws, including Arkansas’s Frank Broyles Publicity Protection Act.

Three reasons why this case matters:

  • Defines Section 230 Protections: It reaffirms the broad immunity tech companies enjoy under Section 230, even in cases involving misuse of publicity rights.
  • Digital Rights and Privacy: The case highlights the tension between protecting individual rights and maintaining the free flow of online content.
  • Challenges for State Laws: It shows how federal law can preempt state-specific protections, leaving individuals with limited recourse.

Mike Huckabee v. Meta Platforms, Inc., 2024 WL 4817657 (D. Del. Nov. 18, 2024)

Disabled veteran’s $77 billion lawsuit against Amazon dismissed

A disabled Army veteran sued Amazon alleging “cyberstalking” and “cyberbullying” on its gaming platform, New World. Plaintiff claimed Amazon allowed other players and employees to engage in harassment, culminating in his being banned from the platform after he had invested over 10,000 hours and $1,700 in it. Plaintiff sought $7 billion in compensatory damages and $70 billion in punitive damages, asserting claims for intentional infliction of emotional distress, gross negligence, and unfair business practices. Plaintiff also filed motions for a preliminary injunction to reinstate his gaming account and to remand the case to state court.

The court, however, dismissed the case. It granted plaintiff in forma pauperis status, allowing him to proceed without paying court fees, but ruled that his complaint failed to state any claim upon which relief could be granted. The court found no grounds for allowing plaintiff to amend the complaint, as any amendment would be futile.

The court based its dismissal on several legal principles. First, it found that Amazon was immune from liability under the Communications Decency Act at 47 U.S.C. § 230 for any content posted by third-party users on the New World platform. Section 230 protects providers of interactive computer services from being treated as publishers or speakers of user-generated content, even if they moderate or fail to moderate that content.

Second, plaintiff’s claims about Amazon employees’ conduct were legally insufficient. His allegations, such as complaints about bad customer service and being banned from the platform, failed to meet the standard for intentional infliction of emotional distress, which requires conduct so outrageous it exceeds all bounds tolerated in a civilized society. Similarly, plaintiff’s gross negligence claims did not demonstrate any extreme departure from reasonable conduct.

Finally, in the court’s view, plaintiff’s claim under California’s Unfair Competition Law (UCL) lacked the necessary specificity. The court found that poor customer service and banning a user from a platform did not constitute unlawful, unfair, or fraudulent business practices under the UCL.

Three reasons why this case matters:

  • Clarifies Section 230 Protections: The case reinforces the broad immunity granted to online platforms for third-party content under Section 230, even when moderation decisions are involved.
  • Defines the Limits of Tort Law in Online Interactions: It highlights the high bar plaintiffs must meet to succeed on claims such as intentional infliction of emotional distress and gross negligence in digital contexts.
  • Sets Guidance for Gaming Platform Disputes: The decision underscores the limited liability of companies for banning users or providing subpar customer support, offering guidance for similar lawsuits.

Haymore v. Amazon.com, Inc., 2024 WL 4825253 (E.D. Cal., Nov. 19, 2024)

Section 230 saves eBay from liability for violation of environmental laws

The United States government sued eBay for alleged violations of environmental regulations, claiming the online marketplace facilitated the sale of prohibited products in violation of the Clean Air Act (CAA), the Toxic Substances Control Act (TSCA), and the Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA). According to the government’s complaint, eBay allowed third-party sellers to list and distribute items that violated these statutes, including devices that tamper with vehicle emissions controls, products containing methylene chloride used in paint removal, and unregistered pesticides.

eBay moved to dismiss, arguing that the government had failed to adequately state a claim under the CAA, TSCA, and FIFRA, and further contended that eBay was shielded from liability under Section 230 of the Communications Decency Act (CDA), 47 U.S.C. § 230(c).

The court granted eBay’s motion to dismiss. It held that eBay was immune from liability because of Section 230, which protects online platforms in most situations from being held liable as publishers of third-party content. The court determined that, as a marketplace, eBay did not “sell” or “offer for sale” the products in question in the sense required by the environmental statutes, since it did not possess, own, or transfer title of the items listed by third-party sellers.

The court found that Section 230 provided broad immunity for eBay’s role as an online platform, preventing it from being treated as the “publisher or speaker” of content provided by its users. As the government sought to impose liability based on eBay’s role in hosting third-party listings, the court concluded that the claims were barred under the CDA.

United States of America v. eBay Inc., 2024 WL 4350523 (E.D.N.Y. September 30, 2024)

No Section 230 immunity for Facebook on contract-related claims

Plaintiffs sued Meta, claiming that they were harmed by fraudulent third-party ads posted on Facebook. Plaintiffs argued that these ads violated Meta’s own terms of service, which prohibit deceptive advertisements. They accused Meta of allowing scammers to run ads that targeted vulnerable users and of prioritizing revenue over user safety. Meta moved to dismiss, claiming that it was immune from liability under 47 U.S.C. § 230(c)(1) (a portion of the Communications Decency Act (CDA)), which generally protects internet platforms from being held responsible for third-party content.

Plaintiffs asked the district court to hold Meta accountable for five claims: negligence, breach of contract, breach of the covenant of good faith and fair dealing, violation of California’s Unfair Competition Law (UCL), and unjust enrichment. They alleged that Meta not only failed to remove scam ads but actively solicited them, particularly from advertisers based in China, who accounted for a large portion of the fraudulent activity on the platform.

The district court held that § 230(c)(1) protected Meta from all claims, even the contract claims. Plaintiffs sought review with the Ninth Circuit.

On appeal, the Ninth Circuit affirmed that § 230(c)(1) provided Meta with immunity for the non-contract claims, such as negligence and UCL violations, because these claims treated Meta as a publisher of third-party ads. But the Ninth Circuit disagreed with the district court’s ruling on the contract-related claims. It held that the lower court had applied the wrong legal standard when deciding whether § 230(c)(1) barred those claims. So the court vacated the dismissal of the contract claims, explaining that contract claims were different because they arose from Meta’s promises to users, not from its role as a publisher. The case was remanded to the district court to apply the correct standard to the contract claims.

Three reasons why this case matters:

  • It clarifies that § 230(c)(1) of the CDA does not provide blanket immunity for all types of claims, especially contract-related claims.
  • The case underscores the importance of holding internet companies accountable for their contractual promises to users, even when they enjoy broad protections for third-party content.
  • It shows that courts continue to wrestle with the boundaries of platform immunity under the CDA, which could shape future rulings about online platforms’ responsibilities.

Calise v. Meta Platforms, Inc., 103 F.4th 732 (9th Cir., June 4, 2024)

Section 230 protected President Trump from defamation liability

Plaintiff sued the Trump campaign, some of the President’s advisors, and several conservative media outlets, asserting claims for defamation. Plaintiff, an employee of voting systems maker Dominion, claimed defendants defamed him by asserting that he had said he would make sure Trump did not win the 2020 election.

The Trump campaign had argued that two retweets – one by Donald Trump and another by his son Eric – could not form the basis for liability because Section 230 shielded the two from liability. The lower court rejected the Section 230 argument. But on review, the Colorado Court of Appeals held that Section 230 immunity should apply to these retweets.

Section 230 shields users of interactive computer services from liability arising from information provided by third parties. The facts of the case showed that both President Trump and Eric Trump simply retweeted a Gateway Pundit article and a One America News Network article without adding any new defamatory content.

The court specifically rejected plaintiff’s argument that Section 230 immunity should not apply because of the Trump defendants’ knowledge that the retweeted information was defamatory. The court looked to a broader consensus of courts holding that no such exception is woven into Section 230 immunity.

The case supports the proposition that defendants could repost verbatim content that someone else generated – even with knowledge that the content is defamatory – and not face liability.

Coomer v. Donald J. Trump for President, Inc., — P.3d —, 2024 WL 1560462  (Colo. Ct. App. April 11, 2024)

Section 230 and … Environmental Law?

Here is a recent case that is interesting because the court applied Section 230 in a context where (as far as this author knows) it has not been applied before – the Clean Air Act.

The Clean Air Act makes it illegal for a person, including a company, “to manufacture or sell” a “part or component intended for use with … any motor vehicle” if “a principal effect” of the part or component is to “defeat” emissions controls “and where the person knows or should know” that it is “put to such use.” 42 U.S.C. § 7522(a)(3)(B).

And we know that our old friend Section 230 – a part of the Communications Decency Act (47 U.S.C. § 230(c)(1)) – commands that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” This works to establish broad federal immunity to any cause of action that would make service providers liable for information originating with a third-party user of the service.

Defendants’ product was used to defeat emissions testing

In the case of United States v. EzLynk Sezc, 2024 WL 1349224 (S.D.N.Y., March 28, 2024), the federal government filed suit over the sale of the “EZ Lynk System.” The system consisted of three parts: hardware that connected to a car to reprogram software used in emissions testing; a cloud-based service where users could upload “delete tunes,” software used to defeat the emissions control software; and a mobile app to coordinate the hardware and the cloud-based software.

Defendants moved to dismiss, arguing that they were immune under Section 230. The court granted the motion.

Section 230 immunity

The court noted that to satisfy the test for immunity: (1) the defendant must be a provider or user of an interactive computer service; (2) the claim must be based on information provided by another information content provider; and (3) “the claim would treat the defendant as the publisher or speaker of that information.” It found that all three of these elements were met.

The system was an interactive computer service

On the question of whether defendants provided an interactive computer service, the court rejected the government’s suggestion that Section 230’s immunity was limited to social media platforms. “Software is information, albeit presented in code. The Complaint alleges the EZ Lynk Cloud is a platform on which people exchange information in the form of software. . . . Thus, according to the government’s own account of the nature of an interactive computer service, the Complaint alleges that the EZ Lynk Defendants provide an interactive computer service.”

Claim based on information provided by third parties, of which defendants were not the speaker

Seeking to avoid Section 230 immunity, the government sought to hold defendants liable for their own conduct. It claimed defendants were themselves information content providers who bore responsibility for the creation and installation of the delete tunes. But the court looked to the language of the complaint itself, which expressly alleged that the delete tunes were created by third-party companies and individuals. And the court found it could not infer from the allegations in the complaint that defendants collaborated with the third-party software providers who uploaded the delete tunes. The court likewise rejected the government’s assertion that defendants’ online technical-support communications and social media activity amounted to misconduct on defendants’ part.

United States v. EzLynk Sezc, 2024 WL 1349224 (S.D.N.Y., March 28, 2024)

CCPA claim against Apple thrown out on Section 230 grounds

Plaintiffs sued Apple after downloading a malicious app from the App Store. The claims included violation of the Computer Fraud and Abuse Act (“CFAA”), the Electronic Communications Privacy Act (“ECPA”), and the California Consumer Privacy Act (“CCPA”). (Alphabet soup, anyone?)

The lower court granted Apple’s motion to dismiss these claims. Plaintiffs sought review with the Ninth Circuit Court of Appeals. On appeal, the court held that the lower court properly applied Section 230 immunity to dismiss these claims.

What Section 230 does

Section 230 (47 U.S.C. § 230) instructs that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” A defendant is not liable if it can show that (1) it is a provider of “interactive computer services” as defined by the statute, (2) the claim relates to “information provided by another information content provider,” and (3) the claim seeks to hold defendant liable as the “publisher or speaker” of that information.

Why the CFAA and ECPA claims were dismissed

In this case, concerning the CFAA and ECPA claims, the court looked to Barnes v. Yahoo!, Inc., 570 F.3d 1096 (9th Cir. 2009) and concluded that the lower court properly found Section 230 immunity to apply. The duty that plaintiffs alleged Apple violated derived from Apple’s status or conduct as a “publisher or speaker.” It found that the claims referred, as the basis for culpability, to Apple’s authorization, monitoring, or failure to remove the offending app from the App Store. “Because these are quintessential ‘publication decisions’ under Barnes, 570 F.3d at 1105, liability is barred by section 230(c)(1).”

Section 230 knocked out CCPA claim too

The data privacy count included allegations that Apple violated duties to “implement reasonable security procedures and practices” to protect the personal information of App Store users, in violation of Cal. Civ. Code § 1798.100(e). The court said that it need not decide whether violations of such duties can be boiled down to publication activities in every instance or whether implementation of reasonable security policies and practices would always necessarily require an internet company to monitor third-party content. Citing Lemmon v. Snap, Inc., 995 F.3d 1085 (9th Cir. 2021), the court found that in this case, at least, plaintiffs failed to plead adequately a theory of injury under the CCPA that was “fully independent of [Apple’s] role in monitoring or publishing third-party content.”

Diep v. Apple, Inc., 2024 WL 1299995 (9th Cir. March 27, 2024)

Section 230 immunity protected provider of ringless voicemail services to telemarketers

Defendant, a telecommunications services provider, offered ringless voicemail services and VoIP services to telemarketers. These services enabled telemarketers to mass deliver prerecorded messages directly to recipients’ voicemail inboxes without causing the recipients’ phones to ring or giving recipients the opportunity to answer or block the call.

The federal government sued a couple of telemarketers and defendant alleging violation of the FTC Act, which prohibits unfair or deceptive acts or practices in commerce. Defendant moved to dismiss the action, arguing that Section 230 provided it immunity from liability. The court granted the motion.

Section 230 immunity

Section 230(c) (at 47 U.S.C. § 230(c)) provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

Defendant asserted it met the criteria for Section 230 immunity because of (1) its role as an interactive computer service, (2) the way the government’s claims sought to treat it as a publisher or speaker of the allegedly unlawful calls, and (3) the potential liability was based on third party content (the calls being placed by the other telemarketing defendants).

Ringless voicemail services were an “interactive computer service”

The government argued defendant was not an “interactive computer service” because recipients accessed their voicemails through their telephones rather than a computer. The court rejected this argument, finding that defendant had shown that it transmitted content and provided access to multiple users to a computer server, thereby meeting the statutory definition of an interactive computer service.

Lawsuit sought to treat defendant as a publisher or speaker

The government next argued that its claims against defendant did not seek to treat defendant as the publisher or speaker of content, because defendant’s liability did not depend on the content of the transmitted messages. The court rejected this argument as well, because it was indeed the content that gave rise to liability – had the voicemails at issue not been for commercial purposes, they would not have been unlawful, and the matter would not have been brought in the first place.

Allegations related to the content of unlawful voicemails

Finally, as for the third element of Section 230 immunity – the offending content being provided by a third party – the court also sided with defendant. “While [defendant] developed the ringless voicemail technology at issue, that development goes to how the third-party content is distributed rather than the content itself.”

United States v. Stratics Networks Inc., 2024 WL 966380 (S.D. Cal., March 6, 2024)

Woz gets another (small) bite at the apple in YouTube bitcoin scam case

Apple co-founder Steve Wozniak sued YouTube and Google asserting various causes of action, including misappropriation of likeness, fraud, and negligence. The case arose from a common scam on YouTube, where popular channels are hijacked to show fake videos of a celebrity hosting a live event during which viewers are falsely told that anyone who sends cryptocurrency to a specified account will receive twice as much in return. Woz’s YouTube account was hijacked for these purposes, and several of the resulting victims joined him in the lawsuit.

The lower court tossed the case, holding that YouTube and Google were not liable because of Section 230 – which provides that the platforms could not be liable for the third-party content giving rise to the scam. Woz and the other plaintiffs sought review with the California Court of Appeal, which largely agreed with the lower court on the Section 230 issue, except for one part. The court allowed plaintiffs to file an amended complaint on this one issue.

Plaintiffs claimed that Google and YouTube contributed to scam ads and videos, thereby positioning defendants outside Section 230 immunity. They argued, among other things, that YouTube displayed false verification badges, thereby becoming active content providers contributing to the scam’s fraudulent nature.

The court found that although plaintiffs’ complaint suggested that defendants’ actions could strip them of Section 230 immunity by implying a level of endorsement or authenticity, the allegations were too conclusory as written to establish defendants as information content providers. So the court allowed for the possibility of amending these claims, indicating that a more detailed argument might better establish defendants’ direct contribution to the content’s illegality.

Wozniak v. YouTube, LLC, — Cal.Rptr.3d —, 2024 WL 1151750 (Cal.App. 6th Dist., March 15, 2024)

Section 230 protects Snapchat against lawsuit brought by assault victim

A young girl named C.O. found much misfortune using Snapchat. Her parents (the plaintiffs in this lawsuit) alleged that the app’s features caused her to become addicted to the app, to be exposed to sexual content, and to eventually be victimized on two occasions, including once by a registered sex offender.

Suing Snapchat

Plaintiffs sued Snap and related entities, asserting claims including strict product liability, negligence, and invasion of privacy, emphasizing the platform’s failure to protect minors and address reported abuses. Defendants moved to strike the complaint.

The court granted the motion to strike. It held that the allegations of the complaint fell squarely within the ambit of immunity afforded under Section 230 to “an interactive computer service” that acts as a “publisher or speaker” of information provided by another “information content provider.” Plaintiffs “clearly allege[d] that the defendants failed to regulate content provided by third parties” when such third parties used Snapchat to harm plaintiff.

Publisher or speaker? How about those algorithms!

Plaintiffs had argued that their claims did not seek to treat defendants as publishers or speakers, and therefore Section 230 immunity did not apply. Instead, plaintiffs argued, they were asserting claims that defendants breached their duty as manufacturers to design a reasonably safe product.

Of particular interest was the plaintiffs’ claim concerning Snapchat’s algorithms, which recommended connections and which allegedly caused children to become addicted. But in line with the case of Force v. Facebook, Inc., 934 F.3d 53 (2nd Cir. 2019), the court refused to find that use of algorithms in this way was outside the traditional role of a publisher. It was careful to distinguish the case from Lemmon v. Snap, Inc., 995 F.3d 1085 (9th Cir. 2021), in which that court held Section 230 did not immunize Snapchat from products liability claims. In that case, the harm to plaintiffs did not result from third-party content but rather from the design of the platform, which tempted users to drive fast. In this case, the harm to plaintiffs was the result of particular actions of third parties who used Snapchat to transmit content to lure C.O.

Sad facts, sad result

The court seemed to express some trepidation about its result, using the same language the First Circuit Court of Appeals used in Jane Doe No. 1 v. Backpage.com, LLC, 817 F.3d 12, 15 (1st Cir. 2016): “This is a hard case – hard not in the sense that the legal issues defy resolution, but hard in the sense that the law requires that [the court] … deny relief to plaintiffs whose circumstances evoke outrage.” And citing Vazquez v. Buhl, 90 A.3d 331 (2014), the court observed that “[w]ithout further legislative action, however, there is little [this court can] do in [its] limited role but join with other courts and commentators in expressing [its] concern with the statute’s broad scope.”

V.V. v. Meta Platforms, Inc. et al., 2024 WL 678248 (Conn. Super. Ct., February 16, 2024)
