Disabled veteran’s $77 billion lawsuit against Amazon dismissed


A disabled Army veteran sued Amazon alleging “cyberstalking” and “cyberbullying” on its gaming platform, New World. Plaintiff claimed Amazon allowed other players and employees to engage in harassment, culminating in his being banned from the platform after he had invested over 10,000 hours and $1,700 in the game. Plaintiff sought $7 billion in compensatory damages and $70 billion in punitive damages, asserting claims for intentional infliction of emotional distress, gross negligence, and unfair business practices. Plaintiff also filed motions for a preliminary injunction to reinstate his gaming account and to remand the case to state court.

The court, however, dismissed the case. It granted plaintiff in forma pauperis status, allowing him to proceed without paying court fees, but ruled that his complaint failed to state any claim upon which relief could be granted. The court found no grounds for allowing plaintiff to amend the complaint, as any amendment would be futile.

The dismissal rested on several legal principles. First, the court found that Amazon was immune from liability under the Communications Decency Act, 47 U.S.C. § 230, for any content posted by third-party users on the New World platform. Section 230 protects providers of interactive computer services from being treated as publishers or speakers of user-generated content, even if they moderate or fail to moderate that content.

Second, plaintiff’s claims about Amazon employees’ conduct were legally insufficient. His allegations, such as complaints about bad customer service and being banned from the platform, failed to meet the standard for intentional infliction of emotional distress, which requires conduct so outrageous it exceeds all bounds tolerated in a civilized society. Similarly, plaintiff’s gross negligence claims did not demonstrate any extreme departure from reasonable conduct.

Finally, in the court’s view, plaintiff’s claim under California’s Unfair Competition Law (UCL) lacked the necessary specificity. The court found that poor customer service and banning a user from a platform did not constitute unlawful, unfair, or fraudulent business practices under the UCL.

Three Reasons Why This Case Matters

  • Clarifies Section 230 Protections: The case reinforces the broad immunity granted to online platforms for third-party content under Section 230, even when moderation decisions are involved.
  • Defines the Limits of Tort Law in Online Interactions: It highlights the high bar plaintiffs must meet to succeed on claims such as intentional infliction of emotional distress and gross negligence in digital contexts.
  • Sets Guidance for Gaming Platform Disputes: The decision underscores the limited liability of companies for banning users or providing subpar customer support, offering guidance for similar lawsuits.

Haymore v. Amazon.com, Inc., 2024 WL 4825253 (E.D. Cal., Nov. 19, 2024)

Section 230 saves eBay from liability for violation of environmental laws

The United States government sued eBay for alleged violations of environmental regulations, claiming the online marketplace facilitated the sale of prohibited products in violation of the Clean Air Act (CAA), the Toxic Substances Control Act (TSCA), and the Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA). According to the government’s complaint, eBay allowed third-party sellers to list and distribute items that violated these statutes, including devices that tamper with vehicle emissions controls, products containing methylene chloride used in paint removal, and unregistered pesticides.

eBay moved to dismiss, arguing that the government had failed to adequately state a claim under the CAA, TSCA, and FIFRA, and further contended that eBay was shielded from liability under Section 230 of the Communications Decency Act (CDA), 47 U.S.C. 230(c).

The court granted eBay’s motion to dismiss. It held that eBay was immune from liability because of Section 230, which protects online platforms in most situations from being held liable as publishers of third-party content. The court determined that, as a marketplace, eBay did not “sell” or “offer for sale” the products in question in the sense required by the environmental statutes, since it did not possess, own, or transfer title of the items listed by third-party sellers.

The court found that Section 230 provided broad immunity for eBay’s role as an online platform, preventing it from being treated as the “publisher or speaker” of content provided by its users. As the government sought to impose liability based on eBay’s role in hosting third-party listings, the court concluded that the claims were barred under the CDA.

United States of America v. eBay Inc., 2024 WL 4350523 (E.D.N.Y. September 30, 2024)

No Section 230 immunity for Facebook on contract-related claims


Plaintiffs sued Meta, claiming that they were harmed by fraudulent third-party ads posted on Facebook. Plaintiffs argued that these ads violated Meta’s own terms of service, which prohibit deceptive advertisements. They accused Meta of allowing scammers to run ads that targeted vulnerable users and of prioritizing revenue over user safety. Meta moved to dismiss, claiming that it was immune from liability under 47 U.S.C. § 230(c)(1) (a portion of the Communications Decency Act (CDA)), which generally protects internet platforms from being held responsible for third-party content.

Plaintiffs asked the district court to hold Meta accountable for five claims: negligence, breach of contract, breach of the covenant of good faith and fair dealing, violation of California’s Unfair Competition Law (UCL), and unjust enrichment. They alleged that Meta not only failed to remove scam ads but actively solicited them, particularly from advertisers based in China, who accounted for a large portion of the fraudulent activity on the platform.

The district court held that § 230(c)(1) protected Meta from all claims, even the contract claims. Plaintiffs sought review with the Ninth Circuit.

On appeal, the Ninth Circuit affirmed that § 230(c)(1) provided Meta with immunity for the non-contract claims, such as negligence and UCL violations, because these claims treated Meta as a publisher of third-party ads. But the Ninth Circuit disagreed with the district court’s ruling on the contract-related claims. It held that the lower court had applied the wrong legal standard when deciding whether § 230(c)(1) barred those claims. So the court vacated the dismissal of the contract claims, explaining that contract claims were different because they arose from Meta’s promises to users, not from its role as a publisher. The case was remanded to the district court to apply the correct standard to the contract claims.

Three reasons why this case matters:

  • It clarifies that § 230(c)(1) of the CDA does not provide blanket immunity for all types of claims, especially contract-related claims.
  • The case underscores the importance of holding internet companies accountable for their contractual promises to users, even when they enjoy broad protections for third-party content.
  • It shows that courts continue to wrestle with the boundaries of platform immunity under the CDA, which could shape future rulings about online platforms’ responsibilities.

Calise v. Meta Platforms, Inc., 103 F.4th 732 (9th Cir., June 4, 2024)

Section 230 protected President Trump from defamation liability


Plaintiff sued the Trump campaign, some of the President’s advisors, and several conservative media outlets, asserting claims for defamation. Plaintiff – an employee of voting systems maker Dominion – claimed defendants slandered him by asserting that he had said he would make sure Trump would not win the 2020 election.

The Trump campaign had argued that two retweets – one by Donald Trump and another by his son Eric – could not form the basis for liability because Section 230 shielded the two from liability. The lower court rejected the Section 230 argument. But on review, the Colorado Court of Appeals held that Section 230 immunity should apply to these retweets.

Section 230 shields users of interactive computer services from liability arising from information provided by third parties. The facts of the case showed that both President Trump and Eric Trump simply retweeted a Gateway Pundit article and a One America Network article without adding any new defamatory content.

The court specifically rejected plaintiff’s argument that Section 230 immunity should not apply because of the Trump defendants’ knowledge that the retweeted information was defamatory. The court looked to the broader consensus of courts holding that no such knowledge exception is woven into Section 230 immunity.

The case supports the proposition that defendants could repost verbatim content that someone else generated – even with knowledge that the content is defamatory – and not face liability.

Coomer v. Donald J. Trump for President, Inc., — P.3d —, 2024 WL 1560462 (Colo. Ct. App. April 11, 2024)

Section 230 protects Snapchat against lawsuit brought by assault victim


A young girl named C.O. found much misfortune using Snapchat. Her parents (the plaintiffs in this lawsuit) alleged that the app’s features caused her to become addicted to the app, to be exposed to sexual content, and to eventually be victimized on two occasions, including once by a registered sex offender.

Suing Snapchat

Plaintiffs sued Snap and related entities, asserting claims including strict product liability, negligence, and invasion of privacy, emphasizing the platform’s failure to protect minors and address reported abuses. Defendants moved to strike the complaint.

The court granted the motion to strike. It held that the allegations of the complaint fell squarely within the ambit of immunity afforded under Section 230 to “an interactive computer service” that acts as a “publisher or speaker” of information provided by another “information content provider.” Plaintiffs “clearly allege[d] that the defendants failed to regulate content provided by third parties” when such third parties used Snapchat to harm plaintiff.

Publisher or speaker? How about those algorithms!

Plaintiffs had argued that their claims did not seek to treat defendants as publishers or speakers, and therefore Section 230 immunity did not apply. Instead, plaintiffs argued, they were asserting claims that defendants breached their duty as manufacturers to design a reasonably safe product.

Of particular interest was the plaintiffs’ claim concerning Snapchat’s algorithms, which recommended connections and which allegedly caused children to become addicted. But in line with Force v. Facebook, Inc., 934 F.3d 53 (2d Cir. 2019), the court refused to find that use of algorithms in this way was outside the traditional role of a publisher. It was careful to distinguish the case from Lemmon v. Snap, Inc., 995 F.3d 1085 (9th Cir. 2021), in which that court held Section 230 did not immunize Snapchat from products liability claims. In Lemmon, the harm to the plaintiffs did not result from third-party content but rather from the design of the platform, which tempted users to drive fast. Here, by contrast, the harm to plaintiffs resulted from the particular actions of third parties who transmitted content using Snapchat to lure C.O.

Sad facts, sad result

The court seemed to express some trepidation about its result, using the same language the First Circuit Court of Appeals used in Jane Doe No. 1 v. Backpage.com, LLC, 817 F.3d 12, 15 (1st Cir. 2016): “This is a hard case – hard not in the sense that the legal issues defy resolution, but hard in the sense that the law requires that [the court] … deny relief to plaintiffs whose circumstances evoke outrage.” And citing Vazquez v. Buhl, 90 A.3d 331 (2014), the court observed that “[w]ithout further legislative action, however, there is little [this court can] do in [its] limited role but join with other courts and commentators in expressing [its] concern with the statute’s broad scope.”

V.V. v. Meta Platforms, Inc. et al., 2024 WL 678248 (Conn. Super. Ct., February 16, 2024)


Zillow gets win in case alleging fraudulent online auction notices


Plaintiff sued Zillow and some other parties in federal court, claiming they engaged in a conspiracy to defraud her by illegally foreclosing on her home. She apparently claimed that Zillow “illegally” published information regarding the property at issue on its website, including listing it “for auction.”

Zillow moved to dismiss for failure to state a claim. The court granted the motion. It held that Section 230 (47 U.S.C. 230) immunized Zillow from liability. This statute immunizes providers of interactive computer services against liability arising from content created by third parties.

The court found that Zillow was an “interactive computer service,” demonstrated by how its website stated that it is “reimagining the traditional rules of real estate to make it easier than ever to move from one home to the next.”

It also found that plaintiff’s claims sought to hold Zillow liable for posting “auction notices.” But since the court did not believe plaintiff could demonstrate that Zillow developed or created this content, it found that plaintiff’s claims fell squarely within the purview of Section 230.

Choudhuri v. Specialised Loan Servicing, 2024 WL 308258 (N.D. Cal., January 26, 2024)


Section 230 protected Meta from claims of discrimination for taking down Palestinian content


Pro se plaintiff sued Meta seeking to hold it liable for allegedly removing certain “Muslim and/or Palestinian content” while preserving “unspecified Jewish and/or Israeli content” and for allegedly banning certain Muslim users, while allowing unspecified Jewish users to continue using Meta’s services. He brought a civil rights claim for unlawful discrimination on the basis of religion in violation of Title II of the Civil Rights Act of 1964.

Meta moved to dismiss, arguing, among other things, that plaintiff lacked standing. The lower court granted the motion. Plaintiff sought review with the Third Circuit. On appeal, the court affirmed the dismissal.

No “informational injury”

The court observed that plaintiff had not alleged that he owned, created, controlled or had any personal involvement with the removed content other than having previously viewed it. Nor had he alleged any personal involvement with the banned users. Likewise, he had not argued that he was denied the same level of service that Meta offered to all its users. Instead, he had argued that he was entitled to relief as a Muslim being discriminated against by having Muslim-related news removed while Jewish content remained.

The court examined whether plaintiff could establish standing under the “informational injury” doctrine. To establish standing under that doctrine, plaintiff “need[ed] only allege that [he] was denied information to which [he] was legally entitled, and that the denial caused some adverse consequence related to the purpose of the statute.” The court went on to note that an entitlement to the information allegedly withheld is the “sine qua non” of the informational injury doctrine.

It held that plaintiff had failed to establish standing under this doctrine because he did not show that he was legally entitled to the publication of the requested content or the removal of other content. Title II does not create a right to information. And the statute could not be understood as granting him a right to relief because he did not allege that he was personally denied the full and equal enjoyment of Meta’s services. Moreover, plaintiff was without relief under Title II because the statute is limited to physical places of accommodation, and Meta, for purposes of the statute, was not a “place of public accommodation.”

Section 230 Classics

And in any event, 47 U.S.C. § 230 precluded the court from entertaining these claims, which would have sought to hold Meta liable for its exercise of a publisher’s traditional editorial functions – such as deciding whether to publish, withdraw, postpone, or alter content. On this point, the court looked to the classic Section 230 holdings in Green v. America Online (AOL), 318 F.3d 465 (3d Cir. 2003) and Zeran v. America Online, Inc., 129 F.3d 327 (4th Cir. 1997).

Elansari v. Meta, Inc., 2024 WL 163080 (3d Cir. January 16, 2024) (not selected for official publication)


Fifth Circuit dissent issues scathing rebuke of broad Section 230 immunity


Dissenting in the court’s refusal to rehear an appeal en banc, Judge Elrod of the Fifth Circuit Court of Appeals – joined by six of her colleagues – penned an opinion that sharply criticized the broad immunity granted to social media companies under Section 230 of the Communications Decency Act. The dissent emerged in a case involving John Doe, a minor who was sexually abused by his high school teacher, a crime in which the messaging app Snapchat played a pivotal role.

The Core of the Controversy

Section 230 (47 U.S.C. 230) is a provision that courts have long held to shield internet companies from liability for content posted by their users. The dissenting opinion, however, argues that this immunity has been stretched far beyond its intended scope, potentially enabling platforms to evade responsibility even when their design and operations contribute to illegal activities.

Snapchat’s Role in the Abuse Case

Snapchat, owned by Snap, Inc., was used by the teacher to send sexually explicit material to Doe. Doe sought to hold Snap accountable, alleging that Snapchat’s design defects, such as inadequate age-verification mechanisms, indirectly facilitated the abuse. But the lower court, applying previous cases interpreting Section 230, dismissed these claims at the initial stage.

A Critical Examination of Section 230

The dissent criticized the court’s interpretation of Section 230, arguing that it has been applied too broadly to protect social media companies from various forms of liability, including design defects and distributor responsibilities. It highlighted the statute’s original text, which was meant to protect platforms from being deemed publishers or speakers of third-party content, not to shield them from liability for their own conduct.

Varied Interpretations Across Courts

Notably, the dissent pointed out the inconsistency in judicial interpretations of Section 230. While some courts, like the Ninth Circuit, have allowed claims related to design defects to proceed, others have extended sweeping protections to platforms, significantly limiting the scope for holding them accountable.

The Implications for Internet Liability

This case and the resulting dissent underscore a significant legal issue in the digital age: how to balance the need to protect online platforms from excessive liability with ensuring they do not become facilitators of illegal or harmful activities. The dissent suggested that the current interpretation of Section 230 has tipped this balance too far in favor of the platforms, leaving victims like Doe without recourse.

Looking Ahead: The Need for Reevaluation

The dissenting opinion called for a reevaluation of Section 230, urging a return to the statute’s original text and intent. This reexamination – in the court’s view – would be crucial in the face of evolving internet technologies and the increasing role of social media platforms in everyday life. The dissent warned of the dangers of a legal framework that overly shields these powerful platforms while leaving individuals exposed to the risks associated with their operations.

Conclusion

The court’s dissent in this case is a clarion call for a critical reassessment of legal protections afforded to social media platforms. As the internet continues to evolve, the legal system must adapt to ensure that the balance between immunity and accountability is appropriately maintained, safeguarding individuals’ rights without stifling technological innovation and freedom of expression online.

Doe through Roe v. Snap, Incorporated, — F.4th —, 2023 WL 8705665 (5th Cir. December 18, 2023)

See also: Snapchat not liable for enabling teacher to groom minor student

Court allows Amazon to censor “Wuhan plague” book reviews


In 2015, plaintiff began posting book reviews on Amazon, but in 2019 Amazon revoked his review privileges due to guideline violations, including reviews that criticized Donald Trump and two authors. After arbitration in 2020 favored Amazon, plaintiff and Amazon reached a settlement allowing plaintiff to post reviews if he adhered to Amazon’s policies. However, in 2022, after posting reviews derogatory of millennials and referring to COVID-19 as the “Wuhan plague,” Amazon once again revoked plaintiff’s ability to post book reviews and deleted his prior reviews from the platform.

Plaintiff sued Amazon alleging breach of contract and violation of Washington’s Consumer Protection Act (CPA), and seeking a declaratory judgment that Section 230 of the Communications Decency Act does not protect Amazon. Plaintiff asserted that Amazon wrongfully removed plaintiff’s reviews and did not adequately explain its actions. The CPA claim centered on Amazon’s insufficient explanations and inconsistent policy enforcement. Amazon moved to dismiss, arguing there was no legal basis for the breach of contract claim, that the CPA claim lacked merit, and that both Section 230 and the First Amendment protect Amazon from liability. The court granted Amazon’s motion.

Breach of Contract Claim Tossed

The court noted that to win a breach of contract claim in Washington, plaintiff had to prove a contractual duty was imposed and breached, causing plaintiff to suffer damages. Plaintiff claimed that Amazon breached its contract by banning him from posting book reviews and asserted that Amazon’s Conditions and Guidelines were ambiguous. But the court found that Amazon’s Conditions and Guidelines gave Amazon the exclusive right to remove content or revoke user privileges at its discretion, and that plaintiff’s claim sought to hold Amazon responsible for actions the contract permitted. Similarly, the court found plaintiff’s claims for both breach of contract and breach of the implied duty of good faith and fair dealing to be baseless, as they failed to identify any specific contractual duty Amazon allegedly violated.

No Violation of Washington Consumer Protection Act

To be successful under Washington’s Consumer Protection Act, plaintiff would have had to allege five elements, including an unfair or deceptive act and a public interest impact. The court found that plaintiff’s claim against Amazon, based on the company’s decision to remove reviews, failed to establish an “unfair or deceptive act” since Amazon’s Conditions and Guidelines transparently allowed such actions, and plaintiff presented no evidence showing Amazon’s practices would mislead reasonable consumers. Additionally, plaintiff did not adequately demonstrate a public interest impact, as he did not provide evidence of a widespread pattern of behavior by Amazon or the potential harm to other users. Consequently, plaintiff’s claim was insufficient in two essential areas, rendering the CPA claim invalid.

Section 230 Also Saved the Day for Amazon

Amazon claimed immunity under Section 230(c)(1) of the Communications Decency Act (CDA) against plaintiff’s allegations under the CPA and for breach of the implied duty of good faith and fair dealing. Section 230 of the CDA protects providers of interactive computer services from liability resulting from third-party content (e.g., online messaging boards). For Amazon to receive immunity under this section, it had to show three things: it is an interactive computer service, it is treated by plaintiff as a publisher, and the information in dispute (the book reviews) was provided by another content provider. Given that Amazon met these conditions, the court determined that plaintiff’s claims against Amazon under Washington’s CPA and for breach of the implied duty were barred by Section 230 of the CDA.

As for plaintiff’s declaratory judgment claim regarding Section 230, the court found that because the Declaratory Judgment Act only offers a remedy and not a cause of action, and given the absence of a “substantial controversy,” it could not grant declaratory relief. The court noted that this conclusion was reinforced by its determination that Section 230 barred two of plaintiff’s claims.

Haywood v. Amazon.com, Inc., 2023 WL 4585362 (W.D. Wash. July 18, 2023)

See also:

Amazon and other booksellers off the hook for sale of Obama drug use book

Snapchat not liable for enabling teacher to groom minor student

A high school science teacher used Snapchat to send sexually explicit content to one of her students, whom she eventually assaulted. Authorities uncovered this abuse after the student overdosed on drugs. The student (as John Doe) sued the teacher, the school district and Snapchat. The lower court threw out the case against Snapchat on the basis of the federal Communications Decency Act at 47 U.S.C. § 230. The student sought review with the United States Court of Appeals for the Fifth Circuit. On appeal, the court affirmed.

Relying on Doe v. MySpace, Inc., 528 F.3d 413 (5th Cir. 2008), the court affirmed the lower court’s finding that the student’s claims against Snapchat were based on the teacher’s messages. Accordingly, Snapchat was immune from liability because this provision of federal law – under the doctrine of the MySpace case – provides “immunity … to Web-based service providers for all claims stemming from their publication of information created by third parties.”

Doe v. Snap, Inc., 2023 WL 4174061 (5th Cir. June 26, 2023)
