Disabled veteran’s $77 billion lawsuit against Amazon dismissed


A disabled Army veteran sued Amazon alleging “cyberstalking” and “cyberbullying” on its gaming platform, New World. Plaintiff claimed Amazon allowed other players and employees to engage in harassment, culminating in his being banned from the platform after he had invested more than 10,000 hours and $1,700 in the game. Plaintiff sought $7 billion in compensatory damages and $70 billion in punitive damages, asserting claims for intentional infliction of emotional distress, gross negligence, and unfair business practices. Plaintiff also filed motions for a preliminary injunction to reinstate his gaming account and to remand the case to state court.

The court, however, dismissed the case. It granted plaintiff in forma pauperis status, allowing him to proceed without paying court fees, but ruled that his complaint failed to state any claim upon which relief could be granted. The court found no grounds for allowing plaintiff to amend the complaint, as any amendment would be futile.

The court based its dismissal on several legal principles. First, it found that Amazon was immune from liability under the Communications Decency Act, 47 U.S.C. § 230, for any content posted by third-party users on the New World platform. Section 230 protects providers of interactive computer services from being treated as publishers or speakers of user-generated content, even if they moderate or fail to moderate that content.

Second, plaintiff’s claims about Amazon employees’ conduct were legally insufficient. His allegations, such as complaints about bad customer service and being banned from the platform, failed to meet the standard for intentional infliction of emotional distress, which requires conduct so outrageous it exceeds all bounds tolerated in a civilized society. Similarly, plaintiff’s gross negligence claims did not demonstrate any extreme departure from reasonable conduct.

Finally, in the court’s view, plaintiff’s claim under California’s Unfair Competition Law (UCL) lacked the necessary specificity. The court found that poor customer service and banning a user from a platform did not constitute unlawful, unfair, or fraudulent business practices under the UCL.

Three Reasons Why This Case Matters

  • Clarifies Section 230 Protections: The case reinforces the broad immunity granted to online platforms for third-party content under Section 230, even when moderation decisions are involved.
  • Defines the Limits of Tort Law in Online Interactions: It highlights the high bar plaintiffs must meet to succeed on claims such as intentional infliction of emotional distress and gross negligence in digital contexts.
  • Sets Guidance for Gaming Platform Disputes: The decision underscores the limited liability of companies for banning users or providing subpar customer support, offering guidance for similar lawsuits.

Haymore v. Amazon.com, Inc., 2024 WL 4825253 (E.D. Cal., Nov. 19, 2024)

Murdered Uber passenger’s mom can keep her case in court and out of arbitration

An Uber driver murdered plaintiff’s son. So plaintiff – the Uber user’s mom – sued Uber for wrongful death. The lower court threw out the case, saying that the Uber terms and conditions required the matter to go to arbitration. Plaintiff sought review with the Georgia Court of Appeals. On review, the court reversed and sent the case back to the lower court.

The appellate court found that it was improper to dismiss the case because it was not clear that plaintiff’s son – the one killed by the Uber driver – actually agreed to the Uber terms and conditions that contained the provision requiring arbitration.

First, there was a dispute as to whether he even saw the link to the terms and conditions when he signed up for Uber in 2016. That’s because he was using an Android phone, and plaintiff alleged the on-screen keyboard within the app may have covered up the link to the terms and conditions.

Second, the court noted that even though Uber submitted evidence it emailed updated terms and conditions to plaintiff’s son, and that he continued using Uber thereafter (which would have bound him to those terms), it was unclear whether he ever actually received that email. If the customer never saw those terms, they would not apply, and therefore arbitration would not be proper.

Thornton v. Uber Technologies, Inc., 2021 WL 1960199 (Ct. App. Ga. May 17, 2021)

Amazon faces liability for assuming a duty to act, by sending email warning of hoverboard fires

Online marketplaces should take note – sometimes trying to do the right thing can create more legal exposure.

Plaintiffs tragically lost their home and suffered injuries in a fire caused by a hoverboard they bought through Amazon. They sued Amazon. Their negligence claim arose under Tennessee tort law and was based on the principle set out in Restatement (Second) of Torts § 324A, which states:

One who undertakes, gratuitously or for consideration, to render services to another which he should recognize as necessary for the protection of a third person or his things, is subject to liability to the third person for physical harm resulting from his failure to exercise reasonable care to protect his undertaking if (a) his failure to exercise reasonable care increases the risk of such harm, or (b) he has undertaken to perform a duty owed by the other to the third person, or (c) the harm is suffered because of reliance of the other or the third person upon the undertaking.

Plaintiffs claimed that defendant Amazon gratuitously undertook to warn the purchaser of the hoverboard (one of the plaintiffs) of the dangers the hoverboard posed when it sent her an email outlining some of those dangers. Plaintiffs claimed that Amazon was negligent in that undertaking, and that the negligence caused plaintiffs harm.

The lower court granted summary judgment in Amazon’s favor, but the Sixth Circuit reversed the summary judgment order. It held that when Amazon chose to send the email to the one plaintiff, and in so doing sought to warn her of the dangers posed by the hoverboard, it assumed a duty to warn. There remained genuine issues of material fact as to whether Amazon breached that duty and whether any breach caused plaintiffs’ harm.

For instance, there was a genuine issue of material fact regarding whether Amazon’s failure to include certain information in the email amounted to negligence. The email did not inform hoverboard purchasers of any of the actions Amazon had taken to evaluate the dangers posed by hoverboards, including the findings and results of its internal investigation. The email did not inform hoverboard purchasers that the reported safety issues included a risk of fire or explosion. And the email did not inform hoverboard purchasers that Amazon had ceased all hoverboard sales worldwide.

And there was a genuine issue of material fact regarding whether the plaintiff read the email, and thereby could have acted in reliance on it. Though plaintiff had no specific recollection of reading the email, she “had a habit” of reading emails sent to her email address. She also testified that she would not have let the hoverboard enter or remain in her home had she known, among other things, that there had been 17 complaints of fires or explosions in the United States that involved hoverboards purchased on Amazon, that Amazon anticipated additional complaints, particularly during the upcoming holiday season, or that Amazon had ceased all hoverboard sales worldwide.

Fox v. Amazon.com, Inc., 2019 WL 2417391 (6th Cir. June 10, 2019)
