Zillow gets win in case alleging fraudulent online auction notices

Plaintiff sued Zillow and some other parties in federal court, claiming they engaged in a conspiracy to defraud her by illegally foreclosing on her home. She apparently claimed that Zillow “illegally” published information regarding the property at issue on its website, including listing it “for auction.”

Zillow moved to dismiss for failure to state a claim. The court granted the motion. It held that Section 230 (47 U.S.C. 230) immunized Zillow from liability. This statute immunizes providers of interactive computer services against liability arising from content created by third parties.

The court found that Zillow was an “interactive computer service,” as demonstrated by its website’s statement that it is “reimagining the traditional rules of real estate to make it easier than ever to move from one home to the next.”

It also found that plaintiff’s claims sought to hold Zillow liable for posting “auction notices”. But since the court did not believe plaintiff could demonstrate that Zillow developed or created this content, it found that plaintiff’s claims fell squarely within the purview of Section 230.

Choudhuri v. Specialised Loan Servicing, 2024 WL 308258 (N.D. Cal., January 26, 2024)

Section 230 protected Meta from claims of discrimination for taking down Palestinian content

Pro se plaintiff sued Meta seeking to hold it liable for allegedly removing certain “Muslim and/or Palestinian content” while preserving “unspecified Jewish and/or Israeli content” and for allegedly banning certain Muslim users, while allowing unspecified Jewish users to continue using Meta’s services. He brought a civil rights claim for unlawful discrimination on the basis of religion in violation of Title II of the Civil Rights Act of 1964.

Meta moved to dismiss, arguing, among other things, that plaintiff lacked standing. The lower court granted the motion. Plaintiff sought review with the Third Circuit. On appeal, the court affirmed the dismissal.

No “informational injury”

The court observed that plaintiff had not alleged that he owned, created, controlled or had any personal involvement with the removed content other than having previously viewed it. Nor had he alleged any personal involvement with the banned users. Likewise, he had not argued that he was denied the same level of service that Meta offered to all its users. Instead, he had argued that he was entitled to relief as a Muslim being discriminated against by having Muslim-related news removed while Jewish content remained.

The court examined whether plaintiff could establish standing under the “informational injury” doctrine. To establish standing under this doctrine, plaintiff “need[ed] only allege that [he] was denied information to which [he] was legally entitled, and that the denial caused some adverse consequence related to the purpose of the statute.” It went on to note that an entitlement to the information allegedly withheld is the “sine qua non” of the informational injury doctrine.

It held that plaintiff had failed to establish standing under this doctrine because he did not show that he was legally entitled to the publication of the requested content or the removal of other content. Title II does not create a right to information. And the statute could not be understood as granting him a right to relief because he did not allege that he was personally denied the full and equal enjoyment of Meta’s services. Moreover, plaintiff was without relief under Title II because the statute is limited to physical places of accommodation, and Meta, for purposes of the statute, was not a “place of public accommodation.”

Section 230 Classics

And in any event, 47 U.S.C. § 230 precluded the court from entertaining these claims, which would have sought to hold Meta liable for its exercise of a publisher’s traditional editorial functions – such as deciding whether to publish, withdraw, postpone, or alter content. On this point, the court looked to the classic Section 230 holdings in Green v. America Online (AOL), 318 F.3d 465 (3d Cir. 2003) and Zeran v. America Online, Inc., 129 F.3d 327 (4th Cir. 1997).

Elansari v. Meta, Inc., 2024 WL 163080 (3d Cir. January 16, 2024) (Not selected for official publication)

Fifth Circuit dissent issues scathing rebuke of broad Section 230 immunity

Dissenting in the court’s refusal to rehear an appeal en banc, Judge Elrod of the Fifth Circuit Court of Appeals – joined by six of her colleagues – penned an opinion that sharply criticized the broad immunity granted to social media companies under Section 230 of the Communications Decency Act. The dissent emerged in a case involving John Doe, a minor who was sexually abused by his high school teacher, a crime in which the messaging app Snapchat played a pivotal role.

The Core of the Controversy

Section 230 (47 U.S.C. 230) is a provision that courts have long held to shield internet companies from liability for content posted by their users. The dissenting opinion, however, argues that this immunity has been stretched far beyond its intended scope, potentially enabling platforms to evade responsibility even when their design and operations contribute to illegal activities.

Snapchat’s Role in the Abuse Case

Snapchat, owned by Snap, Inc., was used by the teacher to send sexually explicit material to Doe. Doe sought to hold Snap accountable, alleging that Snapchat’s design defects, such as inadequate age-verification mechanisms, indirectly facilitated the abuse. But the lower court, applying previous cases interpreting Section 230, dismissed these claims at the initial stage.

A Critical Examination of Section 230

The dissent criticized the court’s interpretation of Section 230, arguing that it has been applied too broadly to protect social media companies from various forms of liability, including design defects and distributor responsibilities. It highlighted the statute’s original text, which was meant to protect platforms from being deemed publishers or speakers of third-party content, not to shield them from liability for their own conduct.

Varied Interpretations Across Courts

Notably, the dissent pointed out the inconsistency in judicial interpretations of Section 230. While some courts, like the Ninth Circuit, have allowed claims related to design defects to proceed, others have extended sweeping protections to platforms, significantly limiting the scope for holding them accountable.

The Implications for Internet Liability

This case and the resulting dissent underscore a significant legal issue in the digital age: how to balance the need to protect online platforms from excessive liability with ensuring they do not become facilitators of illegal or harmful activities. The dissent suggested that the current interpretation of Section 230 has tipped this balance too far in favor of the platforms, leaving victims like Doe without recourse.

Looking Ahead: The Need for Reevaluation

The dissenting opinion called for a reevaluation of Section 230, urging a return to the statute’s original text and intent. This reexamination – in the dissent’s view – would be crucial in the face of evolving internet technologies and the increasing role of social media platforms in everyday life. The dissent warned of the dangers of a legal framework that overly shields these powerful platforms while leaving individuals exposed to the risks associated with their operations.

Conclusion

Judge Elrod’s dissent in this case is a clarion call for a critical reassessment of legal protections afforded to social media platforms. As the internet continues to evolve, the legal system must adapt to ensure that the balance between immunity and accountability is appropriately maintained, safeguarding individuals’ rights without stifling technological innovation and freedom of expression online.

Doe through Roe v. Snap, Incorporated, — F.4th —, 2023 WL 8705665 (5th Cir., December 18, 2023)

See also: Snapchat not liable for enabling teacher to groom minor student

Court allows Amazon to censor “Wuhan plague” book reviews

In 2015, plaintiff began posting book reviews on Amazon, but in 2019 Amazon revoked his review privileges due to guideline violations, including reviews that criticized Donald Trump and two authors. After arbitration in 2020 favored Amazon, plaintiff and Amazon reached a settlement allowing plaintiff to post reviews if he adhered to Amazon’s policies. However, in 2022, after posting reviews derogatory of millennials and referring to COVID-19 as the “Wuhan plague,” Amazon once again revoked plaintiff’s ability to post book reviews and deleted his prior reviews from the platform.

Plaintiff sued Amazon alleging breach of contract and violation of Washington’s Consumer Protection Act (CPA), and seeking a declaratory judgment that Section 230 of the Communications Decency Act did not protect Amazon. Plaintiff asserted that Amazon wrongfully removed plaintiff’s reviews and did not adequately explain its actions. The CPA violation centered on Amazon’s insufficient explanations and inconsistent policy enforcement. Amazon sought to dismiss the complaint, arguing there was no legal basis for the breach of contract claim, that the other claim lacked merit, and that both Section 230 and the First Amendment protected Amazon from liability. The court granted Amazon’s motion.

Breach of Contract Claim Tossed

The court noted that to win a breach of contract claim in Washington, plaintiff had to prove a contractual duty was imposed and breached, causing plaintiff to suffer damages. Plaintiff claimed that Amazon breached its contract by banning him from posting book reviews and asserted that Amazon’s Conditions and Guidelines were ambiguous. But the court found that Amazon’s Conditions and Guidelines gave Amazon the exclusive right to remove content or revoke user privileges at its discretion, and that plaintiff’s claim sought to hold Amazon responsible for actions the contract permitted. Similarly, the court found plaintiff’s claims for both breach of contract and breach of the implied duty of good faith and fair dealing to be baseless, as they failed to identify any specific contractual duty Amazon allegedly violated.

No Violation of Washington Consumer Protection Act

To be successful under Washington’s Consumer Protection Act, plaintiff would have had to allege five elements, including an unfair or deceptive act and a public interest impact. The court found that plaintiff’s claim against Amazon, based on the company’s decision to remove reviews, failed to establish an “unfair or deceptive act” since Amazon’s Conditions and Guidelines transparently allowed such actions, and plaintiff presented no evidence showing Amazon’s practices would mislead reasonable consumers. Additionally, plaintiff did not adequately demonstrate a public interest impact, as he did not provide evidence of a widespread pattern of behavior by Amazon or the potential harm to other users. Consequently, plaintiff’s claim was insufficient in two essential areas, rendering the CPA claim invalid.

Section 230 Also Saved the Day for Amazon

Amazon claimed immunity under Section 230(c)(1) of the Communications Decency Act (CDA) against plaintiff’s allegations under the CPA and for breach of the implied duty of good faith and fair dealing. Section 230 of the CDA protects providers of interactive computer services from liability resulting from third-party content (e.g., online messaging boards). For Amazon to receive immunity under this section, it had to show three things: it is an interactive computer service, it is treated by plaintiff as a publisher, and the information in dispute (the book reviews) was provided by another content provider. Given that Amazon met these conditions, the court determined that plaintiff’s claims against Amazon under Washington’s CPA and for breach of the implied duty were barred by Section 230 of the CDA.

As for plaintiff’s declaratory judgment claim regarding Section 230, the court found that since the Declaratory Judgment Act only offers a remedy and not a cause of action, and given the absence of a “substantial controversy,” the court could not grant this declaratory relief. The court noted that this conclusion was further reinforced by its holding that Section 230 did bar two of plaintiff’s claims.

Haywood v. Amazon.com, Inc., 2023 WL 4585362 (W.D. Washington, July 18, 2023)

See also:

Amazon and other booksellers off the hook for sale of Obama drug use book

Snapchat not liable for enabling teacher to groom minor student

A high school science teacher used Snapchat to send sexually explicit content to one of her students, whom she eventually assaulted. Authorities uncovered this abuse after the student overdosed on drugs. The student (as John Doe) sued the teacher, the school district and Snapchat. The lower court threw out the case against Snapchat on the basis of the federal Communications Decency Act at 47 USC § 230. The student sought review with the United States Court of Appeals for the Fifth Circuit. On appeal, the court affirmed.

Relying on Doe v. MySpace, Inc., 528 F.3d 413 (5th Cir. 2008), the court affirmed the lower court’s finding that the student’s claims against Snapchat were based on the teacher’s messages. Accordingly, Snapchat was immune from liability because this provision of federal law – under the doctrine of the MySpace case – provides “immunity … to Web-based service providers for all claims stemming from their publication of information created by third parties.”

Doe v. Snap, Inc., 2023 WL 4174061 (5th Cir. June 26, 2023)

Amazon gets Section 230 win over alleged defamatory product review


Customer ordered a scarf from plaintiffs’ Amazon store. Customer left a review claiming the scarf was not a real Burberry. When neither customer nor Amazon would take down the review, plaintiffs (the Amazon store owners) sued Amazon for defamation. The lower court dismissed on Section 230 grounds. Plaintiffs sought review with the Eleventh Circuit, which affirmed the dismissal in a non-published opinion.

Section 230 (a provision in federal law found at 47 U.S.C. 230 which gives legal immunity to many online services) provides that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Because the lawsuit sought to treat Amazon (a provider of an interactive computer service) as the publisher of information (the product review) provided by another information content provider (customer), this immunity applied to protect Amazon from liability.

Specifically, the court held:

  • Amazon is an interactive computer service provider. Amazon’s website allows customers to view, purchase, and post reviews online, and therefore provides computer access by multiple users similar to an online message board or a website exchange system.
  • Amazon was not responsible for the development of the offending content. According to the complaint, defendant wrote the allegedly defamatory review, and therefore she functioned as the information content provider.
  • Roommates.com is not applicable, as the complaint here alleges that defendant wrote the review in its entirety.
  • Plaintiffs seek to hold Amazon liable for failing to take down defendant’s review, which is exactly the kind of claim that is immunized by Section 230 — one that treats Amazon as the publisher of that information.

McCall v. Amazon, No. 22-11725 (11th Cir., June 12, 2023)

Section 230 immunity did not protect Omegle in product liability lawsuit

When plaintiff was 11 years old, she was connected to a man in his late thirties using Omegle (a “free online chat room that randomly pairs strangers from around the world for one-on-one chats”). Before the man was arrested some three years later, he forced plaintiff to send him pornographic videos of herself, made threats against her, and engaged in other inappropriate and unlawful conduct with plaintiff.

Plaintiff sued Omegle, alleging product liability and negligence relating to how Omegle was designed, and for failure to warn users of the site’s dangers. Omegle moved to dismiss these claims, claiming that it could not be liable because it was protected by 47 U.S.C. §230.

The court found that Section 230 did not apply because plaintiff’s claims did not seek to treat Omegle as the publisher or speaker of content. The court observed that to meet the obligation plaintiff sought to impose on Omegle, Omegle would not have had to alter the content posted by its users. It would only have had to change its design and warnings.

And the court found that plaintiff’s claims did not rest on Omegle’s publication of third party content. In the same way that Snapchat did not avoid liability on the basis of Section 230 in Lemmon v. Snap, Inc., 995 F.3d 1085 (9th Cir. 2021), Omegle’s alleged liability was based on its “own acts,” namely, designing and operating the service in a way that connected sex offenders with minors, and failing to warn of such dangers.

A.M. v. Omegle.com, LLC, 2022 WL 2713721 (D. Oregon, July 13, 2022)

Can a person be liable for retweeting a defamatory tweet?

Under traditional principles of defamation law, one can be liable for repeating a defamatory statement to others. Does the same principle apply, however, on social media such as Twitter, where one can easily repeat the words of others via a retweet?

Hacking, tweet, retweet, lawsuit

A high school student hacked the server hosting the local middle school’s website and modified the web page of plaintiff, a teacher at the school, to make it appear she was seeking inappropriate relationships. Another student tweeted a picture of the modified web page, and several people retweeted that picture.

The teacher sued the retweeters for defamation and reckless infliction of emotional distress. The court dismissed the case, holding that 47 USC §230 immunized defendants from liability as “users” of an interactive computer service. Plaintiff sought review with the New Hampshire Supreme Court. On appeal, the court affirmed the dismissal.

Who is a “user” under Section 230?

Section 230 provides, in relevant part, that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider”. Importantly, the statute does not define the word “user”. The lower court held that defendant retweeters fit into the category of “user” under the statute and therefore could not be liable for their retweeting, because to impose such liability would require treating them as the publisher or speaker of information provided by another.

Looking primarily at the plain language of the statute, and guided by the 2006 California case of Barrett v. Rosenthal, the state supreme court found no basis in plaintiff’s arguments that defendants were not “users” under the statute. Plaintiff had argued that “user” should be interpreted to mean libraries, colleges, computer coffee shops and others who, “at the beginning of the internet” were primary access points for people. And she also argued that because Section 230 changed common law defamation, the statute must speak directly to immunizing individual users.

The court held that it was “evident” that Section 230 abrogated the common law of defamation as applied to individual users. “That individual users are immunized from claims of defamation for retweeting content they did not create is evident from the statutory language.”

Banaian v. Bascom, — A.3d —, 2022 WL 1482521 (N.H. May 11, 2022)

Omegle protected by Section 230 against claims for child pornography, sex trafficking and related claims

Omegle is a notorious website where you can be randomly placed in a chat room (using video, audio and text) with strangers on the internet. Back in March 2020, 11-year-old C.H. was using Omegle and got paired with a pedophile who intimidated her into disrobing on camera while he captured video. When C.H.’s parents found out, they sued Omegle alleging a number of theories:

  • possession of child pornography in violation of 18 U.S.C. § 2252A;
  • violation of the Federal Trafficking Victims Protection Act, 18 U.S.C. §§ 1591 and 1595;
  • violation of the Video Privacy Protection Act, 18 U.S.C. § 2710;
  • intrusion upon seclusion;
  • negligence;
  • intentional infliction of emotional distress;
  • ratification/vicarious liability; and
  • public nuisance.

The court granted Omegle’s motion to dismiss all eight claims, holding that each of the claims was barred by the immunity provided under 47 U.S.C. § 230. Citing to Doe v. Reddit, Inc., 2021 WL 5860904 (C.D. Cal. Oct. 7, 2021) and Roca Labs, Inc. v. Consumer Op. Corp., 140 F. Supp. 3d 1311 (M.D. Fla. 2015), the court observed that a defendant seeking to enjoy the immunity provided by Section 230 must establish that: (1) defendant is a service provider or user of an interactive computer service; (2) the causes of action treat defendant as a publisher or speaker of information; and (3) a different information content provider provided the information.

Omegle met Section 230’s definition of “interactive computer service”

The court found Omegle to be an interactive computer service provider because there were no factual allegations suggesting that Omegle authored, published or generated its own information to warrant classifying it as an information content provider. Nor were there any factual allegations that Omegle materially contributed to the unlawfulness of the content at issue by developing or augmenting it. Omegle users were not required to provide or verify user information before being placed in a chatroom with another user. And some users, such as hackers and “cappers”, could circumvent other users’ anonymity using the data they themselves collected from those other users.

Plaintiffs’ claims sought to treat Omegle as a publisher or speaker of information

The court found that each of the claims for possession of child pornography, sex trafficking, violation of the Video Privacy Protection Act, intrusion upon seclusion and intentional infliction of emotional distress sought redress for damages caused by the unknown pedophile’s conduct. Specifically, in the court’s view, no well-pleaded facts suggested that Omegle had actual knowledge of the sex trafficking venture involving C.H. or that Omegle had an active participation in the venture.

As for the claims of intentional infliction of emotional distress, ratification/vicarious liability and public nuisance, the court similarly concluded that plaintiffs’ theories of liability were rooted in Omegle’s creation and maintenance of the site. The court observed that plaintiffs’ claims recognized the distinction between Omegle as an interactive computer service provider and its users, but nonetheless treated Omegle as the publisher responsible for the conduct at issue.

The court found this was corroborated by the “ratification/vicarious liability” claim, in which plaintiffs maintained that child sex trafficking was so pervasive on and known to Omegle that it should have been vicariously liable for the damages caused by such criminal activity. And, in the court’s view, through the negligence and public nuisance claims, plaintiffs alleged that Omegle knew or should have known about the dangers that the platform posed to minor children, and that Omegle failed to ensure that minor children did not fall prey to child predators that may use the website.

The information at issue was provided by a third party

On this third element, the court found that Omegle merely provided the forum where harmful conduct took place. The conduct giving rise to the harm – the video and the intimidation – was undertaken by the unknown pedophile, not Omegle.

Special note: Section 230 and the sex trafficking claim

Section 230 (e)(5) limits an interactive computer service provider’s immunity in certain circumstances involving claims of sex trafficking. In this case, however, like the court did in the case of Doe v. Kik Interactive, Inc., 482 F. Supp. 3d 1242 (S.D. Fla. 2020), the court held that Omegle’s Section 230 immunity remained intact, because the plaintiffs’ allegations were premised upon general, constructive knowledge of past sex trafficking incidents. The complaint failed to sufficiently allege Omegle’s actual knowledge or overt participation in the underlying incidents between C.H. and the unknown pedophile.

M.H. and J.H. v. Omegle.com, LLC, 2022 WL 93575 (M.D. Fla. January 10, 2022)

Section 230 did not protect Snapchat from negligence liability in car crash lawsuit over Speed Filter

The tragic facts

Landen Brown was using Snapchat’s Speed Filter in 2017 when the car in which he was riding with two other young men crashed after reaching speeds above 120 miles per hour. The Speed Filter documented how fast the car was traveling. The crash killed Landen and the other two occupants.

Section 230

The parents of two of the passengers sued Snap, Inc. (the purveyor of Snapchat), claiming that the app was negligently designed. The parents alleged, among other things, that Snap should have known that users believed they would be rewarded within the app for using the filter to record a speed above 100 miles per hour. The negligence claim was based in part on the notion that Snap did not remove or restrict access to Snapchat while traveling at dangerous speeds.

Immune, then not

The lower court dismissed the case, holding that 47 U.S.C. 230 protected Snapchat from liability. Plaintiffs sought review with the Ninth Circuit. On appeal, the court reversed, finding that Section 230 immunity did not apply to the negligent design claim.

Section 230’s role in the case

Section 230 provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” In this case, the court held that the parents’ complaint did not seek to hold Snap liable for its conduct as a publisher or speaker. The negligent design lawsuit treated Snap as a products manufacturer, accusing it of negligently designing a product (Snapchat) with a defect (the interplay between Snapchat’s reward system and the Speed Filter). Thus, the duty that Snap allegedly violated sprang from its distinct capacity as a product designer. Simply stated, in the court’s view, Snap’s alleged duty in this case had nothing to do with its editing, monitoring, or removing of the content that its users generate through Snapchat.

Lemmon v. Snap, Inc., — F.3d —, 2021 WL 1743576 (9th Cir. May 4, 2021)
