Section 230 protected Meta from claims of discrimination for taking down Palestinian content


Pro se plaintiff sued Meta seeking to hold it liable for allegedly removing certain “Muslim and/or Palestinian content” while preserving “unspecified Jewish and/or Israeli content” and for allegedly banning certain Muslim users, while allowing unspecified Jewish users to continue using Meta’s services. He brought a civil rights claim for unlawful discrimination on the basis of religion in violation of Title II of the Civil Rights Act of 1964.

Meta moved to dismiss, arguing, among other things, that plaintiff lacked standing. The lower court granted the motion. Plaintiff sought review with the Third Circuit. On appeal, the court affirmed the dismissal.

No “informational injury”

The court observed that plaintiff had not alleged that he owned, created, controlled or had any personal involvement with the removed content other than having previously viewed it. Nor had he alleged any personal involvement with the banned users. Likewise, he had not argued that he was denied the same level of service that Meta offered to all its users. Instead, he had argued that he was entitled to relief as a Muslim being discriminated against by having Muslim-related news removed while Jewish content remained.

The court examined whether plaintiff could establish standing under the “informational injury” doctrine. To establish standing under that doctrine, plaintiff “need[ed] only allege that [he] was denied information to which [he] was legally entitled, and that the denial caused some adverse consequence related to the purpose of the statute.” It went on to note that an entitlement to the information allegedly withheld is the “sine qua non” of the informational injury doctrine.

It held that plaintiff had failed to establish standing under this doctrine because he did not show that he was legally entitled to the publication of the requested content or the removal of other content. Title II does not create a right to information. And the statute could not be understood as granting him a right to relief because he did not allege that he was personally denied the full and equal enjoyment of Meta’s services. Moreover, plaintiff was without relief under Title II because the statute is limited to physical structures of accommodations, and Meta, for purposes of the statute, was not a “place of public accommodation.”

Section 230 Classics

And in any event, 47 U.S.C. § 230 precluded the court from entertaining these claims, which would have sought to hold Meta liable for its exercise of a publisher’s traditional editorial functions – such as deciding whether to publish, withdraw, postpone, or alter content. On this point, the court looked to the classic Section 230 holdings in Green v. America Online (AOL), 318 F.3d 465 (3d Cir. 2003) and Zeran v. America Online, Inc., 129 F.3d 327 (4th Cir. 1997).

Elansari v. Meta, Inc., 2024 WL 163080 (3d Cir. January 16, 2024) (Not selected for official publication)


Fifth Circuit dissent issues scathing rebuke of broad Section 230 immunity


Dissenting in the court’s refusal to rehear an appeal en banc, Judge Elrod of the Fifth Circuit Court of Appeals – joined by six of her colleagues – penned an opinion that sharply criticized the broad immunity granted to social media companies under Section 230 of the Communications Decency Act. The dissent emerged in a case involving John Doe, a minor who was sexually abused by his high school teacher, a crime in which the messaging app Snapchat played a pivotal role.

The Core of the Controversy

Section 230 (47 U.S.C. § 230) is a provision that courts have long held to shield internet companies from liability for content posted by their users. The dissenting opinion, however, argues that this immunity has been stretched far beyond its intended scope, potentially enabling platforms to evade responsibility even when their design and operations contribute to illegal activities.

Snapchat’s Role in the Abuse Case

Snapchat, owned by Snap, Inc., was used by the teacher to send sexually explicit material to Doe. Doe sought to hold Snap accountable, alleging that Snapchat’s design defects, such as inadequate age-verification mechanisms, indirectly facilitated the abuse. But the lower court, applying previous cases interpreting Section 230, dismissed these claims at the initial stage.

A Critical Examination of Section 230

The dissent criticized the court’s interpretation of Section 230, arguing that it has been applied too broadly to protect social media companies from various forms of liability, including design defects and distributor responsibilities. It highlighted the statute’s original text, which was meant to protect platforms from being deemed publishers or speakers of third-party content, not to shield them from liability for their own conduct.

Varied Interpretations Across Courts

Notably, the dissent pointed out the inconsistency in judicial interpretations of Section 230. While some courts, like the Ninth Circuit, have allowed claims related to design defects to proceed, others have extended sweeping protections to platforms, significantly limiting the scope for holding them accountable.

The Implications for Internet Liability

This case and the resulting dissent underscore a significant legal issue in the digital age: how to balance the need to protect online platforms from excessive liability with ensuring they do not become facilitators of illegal or harmful activities. The dissent suggested that the current interpretation of Section 230 has tipped this balance too far in favor of the platforms, leaving victims like Doe without recourse.

Looking Ahead: The Need for Reevaluation

The dissenting opinion called for a reevaluation of Section 230, urging a return to the statute’s original text and intent. This reexamination – in the dissent’s view – would be crucial in the face of evolving internet technologies and the increasing role of social media platforms in everyday life. The dissent warned of the dangers of a legal framework that overly shields these powerful platforms while leaving individuals exposed to the risks associated with their operations.

Conclusion

The court’s dissent in this case is a clarion call for a critical reassessment of legal protections afforded to social media platforms. As the internet continues to evolve, the legal system must adapt to ensure that the balance between immunity and accountability is appropriately maintained, safeguarding individuals’ rights without stifling technological innovation and freedom of expression online.

Doe through Roe v. Snap, Incorporated, — F.4th —, 2023 WL 8705665 (5th Cir. December 18, 2023)

See also: Snapchat not liable for enabling teacher to groom minor student

Court allows Amazon to censor “Wuhan plague” book reviews


In 2015, plaintiff began posting book reviews on Amazon, but in 2019 Amazon revoked his review privileges due to guideline violations, including reviews that criticized Donald Trump and two authors. After arbitration in 2020 favored Amazon, plaintiff and Amazon reached a settlement allowing plaintiff to post reviews if he adhered to Amazon’s policies. However, in 2022, after posting reviews derogatory of millennials and referring to COVID-19 as the “Wuhan plague,” Amazon once again revoked plaintiff’s ability to post book reviews and deleted his prior reviews from the platform.

Plaintiff sued Amazon alleging breach of contract and violation of Washington’s Consumer Protection Act (CPA), and seeking a declaratory judgment that Section 230 of the Communications Decency Act should not protect Amazon. Plaintiff asserted that Amazon wrongfully removed plaintiff’s reviews and did not adequately explain its actions. The CPA violation centered on Amazon’s insufficient explanations and inconsistent policy enforcement. Amazon sought to dismiss the complaint, arguing there was no legal basis for the breach of contract claim, that the CPA claim lacked merit, and that both Section 230 and the First Amendment protected Amazon from liability. The court granted Amazon’s motion.

Breach of Contract Claim Tossed

The court noted that to prevail on a breach of contract claim in Washington, plaintiff had to prove that a contractual duty was imposed and breached, and that the breach caused plaintiff to suffer damages. Plaintiff claimed that Amazon breached its contract by banning him from posting book reviews and asserted that Amazon’s Conditions and Guidelines were ambiguous. But the court found that Amazon’s Conditions and Guidelines gave Amazon the exclusive right to remove content or revoke user privileges at its discretion, and that plaintiff’s claim sought to hold Amazon responsible for actions the contract permitted. Likewise, the court found plaintiff’s claims for breach of contract and breach of the implied duty of good faith and fair dealing baseless, as they failed to identify any specific contractual duty Amazon allegedly violated.

No Violation of Washington Consumer Protection Act

To be successful under Washington’s Consumer Protection Act, plaintiff would have had to allege five elements, including an unfair or deceptive act and a public interest impact. The court found that plaintiff’s claim against Amazon, based on the company’s decision to remove reviews, failed to establish an “unfair or deceptive act” since Amazon’s Conditions and Guidelines transparently allowed such actions, and plaintiff presented no evidence showing Amazon’s practices would mislead reasonable consumers. Additionally, plaintiff did not adequately demonstrate a public interest impact, as he did not provide evidence of a widespread pattern of behavior by Amazon or the potential harm to other users. Consequently, plaintiff’s claim was insufficient in two essential areas, rendering the CPA claim invalid.

Section 230 Also Saved the Day for Amazon

Amazon claimed immunity under Section 230(c)(1) of the Communications Decency Act (CDA) against plaintiff’s allegations under the CPA and for breach of the implied duty of good faith and fair dealing. Section 230 of the CDA protects providers of interactive computer services from liability resulting from third-party content (e.g., online messaging boards). For Amazon to receive immunity under this section, it had to show three things: it is an interactive computer service, it is treated by plaintiff as a publisher, and the information in dispute (the book reviews) was provided by another content provider. Given that Amazon met these conditions, the court determined that plaintiff’s claims against Amazon under Washington’s CPA and for breach of the implied duty were barred by Section 230 of the CDA.

As for plaintiff’s declaratory judgment claim regarding Section 230, the court found that since the Declaratory Judgment Act only offers a remedy and not a cause of action, and given the absence of a “substantial controversy,” the court could not grant this declaratory relief. The court noted that its decision was further reinforced by its conclusion that Section 230 did bar two of plaintiff’s claims.

Haywood v. Amazon.com, Inc., 2023 WL 4585362 (W.D. Washington, July 18, 2023)

See also:

Amazon and other booksellers off the hook for sale of Obama drug use book

Snapchat not liable for enabling teacher to groom minor student

A high school science teacher used Snapchat to send sexually explicit content to one of her students, whom she eventually assaulted. Authorities uncovered this abuse after the student overdosed on drugs. The student (as John Doe) sued the teacher, the school district and Snapchat. The lower court threw out the case against Snapchat on the basis of the federal Communications Decency Act at 47 U.S.C. § 230. The student sought review with the United States Court of Appeals for the Fifth Circuit. On appeal, the court affirmed.

Relying on Doe v. MySpace, Inc., 528 F.3d 413 (5th Cir. 2008), the court affirmed the lower court’s finding that the student’s claims against Snapchat were based on the teacher’s messages. Accordingly, Snapchat was immune from liability because this provision of federal law – under the doctrine of the MySpace case – provides “immunity … to Web-based service providers for all claims stemming from their publication of information created by third parties.”

Doe v. Snap, Inc., 2023 WL 4174061 (5th Cir. June 26, 2023)

Amazon gets Section 230 win over alleged defamatory product review


Customer ordered a scarf from plaintiffs’ Amazon store. Customer left a review claiming the scarf was not a real Burberry. When neither customer nor Amazon would take down the review, plaintiffs (the Amazon store owners) sued Amazon for defamation. The lower court dismissed on Section 230 grounds. Plaintiffs sought review with the Eleventh Circuit, which affirmed the dismissal in a non-published opinion.

Section 230 (a provision in federal law found at 47 U.S.C. 230 which gives legal immunity to many online services) provides that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Because the lawsuit sought to treat Amazon (a provider of an interactive computer service) as the publisher of information (the product review) provided by another information content provider (customer), this immunity applied to protect Amazon from liability.

Specifically, the court held:

  • Amazon is an interactive computer service provider. Amazon’s website allows customers to view, purchase, and post reviews online, and therefore provides computer access by multiple users similar to an online message board or a website exchange system.
  • Amazon was not responsible for the development of the offending content. According to the complaint, defendant wrote the allegedly defamatory review, and therefore she functioned as the information content provider.
  • Roommates.com is not applicable, as the complaint here alleges that defendant wrote the review in its entirety.
  • Plaintiffs seek to hold Amazon liable for failing to take down defendant’s review, which is exactly the kind of claim that is immunized by Section 230 – one that treats Amazon as the publisher of that information.

McCall v. Amazon, No. 22-11725 (11th Cir., June 12, 2023)


Intellectual property exception to CDA 230 immunity did not apply in case against Google

Plaintiff sued Google for false advertising and violations of New Jersey’s Consumer Fraud Act over Google’s provision of Adwords services for other defendants’ website, which plaintiff claimed sold counterfeit versions of plaintiff’s products. Google moved to dismiss these two claims and the court granted the motion. 

On the false advertising issue, the court held that plaintiff had failed to allege the critical element that Google was the party that made the alleged misrepresentations concerning the counterfeit products. 

As for the Consumer Fraud Act claim, the court held that Google enjoyed immunity from such a claim in accordance with the Communications Decency Act at 47 U.S.C. 230(c). 

Specifically, the court found: (1) Google’s services made Google the provider of an interactive computer service, (2) the claim sought to hold Google liable for the publishing of the offending ads, and (3) the offending ads were published by a party other than Google, namely, the purveyor of the allegedly counterfeit goods. CDA immunity applied because all three of these elements were met. 

The court rejected plaintiff’s argument that the New Jersey Consumer Fraud Act was an intellectual property statute and that therefore under Section 230(e)(2), CDA immunity did not apply. With immunity present, the court dismissed the consumer fraud claim. 

InvenTel Products, LLC v. Li, No. 19-9190, 2019 WL 5078807 (D.N.J. October 10, 2019)

About the Author: Evan Brown is a Chicago technology and intellectual property attorney. Call Evan at (630) 362-7237, send email to ebrown@internetcases.com, or follow him on Twitter @internetcases. Read Evan’s other blog, UDRP Tracker, for information about domain name disputes.

Section 230 protected Twitter from liability for deleting Senate candidate’s accounts

Plaintiff (an Arizona Senate candidate) sued Twitter after it suspended four of plaintiff’s accounts. He brought claims for (1) violation of the First Amendment; (2) violation of federal election law; (3) breach of contract; (4) conversion; (5) antitrust; (6) negligent infliction of emotional distress; (7) tortious interference; and (8) promissory estoppel.

Twitter moved to dismiss on multiple grounds, including that Section 230(c)(1) of the Communications Decency Act (“CDA”), 47 U.S.C. § 230, rendered it immune from liability for each of plaintiff’s claims that sought to treat it as a publisher of third-party content.

The CDA protects from liability (1) any provider of an interactive computer service (2) whom a plaintiff seeks to treat as a publisher or speaker (3) of information provided by another information content provider.

The court granted the motion to dismiss, on Section 230 grounds, all of the claims except the antitrust claim (which it dismissed for other reasons). It held that Twitter is a provider of an interactive computer service. And plaintiff sought to treat Twitter as a publisher or speaker by trying to pin liability on it for deleting accounts, which is a quintessential activity of a publisher. The deleted accounts consisted of information provided by another information content provider (i.e., not Twitter, but plaintiff himself).

Brittain v. Twitter, 2019 WL 2423375 (N.D. Cal. June 10, 2019)

Section 230 protected Google in lawsuit over blog post

Defendant used Google’s Blogger service to write a post – about plaintiffs’ business practices – that plaintiffs found objectionable. So plaintiffs sued Google in federal court for defamation, tortious interference with a business relationship, and intentional infliction of emotional distress. The lower court dismissed the case on grounds that the Communications Decency Act (at 47 U.S.C. §230) immunized Google from liability for the publication of third party content.

Plaintiffs sought review with the U.S. Court of Appeals for the District of Columbia. On appeal, the court affirmed the dismissal. Applying a three-part test the court developed in Klayman v. Zuckerberg, 753 F.3d 1354 (D.C. Cir. 2014) (which in turn applied analysis from the leading case of Zeran v. America Online, Inc., 129 F.3d 327 (4th Cir. 1997)), the court held that Section 230 entitled Google to immunity because: (1) Google was a “provider or user of an interactive computer service,” (2) the relevant blog post contained “information provided by another information content provider,” and (3) the complaint sought to hold Google liable as “the publisher or speaker” of the blog post.

The court rejected plaintiffs’ argument that in establishing and enforcing its Blogger Content Policy, Google influenced and thereby created the content it published. It held that Google’s role was strictly one of “output control” – because Google’s choice was limited to a “yes” or a “no” decision whether to remove the post, its action constituted “the very essence of publishing.” Since Section 230 immunizes online defendants against complaints seeking to hold them liable as the publisher of content, the lower court properly dismissed the action.

Bennett v. Google, LLC, 882 F.3d 1163 (D.C. Cir., February 23, 2018)


Anti-malware provider immune under CDA for calling competitor’s product a security threat


Plaintiff anti-malware software provider sued defendant – who also provides software that protects internet users from malware, adware, etc. – bringing claims for false advertising under Section 43(a) of the Lanham Act, as well as other business torts. Plaintiff claimed that defendant wrongfully revised its software’s criteria to identify plaintiff’s software as a security threat when, according to plaintiff, its software was “legitimate” and posed no threat to users’ computers.

Defendant moved to dismiss the complaint for failure to state a claim upon which relief may be granted. It argued that the provisions of the Communications Decency Act at Section 230(c)(2) immunized it from plaintiff’s claims.

Section 230(c)(2) reads as follows:

No provider or user of an interactive computer service shall be held liable on account of—

(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or

(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in [paragraph (A)].

Specifically, defendant argued that the provision of its software using the criteria it selected was an action taken to make available to others the technical means to restrict access to malware, which is objectionable material.

The court agreed with defendant’s argument that the facts of this case were “indistinguishable” from the Ninth Circuit’s opinion in Zango, Inc. v. Kaspersky, 568 F.3d 1169 (9th Cir. 2009), in which the court found that Section 230 immunity applied in the anti-malware context.

Here, plaintiff had argued that immunity should not apply because malware is not within the scope of “objectionable” material that it is okay to seek to filter in accordance with 230(c)(2)(B). Under plaintiff’s theory, malware is “not remotely related to the content categories enumerated” in Section 230(c)(2)(A), to which (B) refers. In other words, the objectionableness of malware is of a different nature than the objectionableness of material that is obscene, lewd, lascivious, filthy, excessively violent, or harassing. The court rejected this argument on the basis that the determination of whether something is objectionable is up to the provider’s discretion. Since defendant found plaintiff’s software “objectionable” in accordance with its own judgment, the software qualified as “objectionable” under the statute.

Plaintiff also argued that immunity should not apply because defendant’s actions taken to warn of plaintiff’s software were not taken in good faith. But the court applied the plain meaning of the statute to reject this argument – the good faith requirement only applies to conduct under Section 230(c)(2)(A), not (c)(2)(B).

Finally, plaintiff had argued that immunity should not apply with respect to its Lanham Act claim because of Section 230(e)(2), which provides that “nothing in [Section 230] shall be construed to limit or expand any law pertaining to intellectual property.” The court rejected this argument because although the claim was brought under the Lanham Act, which includes provisions concerning trademark infringement (which clearly relates to intellectual property), the nature of the Lanham Act claim here was for unfair competition, which is not considered to be an intellectual property claim.

Enigma Software Group v. Malwarebytes Inc., 2017 WL 5153698 (N.D. Cal., November 7, 2017)


Google and YouTube protected by Section 230

The case of Weerahandi v. Shelesh is a classic example of how Section 230 (a provision of the Communications Decency Act (CDA), found at 47 U.S.C. § 230) shields online intermediaries from alleged tort liability occasioned by their users.

Background Facts

Plaintiff was a YouTuber and filed a pro se lawsuit for, among other things, defamation, against a number of other YouTubers as well as Google and YouTube. The allegations arose from a situation back in 2013 in which one of the individual defendants sent what plaintiff believed to be a “false and malicious” DMCA takedown notice to YouTube. One of the defendants later took the contact information plaintiff had to provide in the counter-notification and allegedly disseminated that information to others who were alleged to have published additional defamatory YouTube videos.

Google and YouTube also got named as defendants for “failure to remove the videos” and for not taking “corrective action”. These parties moved to dismiss the complaint, claiming immunity under Section 230. The court granted the motion to dismiss.

Section 230’s Protections

Section 230 provides, in pertinent part that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” 47 U.S.C. § 230(c)(1). Section 230 also provides that “[n]o cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.” 47 U.S.C. § 230(e)(3).

The CDA also “proscribes liability in situations where an interactive service provider makes decisions ‘relating to the monitoring, screening, and deletion of content from its network.’ ” Obado v. Magedson, 612 Fed.Appx. 90, 94–95 (3d Cir. 2015). Courts have recognized Congress conferred broad immunity upon internet companies by enacting the CDA, because the breadth of the internet precludes such companies from policing content as traditional media have. See Jones v. Dirty World Entm’t Recordings LLC, 755 F.3d 398, 407 (6th Cir. 2014); Batzel v. Smith, 333 F.3d 1018, 1026 (9th Cir. 2003); Zeran v. Am. Online, Inc., 129 F.3d 327, 330 (4th Cir. 1997); DiMeo v. Max, 433 F. Supp. 2d 523, 528 (E.D. Pa. 2006).

How Section 230 Applied Here

In this case, the court found that the CDA barred plaintiff’s claims against Google and YouTube. Both Google and YouTube were considered “interactive computer service[s].” Parker v. Google, Inc., 422 F. Supp. 2d 492, 551 (E.D. Pa. 2006). Plaintiff did not allege that Google or YouTube played any role in producing the allegedly defamatory content. Instead, Plaintiff alleged both websites failed to remove the defamatory content, despite his repeated requests.

Plaintiff did not cite any authority in his opposition to Google and YouTube’s motion, and instead argued that the CDA did not bar claims for the “failure to remove the videos” or to “take corrective action.” The court held that to the contrary, the CDA expressly protected internet companies from such liability. Under the CDA, plaintiff could not assert a claim against Google or YouTube for decisions “relating to the monitoring, screening, and deletion of content from its network.” Obado, 612 Fed.Appx. at 94–95; 47 U.S.C. § 230(c)(1) (“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”). For these reasons, the court found the CDA barred plaintiff’s claims against Google and YouTube.

Weerahandi v. Shelesh, 2017 WL 4330365 (D.N.J. September 29, 2017)

