Section 230 immunity protected provider of ringless voicemail services to telemarketers

Defendant telecommunications company provided ringless voicemail and VoIP services to telemarketers. These services enabled telemarketers to deliver prerecorded messages en masse directly to recipients’ voicemail inboxes without causing the recipients’ phones to ring or giving recipients the opportunity to answer or block the call.

The federal government sued a couple of telemarketers and defendant, alleging violation of the FTC Act, which prohibits unfair or deceptive acts or practices in commerce. Defendant moved to dismiss the action, arguing that Section 230 provided it immunity from liability. The court granted the motion.

Section 230 immunity

Section 230(c) (at 47 U.S.C. 230(c)) provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

Defendant asserted it met the criteria for Section 230 immunity because of (1) its role as an interactive computer service, (2) the way the government’s claims sought to treat it as a publisher or speaker of the allegedly unlawful calls, and (3) the potential liability was based on third party content (the calls being placed by the other telemarketing defendants).

Ringless voicemail services were an “interactive computer service”

The government argued defendant was not an “interactive computer service” because recipients accessed their voicemails through their telephones rather than a computer. The court rejected this argument, finding that defendant had shown that it transmitted content and enabled computer access by multiple users to a computer server, thereby meeting the statutory definition of an interactive computer service.

Lawsuit sought to treat defendant as a publisher or speaker

The government next argued that its claims against defendant did not seek to treat defendant as the publisher or speaker of content, because defendant’s liability did not depend on the content of the transmitted messages. The court rejected this argument as well, because it was indeed the content that gave rise to liability – had the voicemails at issue not been for commercial purposes, they would not have been unlawful, and the matter would not have been brought in the first place.

Allegations related to the content of unlawful voicemails

Finally, as for the third element of Section 230 immunity – the offending content being provided by a third party – the court also sided with defendant. “While [defendant] developed the ringless voicemail technology at issue, that development goes to how the third-party content is distributed rather than the content itself.”

United States v. Stratics Networks Inc., 2024 WL 966380 (S.D. Cal., March 6, 2024)

See also:

Fifth Circuit dissent issues scathing rebuke of broad Section 230 immunity

Dissenting in the court’s refusal to rehear an appeal en banc, Judge Elrod of the Fifth Circuit Court of Appeals – joined by six of her colleagues – penned an opinion that sharply criticized the broad immunity granted to social media companies under Section 230 of the Communications Decency Act. The dissent emerged in a case involving John Doe, a minor who was sexually abused by his high school teacher, a crime in which the messaging app Snapchat played a pivotal role.

The Core of the Controversy

Section 230 (47 U.S.C. 230) is a provision that courts have long held to shield internet companies from liability for content posted by their users. The dissenting opinion, however, argues that this immunity has been stretched far beyond its intended scope, potentially enabling platforms to evade responsibility even when their design and operations contribute to illegal activities.

Snapchat’s Role in the Abuse Case

Snapchat, owned by Snap, Inc., was used by the teacher to send sexually explicit material to Doe. Doe sought to hold Snap accountable, alleging that Snapchat’s design defects, such as inadequate age-verification mechanisms, indirectly facilitated the abuse. But the lower court, applying previous cases interpreting Section 230, dismissed these claims at the initial stage.

A Critical Examination of Section 230

The dissent criticized the court’s interpretation of Section 230, arguing that it has been applied too broadly to protect social media companies from various forms of liability, including design defects and distributor responsibilities. It highlighted the statute’s original text, which was meant to protect platforms from being deemed publishers or speakers of third-party content, not to shield them from liability for their own conduct.

Varied Interpretations Across Courts

Notably, the dissent pointed out the inconsistency in judicial interpretations of Section 230. While some courts, like the Ninth Circuit, have allowed claims related to design defects to proceed, others have extended sweeping protections to platforms, significantly limiting the scope for holding them accountable.

The Implications for Internet Liability

This case and the resulting dissent underscore a significant legal issue in the digital age: how to balance the need to protect online platforms from excessive liability with ensuring they do not become facilitators of illegal or harmful activities. The dissent suggested that the current interpretation of Section 230 has tipped this balance too far in favor of the platforms, leaving victims like Doe without recourse.

Looking Ahead: The Need for Reevaluation

The dissenting opinion called for a reevaluation of Section 230, urging a return to the statute’s original text and intent. This reexamination – in the dissent’s view – would be crucial in the face of evolving internet technologies and the increasing role of social media platforms in everyday life. The dissent warned of the dangers of a legal framework that overly shields these powerful platforms while leaving individuals exposed to the risks associated with their operations.

Conclusion

The dissent in this case is a clarion call for a critical reassessment of the legal protections afforded to social media platforms. As the internet continues to evolve, the legal system must adapt to ensure that the balance between immunity and accountability is appropriately maintained, safeguarding individuals’ rights without stifling technological innovation and freedom of expression online.

Doe through Roe v. Snap, Inc., — F.4th —, 2023 WL 8705665 (5th Cir., December 18, 2023)

See also: Snapchat not liable for enabling teacher to groom minor student

Omegle protected by Section 230 against claims for child pornography, sex trafficking and related claims

Omegle is a notorious website where you can be randomly placed in a chat room (using video, audio and text) with strangers on the internet. Back in March 2020, 11-year-old C.H. was using Omegle and got paired with a pedophile who intimidated her into disrobing on camera while he captured video. When C.H.’s parents found out, they sued Omegle alleging a number of theories:

  • possession of child pornography in violation of 18 U.S.C. § 2252A;
  • violation of the Federal Trafficking Victims Protection Act, 18 U.S.C. §§ 1591 and 1595;
  • violation of the Video Privacy Protection Act, 18 U.S.C. § 2710;
  • intrusion upon seclusion;
  • negligence;
  • intentional infliction of emotional distress;
  • ratification/vicarious liability; and
  • public nuisance

The court granted Omegle’s motion to dismiss all eight claims, holding that each of the claims was barred by the immunity provided under 47 U.S.C. § 230. Citing to Doe v. Reddit, Inc., 2021 WL 5860904 (C.D. Cal. Oct. 7, 2021) and Roca Labs, Inc. v. Consumer Op. Corp., 140 F. Supp. 3d 1311 (M.D. Fla. 2015), the court observed that a defendant seeking to enjoy the immunity provided by Section 230 must establish that: (1) defendant is a provider or user of an interactive computer service; (2) the causes of action treat defendant as a publisher or speaker of information; and (3) a different information content provider provided the information.

Omegle met Section 230’s definition of “interactive computer service”

The court found Omegle to be an interactive computer service provider because there were no factual allegations suggesting that Omegle authored, published or generated its own information to warrant classifying it as an information content provider. Nor were there any factual allegations that Omegle materially contributed to the unlawfulness of the content at issue by developing or augmenting it. Omegle users were not required to provide or verify user information before being placed in a chatroom with another user. And some users, such as hackers and “cappers”, could circumvent other users’ anonymity using the data they themselves collected from those other users.

Plaintiffs’ claims sought to treat Omegle as a publisher or speaker of information

The court found that each of the claims for possession of child pornography, sex trafficking, violation of the Video Privacy Protection Act, intrusion upon seclusion and intentional infliction of emotional distress sought redress for damages caused by the unknown pedophile’s conduct. Specifically, in the court’s view, no well-pleaded facts suggested that Omegle had actual knowledge of the sex trafficking venture involving C.H. or actively participated in that venture.

As for the claims of negligence, ratification/vicarious liability and public nuisance, the court similarly concluded that plaintiffs’ theories of liability were rooted in Omegle’s creation and maintenance of the site. The court observed that plaintiffs’ claims recognized the distinction between Omegle as an interactive computer service provider and its users, but nonetheless treated Omegle as the publisher responsible for the conduct at issue. This was corroborated, the court found, by the “ratification/vicarious liability” claim, in which plaintiffs maintained that child sex trafficking was so pervasive on and known to Omegle that Omegle should be held vicariously liable for the damages caused by such criminal activity. And, in the court’s view, through the negligence and public nuisance claims, plaintiffs alleged that Omegle knew or should have known about the dangers the platform posed to minor children, and that Omegle failed to ensure that minor children did not fall prey to the child predators who might use the website.

The information at issue was provided by a third party

On this third element, the court found that Omegle merely provided the forum where the harmful conduct took place. The conduct giving rise to the harm – the video and the intimidation – was undertaken by the unknown pedophile, not Omegle.

Special note: Section 230 and the sex trafficking claim

Section 230(e)(5) limits an interactive computer service provider’s immunity in certain circumstances involving claims of sex trafficking. In this case, however, as in Doe v. Kik Interactive, Inc., 482 F. Supp. 3d 1242 (S.D. Fla. 2020), the court held that Omegle’s Section 230 immunity remained intact, because plaintiffs’ allegations were premised upon general, constructive knowledge of past sex trafficking incidents. The complaint failed to sufficiently allege Omegle’s actual knowledge of or overt participation in the underlying incidents between C.H. and the unknown pedophile.

M.H. and J.H. v. Omegle.com, LLC, 2022 WL 93575 (M.D. Fla. January 10, 2022)

Section 230 immunity protected Twitter from claims it aided and abetted defamation

Twitter enjoyed Section 230 immunity from claims that it aided and abetted defamation, because plaintiffs’ allegations on that point did not transform Twitter into a party that created or developed content.

An anonymous Twitter user posted some tweets that plaintiffs thought were defamatory. So plaintiffs sued Twitter for defamation after Twitter refused to take the tweets down. Twitter moved to dismiss the lawsuit. It argued that the Communications Decency Act (CDA) at 47 U.S.C. §230 barred the claim. The court agreed that Section 230 provided immunity to Twitter, and granted the motion to dismiss.

The court applied the Second Circuit’s test for Section 230 immunity as set out in La Liberte v. Reid, 966 F.3d 79 (2d Cir. 2020). Under this test, which parses Section 230’s language, plaintiffs’ claims were barred because:

  • (1) Twitter was a provider of an interactive computer service,
  • (2) the claims were based on information provided by another information content provider, and
  • (3) the claims treated Twitter as the publisher or speaker of that information.

Twitter is a provider of an interactive computer service

The CDA defines an “interactive computer service” as “any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server.” 47 U.S.C. § 230(f)(2). The court found that Twitter is an online platform that allows multiple users to access and share the content hosted on its servers. As such, it is an interactive computer service for purposes of the CDA.

Plaintiffs’ claims were based on information provided by another information content provider

The court also found that the claims against Twitter were based on information provided by another information content provider. The CDA defines an “information content provider” as “any person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service.” 47 U.S.C. § 230(f)(3). In this case, the court found that plaintiffs’ claims were based on information created or developed by another information content provider – the unknown Twitter user who posted the alleged defamatory content. Plaintiffs did not allege that Twitter played any role in the “creation or development” of the challenged tweets.

The claim treated Twitter as the publisher or speaker of the alleged defamatory information

The court gave careful analysis to this third prong of the test. Plaintiffs alleged that Twitter had “allowed and helped” the unknown Twitter user to defame plaintiffs by hosting the user’s tweets on its platform, or by refusing to remove those tweets when plaintiffs reported them. The court found that either theory would amount to holding Twitter liable as the “publisher or speaker” of “information provided by another information content provider.” The court observed that making information public and distributing it to interested parties are quintessential acts of publishing. Plaintiffs’ theory of liability would “eviscerate” Section 230 protection because it would hold Twitter liable simply for organizing and displaying content exclusively provided by third parties.

Similarly, the court concluded that holding Twitter liable for failing to remove the tweets plaintiffs found objectionable would also hold Twitter liable based on its role as a publisher of those tweets, because deciding whether or not to remove content falls squarely within the exercise of a publisher’s traditional role and is therefore subject to the CDA’s broad immunity.

The court found that plaintiffs’ suggestion that Twitter aided and abetted defamation by arranging and displaying others’ content on its platform failed to overcome Twitter’s immunity under the CDA. In the court’s view, such a theory would be tantamount to holding Twitter responsible as the “developer” or “creator” of that content. But for Twitter to be liable as a developer or creator of third-party content – rather than as a publisher of it – Twitter would have had to directly and materially contribute to what made the content itself unlawful.

Plaintiffs in this case did not allege that Twitter contributed to the defamatory content of the tweets at issue, and thus pled no basis upon which Twitter could be held liable as the creator or developer of those tweets. Accordingly, plaintiffs’ defamation claims against Twitter also satisfied the final requirement for CDA immunity: the claims sought to hold Twitter, an interactive computer service, liable as the publisher of information provided by another information content provider. Ultimately, Section 230 immunized Twitter from the claim that it aided and abetted defamation.

Brikman v. Twitter, Inc., 2020 WL 5594637 (E.D.N.Y., September 17, 2020)

See also:

Website avoided liability over user content thanks to Section 230

Website avoided liability over user content thanks to Section 230

Section 230 protected a website from liability over content its users posted, in a recent case from the Eighth Circuit Court of Appeals.

Plaintiff sued defendant website operator for trade libel. The website had a forum board, and two forum board users posted that plaintiff was under federal investigation.

Likely recognizing Section 230 immunity would be an obstacle, plaintiff pled certain key facts about the posts’ authors. It claimed the two authors were longtime users. And it noted defendant occasionally paid its users to generate content. So plaintiff claimed the authors were “volunteers, employees, servants, contractors or agents of [defendant].” According to plaintiff’s logic, this would make the defendant website operator the “information content provider” of the offending posts.

The court did not buy that argument. Neither did the appellate court. The Communications Decency Act (at 47 U.S.C. §230) immunizes providers of interactive computer services against liability arising from content created by third parties. The Act provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

The court held that the facts plaintiff put forward did not plausibly show that defendant website operator was an information content provider. Since the facts showed only that independent parties created the offending posts, Section 230 immunity applied.

East Coast Test Prep LLC v. Allnurses.com, Inc., — F.3d —, 2020 WL 4809911 (8th Cir. August 19, 2020)

Section 230 did not protect online car sharing platform

Plaintiff Turo operates an online and mobile peer-to-peer car sharing marketplace. It allows car owners to rent their cars to other Turo users. It filed a declaratory judgment action against the City of Los Angeles, asking the court to determine the service was not being run in violation of applicable law.

The city filed counterclaims against Turo alleging (1) violation of local airport commerce regulations; (2) trespass; (3) aiding and abetting trespass; (4) unjust enrichment; and (5) unlawful and unfair business practices under California statute.

Should Section 230 apply?

Turo moved to dismiss. It argued that the City’s counterclaims sought to hold Turo liable for content published by users on Turo’s platform. In Turo’s mind, that should make it immune under Section 230. 

Section 230(c) provides, in relevant part, that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

Turo argued that the city’s claims were barred by Section 230 because they sought to hold Turo liable for its users’ actions. Those users published rental listings and selected LAX as the designated pickup point for car rentals. According to Turo, because the content of the rental listings was provided by third-party users, and because the city’s claims sought to hold Turo liable as an interactive computer service responsible for that content, Section 230 should apply.

No immunity, based on what the platform did

The court rejected Turo’s arguments that Section 230 immunized Turo from liability arising from the city’s counterclaims.

It held that Section 230 did not provide immunity because the city sought to hold Turo liable for its role facilitating online rental car transactions, not as the publisher or speaker of its users’ listings.

Citing to Force v. Facebook, Inc., 934 F.3d 53 (2d Cir. 2019), the court observed that “Section 230(c)(1) limits liability based on the function the defendant performs, not its identity.”

And the court compared the situation to the one in HomeAway.com, Inc. v. City of Santa Monica, 918 F.3d 676 (9th Cir. 2019). In that case, Section 230 did not immunize companies providing peer-to-peer home rental platform services from a government ordinance that required homeowners to register their properties with the city before listing them on a home sharing platform.

The court explained that Section 230 immunity did not apply because the government plaintiff did not seek to hold the platform companies liable for the content of the bookings posted by their users, but only for their actions of processing transactions for unregistered properties.

Turo v. City of Los Angeles, 2020 WL 3422262 (C.D. Cal., June 19, 2020)

Section 230 protected Google in lawsuit over blog post

Defendant used Google’s Blogger service to write a post – about plaintiffs’ business practices – that plaintiffs found objectionable. So plaintiffs sued Google in federal court for defamation, tortious interference with a business relationship, and intentional infliction of emotional distress. The lower court dismissed the case on grounds that the Communications Decency Act (at 47 U.S.C. §230) immunized Google from liability for the publication of third party content.

Plaintiffs sought review with the U.S. Court of Appeals for the D.C. Circuit. On appeal, the court affirmed the dismissal. Applying the three-part test the court developed in Klayman v. Zuckerberg, 753 F.3d 1354 (D.C. Cir. 2014) (which in turn applied analysis from the leading case of Zeran v. America Online, Inc., 129 F.3d 327 (4th Cir. 1997)), the court held that Section 230 entitled Google to immunity because: (1) Google was a “provider or user of an interactive computer service,” (2) the relevant blog post contained “information provided by another information content provider,” and (3) the complaint sought to hold Google liable as “the publisher or speaker” of the blog post.

The court rejected plaintiffs’ argument that, in establishing and enforcing its Blogger Content Policy, Google influenced and thereby created the content it published. It held that Google’s role was strictly one of “output control” – because Google’s choice was limited to a “yes” or “no” decision whether to remove the post, its action constituted “the very essence of publishing.” Since Section 230 immunizes online defendants against complaints seeking to hold them liable as the publisher of content, the lower court properly dismissed the action.

Bennett v. Google, LLC, 882 F.3d 1163 (D.C. Cir., February 23, 2018)

Anti-malware provider immune under CDA for calling competitor’s product a security threat

Plaintiff anti-malware software provider sued defendant – who also provides software that protects internet users from malware, adware, etc. – bringing claims for false advertising under Section 43(a) of the Lanham Act, as well as other business torts. Plaintiff claimed that defendant wrongfully revised its software’s criteria to identify plaintiff’s software as a security threat when, according to plaintiff, its software was “legitimate” and posed no threat to users’ computers.

Defendant moved to dismiss the complaint for failure to state a claim upon which relief may be granted. It argued that the provisions of the Communications Decency Act at Section 230(c)(2) immunized it from plaintiff’s claims.

Section 230(c)(2) reads as follows:

No provider or user of an interactive computer service shall be held liable on account of—

(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or

(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in [paragraph (A)].

Specifically, defendant argued that the provision of its software using the criteria it selected was an action taken to make available to others the technical means to restrict access to malware, which is objectionable material.

The court agreed with defendant’s argument that the facts of this case were “indistinguishable” from those in the Ninth Circuit’s opinion in Zango, Inc. v. Kaspersky, 568 F.3d 1169 (9th Cir. 2009), in which the court found that Section 230 immunity applied in the anti-malware context.

Here, plaintiff had argued that immunity should not apply because malware is not within the scope of “objectionable” material that a provider may seek to filter in accordance with Section 230(c)(2)(B). Under plaintiff’s theory, malware is “not remotely related to the content categories enumerated” in Section 230(c)(2)(A), to which (B) refers. In other words, the objectionableness of malware is of a different nature than the objectionableness of material that is obscene, lewd, lascivious, filthy, excessively violent, or harassing. The court rejected this argument on the basis that the determination of whether something is objectionable is left to the provider’s discretion. Since defendant found plaintiff’s software “objectionable” in accordance with its own judgment, the software qualified as “objectionable” under the statute.

Plaintiff also argued that immunity should not apply because defendant’s actions taken to warn of plaintiff’s software were not taken in good faith. But the court applied the plain meaning of the statute to reject this argument – the good faith requirement only applies to conduct under Section 230(c)(2)(A), not (c)(2)(B).

Finally, plaintiff had argued that immunity should not apply with respect to its Lanham Act claim because of Section 230(e)(2), which provides that “nothing in [Section 230] shall be construed to limit or expand any law pertaining to intellectual property.” The court rejected this argument because although the claim was brought under the Lanham Act, which includes provisions concerning trademark infringement (which clearly relates to intellectual property), the nature of the Lanham Act claim here was for unfair competition, which is not considered to be an intellectual property claim.

Enigma Software Group v. Malwarebytes Inc., 2017 WL 5153698 (N.D. Cal., November 7, 2017)

YouTube not liable for aiding ISIS in Paris attack

The Communications Decency Act provided immunity to Google in a suit brought against it by the family of an American college student killed in the November 2015 attack.

Plaintiffs filed suit against Google (as operator of YouTube) alleging violation of federal laws that prohibit providing material support to terrorists, arising from the November 2015 Paris attack that ISIS carried out. Plaintiffs argued that the YouTube platform, among other things, aided in recruitment and provided ISIS the means to distribute messages about its activities.

Google moved to dismiss the lawsuit, arguing that Section 230 of the Communications Decency Act (47 U.S.C. 230) provided immunity from suit. The court granted the motion to dismiss.

Section 230 Generally

Section 230(c) provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Accordingly, Section 230 precludes liability that treats a website as the publisher or speaker of information users provide on the website, protecting websites from liability for material posted on the website by someone else.

JASTA Did Not Repeal Section 230 Immunity

In response to Google’s arguments in favor of Section 230 immunity, plaintiffs first argued that a recent federal statute – the Justice Against Sponsors of Terrorism Act, or “JASTA” – effectively repealed the immunity conferred on interactive computer services by Section 230. Plaintiffs focused on language in the statute stating that its purpose “is to provide civil litigants with the broadest possible basis, consistent with the Constitution of the United States, to seek relief” against terrorists and those who assist them.

The court rejected plaintiffs’ arguments that JASTA repealed Section 230 immunity. Significantly, the statute did not expressly repeal Section 230’s protections, nor did it do so implicitly by evincing any “clear and manifest” congressional intent to repeal any part of the Communications Decency Act.

Section 230 Need Not Be Applied Outside the United States

Plaintiffs also argued that Section 230 immunity did not arise because the Communications Decency Act should not apply outside the territorial jurisdiction of the United States. According to plaintiffs, Google provided support and resources to ISIS outside the United States (in Europe and the Middle East), ISIS’s use of Google’s resources was outside the United States, and the Paris attacks and plaintiffs’ relative’s death took place outside the United States.

The court rejected this argument, holding that Section 230’s focus is on limiting liability. The application of the statute to achieve that objective must occur where the immunity is needed, namely, at the place of litigation. Since the potential for liability and the application of immunity were both occurring in the United States, there was no need to apply Section 230 “extraterritorially.”

Immunity Protected Google

Google argued that plaintiffs’ claims sought to treat it as the publisher or speaker of the offending ISIS content, thus satisfying one of the requirements for Section 230 immunity. Plaintiffs countered that their lawsuit did not depend on the characterization of Google as the publisher or speaker of ISIS’s content, because their claims focused on Google’s violations of the federal criminal statutes that bar the provision of material support to terrorists.

But the court found that the conduct Google was accused of — among other things, failing to ensure that ISIS members who had been kicked off the platform could not re-establish accounts — fit within the traditional editorial functions of a website operator. Accordingly, despite plaintiffs’ characterization of their claims, the court found such claims to be an attempt to treat Google as the publisher or speaker of the ISIS videos.

The court similarly rejected plaintiffs’ arguments that Section 230 immunity should not apply because, by appending advertisements to some of the ISIS videos, Google became an “information content provider” itself, and thus responsible for the videos. This argument failed primarily because the content of the advertisements (which themselves were provided by third parties) did not contribute to the unlawfulness of the content of the videos.

Gonzalez v. Google, Inc., — F.Supp.3d —, 2017 WL 4773366 (N.D. Cal., October 23, 2017)

Yelp not liable for allegedly defamatory customer reviews

In a recent case having an outcome that should surprise no one, the United States Court of Appeals for the Ninth Circuit has affirmed a lower court’s decision that held Yelp immune from liability under the Communications Decency Act (47 U.S.C. 230 – the “CDA”) over customer reviews that were allegedly defamatory.

Plaintiff sued Yelp for violations of RICO and the Washington Consumer Protection Act, as well as for libel under Washington law. Yelp moved to dismiss for failure to state a claim upon which relief may be granted. The lower court found that plaintiff had failed to allege any facts plausibly suggesting Yelp was responsible for the content, and therefore dismissed the case. Plaintiff sought review with the Ninth Circuit. On appeal, the court affirmed.

The appellate court observed that plaintiff’s complaint, which he filed pro se, “pushed the envelope” of creative pleading. The court observed that plaintiff cryptically – “to the point of opacity” – alleged that Yelp was the one that created and developed the offending content. The court declined to open the door to such “artful skirting” of the Communications Decency Act’s safe harbor provision.

The key question before the court was whether the alleged defamatory reviews were provided by Yelp or by another information content provider. CDA immunity does not extend to situations where the web site itself is responsible for the creation or development of the offending content. The immunity protects providers or users of interactive computer services when the claims being made against them seek to treat them as a publisher or speaker of the information provided by another information content provider.

In this case, the court found that a careful reading of plaintiff’s complaint revealed that he never specifically alleged that Yelp created the content of the allegedly defamatory posts. Rather, plaintiff pled that Yelp adopted them from another website and transformed them into its own stylized promotions. The court found that these “threadbare” allegations of Yelp’s fabrication of allegedly defamatory statements were implausible on their face and were insufficient to avoid immunity under the Communications Decency Act. The court was careful to note that CDA immunity does not extend to content created or developed by an interactive computer service. “But the immunity in the CDA is broad enough to require plaintiffs alleging such a theory to state the facts plausibly suggesting the defendant fabricated content under a third party’s identity.”

The plaintiff had alleged in part that Yelp’s rating system and its use by the author of the allegedly defamatory content resulted in the creation or development of information by Yelp. The court rejected this argument, finding that the rating system did “absolutely nothing to enhance the defamatory sting of the message beyond the words offered by the user.” The court further observed that the star rating system was best characterized as a neutral tool operating on voluntary inputs that did not amount to content development or creation.

Finally, the court addressed plaintiff’s cryptic allegations that Yelp should be held liable for republishing the alleged defamatory content as advertisements or promotions on Google. A footnote in the opinion states that plaintiff was not clear whether the alleged republication was anything more than the passive indexing of Yelp reviews by the Google crawler. The decision’s final outcome, however, does not appear to depend on whether Google indexed that content as Yelp passively stood by or whether Yelp affirmatively directed the content to Google. “Nothing in the text of the CDA indicates that immunity turns on how many times an interactive computer service publishes information provided by another information content provider.” In the same way that Yelp would not be liable for posting user generated content on its web site, it would not be liable for disseminating the same content in essentially the same format to a search engine. “Simply put, proliferation and dissemination of content does not equal creation or development of content.”

Kimzey v. Yelp! Inc., — F.3d —, 2016 WL 4729492 (9th Cir. September 12, 2016)
