Google and YouTube protected by Section 230

The case of Weerahandi v. Shelesh is a classic example of how Section 230 (a provision of the Communications Decency Act (CDA), found at 47 U.S.C. § 230) shields online intermediaries from tort liability arising from content posted by their users.

Background Facts

Plaintiff was a YouTuber who filed a pro se lawsuit for, among other things, defamation, against a number of other YouTubers as well as Google and YouTube. The allegations arose from events in 2013, when one of the individual defendants sent what plaintiff believed to be a “false and malicious” DMCA takedown notice to YouTube. Another defendant later took the contact information plaintiff was required to provide in his counter-notification and allegedly disseminated it to others, who then published additional allegedly defamatory YouTube videos.

Google and YouTube were also named as defendants for “failure to remove the videos” and for not taking “corrective action”. These parties moved to dismiss the complaint, asserting immunity under Section 230. The court granted the motion to dismiss.

Section 230’s Protections

Section 230 provides, in pertinent part, that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” 47 U.S.C. § 230(c)(1). Section 230 also provides that “[n]o cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.” 47 U.S.C. § 230(e)(3).

The CDA also “proscribes liability in situations where an interactive service provider makes decisions ‘relating to the monitoring, screening, and deletion of content from its network.’” Obado v. Magedson, 612 Fed.Appx. 90, 94–95 (3d Cir. 2015). Courts have recognized that Congress conferred broad immunity upon internet companies by enacting the CDA, because the breadth of the internet precludes such companies from policing content the way traditional media can. See Jones v. Dirty World Entm’t Recordings LLC, 755 F.3d 398, 407 (6th Cir. 2014); Batzel v. Smith, 333 F.3d 1018, 1026 (9th Cir. 2003); Zeran v. Am. Online, Inc., 129 F.3d 327, 330 (4th Cir. 1997); DiMeo v. Max, 433 F. Supp. 2d 523, 528 (E.D. Pa. 2006).

How Section 230 Applied Here

In this case, the court found that the CDA barred plaintiff’s claims against Google and YouTube. Both Google and YouTube were considered “interactive computer service[s].” Parker v. Google, Inc., 422 F. Supp. 2d 492, 551 (E.D. Pa. 2006). Plaintiff did not allege that Google or YouTube played any role in producing the allegedly defamatory content. Instead, Plaintiff alleged both websites failed to remove the defamatory content, despite his repeated requests.

Plaintiff did not cite any authority in his opposition to Google and YouTube’s motion, and instead argued that the CDA did not bar claims for the “failure to remove the videos” or to “take corrective action.” The court held that, to the contrary, the CDA expressly protects internet companies from such liability. Under the CDA, plaintiff could not assert a claim against Google or YouTube for decisions “relating to the monitoring, screening, and deletion of content from its network.” Obado, 612 Fed.Appx. at 94–95; 47 U.S.C. § 230(c)(1) (“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”). For these reasons, the court found the CDA barred plaintiff’s claims against Google and YouTube.

Weerahandi v. Shelesh, 2017 WL 4330365 (D.N.J. September 29, 2017)

About the Author: Evan Brown is a Chicago technology and intellectual property attorney. Call Evan at (630) 362-7237, send email to ebrown [at] internetcases.com, or follow him on Twitter @internetcases. Read Evan’s other blog, UDRP Tracker, for information about domain name disputes.

Quora gets Section 230 victory in the Tenth Circuit

Pro se plaintiff Silver filed suit in federal court in New Mexico against the online question-and-answer website Quora, alleging that statements made by two different individuals concerning his professional services were defamatory. Quora moved to dismiss, arguing that the immunity provisions of the Communications Decency Act, at 47 U.S.C. § 230, shielded it from liability arising from content posted by its users. The district court granted the motion to dismiss. Plaintiff sought review with the Tenth Circuit Court of Appeals. On review, the court affirmed the lower court’s dismissal of the case.

Citing its previous Section 230 precedent, Ben Ezra, Weinstein, & Co. v. Am. Online Inc., 206 F.3d 980 (10th Cir. 2000), the court held that Quora was a provider of an “interactive computer service,” that the actions forming the basis of the alleged liability (hosting the content) were those of a “publisher or speaker,” and that the content giving rise to the alleged liability came from “another information content provider,” i.e., the users who posted it.

Silver v. Quora, Inc., 2016 WL 6892146 (10th Cir. November 23, 2016)


Yelp not liable for allegedly defamatory customer reviews

In a recent case with an outcome that should surprise no one, the United States Court of Appeals for the Ninth Circuit affirmed a lower court’s decision holding Yelp immune under the Communications Decency Act (47 U.S.C. § 230 – the “CDA”) from liability over allegedly defamatory customer reviews.

Plaintiff sued Yelp for violations of RICO and the Washington Consumer Protection Act, as well as for libel under Washington law. Yelp moved to dismiss for failure to state a claim upon which relief may be granted. The lower court found that plaintiff had failed to allege any facts plausibly suggesting Yelp was responsible for the content, and therefore dismissed the case. Plaintiff sought review with the Ninth Circuit. On appeal, the court affirmed.

The appellate court observed that plaintiff’s complaint, which he filed pro se, “pushed the envelope” of creative pleading. It noted that plaintiff cryptically – “to the point of opacity” – alleged that Yelp was the one that created and developed the offending content. The court declined to open the door to such “artful skirting” of the Communications Decency Act’s safe harbor provision.

The key question before the court was whether the allegedly defamatory reviews were provided by Yelp or by another information content provider. CDA immunity does not extend to situations where the website itself is responsible for the creation or development of the offending content. The immunity protects providers or users of interactive computer services when the claims against them seek to treat them as the publisher or speaker of information provided by another information content provider.

In this case, the court found that a careful reading of plaintiff’s complaint revealed that he never specifically alleged that Yelp created the content of the allegedly defamatory posts. Rather, plaintiff pled that Yelp adopted the posts from another website and transformed them into its own stylized promotions. The court found these “threadbare” allegations that Yelp fabricated the allegedly defamatory statements implausible on their face and insufficient to avoid immunity under the Communications Decency Act. The court was careful to note that CDA immunity does not extend to content created or developed by an interactive computer service. “But the immunity in the CDA is broad enough to require plaintiffs alleging such a theory to state the facts plausibly suggesting the defendant fabricated content under a third party’s identity.”

The plaintiff had alleged in part that Yelp’s rating system and its use by the author of the allegedly defamatory content resulted in the creation or development of information by Yelp. The court rejected this argument, finding that the rating system did “absolutely nothing to enhance the defamatory sting of the message beyond the words offered by the user.” The court further observed that the star rating system was best characterized as a neutral tool operating on voluntary inputs that did not amount to content development or creation.

Finally, the court addressed plaintiff’s cryptic allegations that Yelp should be held liable for republishing the allegedly defamatory content as advertisements or promotions on Google. A footnote in the opinion states that plaintiff was not clear whether the alleged republication was anything more than the passive indexing of Yelp reviews by the Google crawler. The decision’s final outcome, however, does not appear to depend on whether Google indexed that content while Yelp passively stood by or whether Yelp affirmatively directed the content to Google. “Nothing in the text of the CDA indicates that immunity turns on how many times an interactive computer service publishes information provided by another information content provider.” In the same way that Yelp would not be liable for posting user-generated content on its website, it would not be liable for disseminating the same content in essentially the same format to a search engine. “Simply put, proliferation and dissemination of content does not equal creation or development of content.”

Kimzey v. Yelp! Inc., — F.3d —, 2016 WL 4729492 (9th Cir. September 12, 2016)


Twitter avoids liability in terrorism lawsuit

Update 1/31/2018: The Ninth Circuit upheld the court’s decision discussed below.

The families of two U.S. contractors killed in Jordan sued Twitter, accusing the platform of providing material support to the terrorist organization ISIS. Plaintiffs alleged that by allowing ISIS to create and maintain Twitter accounts, the company violated the Anti-Terrorism Act (ATA). Plaintiffs further claimed this support enabled ISIS to recruit, fundraise, and promote extremist propaganda, ultimately leading to the deaths of the contractors. The lawsuit aimed to hold Twitter responsible for the actions of ISIS and to penalize it for facilitating the organization’s digital presence.

Twitter moved to dismiss, arguing that the claims were barred under the Communications Decency Act (CDA) at 47 U.S.C. § 230. Section 230 provides immunity to internet platforms from being treated as the publisher or speaker of content posted by third parties. The court had to decide whether Twitter’s role in allowing ISIS to use its platform made it liable for the consequences of ISIS’s acts.

The court dismissed the case, finding that Section 230 shielded Twitter from liability. The court ruled that plaintiffs’ claims attempted to treat Twitter as the publisher of content created by ISIS, which is precisely the type of liability Section 230 was designed to prevent. The court also concluded that plaintiffs failed to establish a plausible connection, or proximate causation, between Twitter’s actions and the deaths. Importantly, in the court’s view, plaintiffs could not demonstrate that ISIS’s use of Twitter directly caused the attack in Jordan or that the shooter had interacted with ISIS content on the platform.

The court further addressed plaintiffs’ argument regarding private messages sent through Twitter’s direct messaging feature. It ruled that these private communications were also protected under Section 230, as the law applies to all publishing activities, whether public or private.

Three reasons why this case matters:

  • Expanding the scope of Section 230: The case reinforced the broad immunity provided to tech companies under Section 230, including their handling of controversial or harmful content.
  • Clarifying proximate causation in ATA claims: The ruling highlighted the challenges of proving a direct causal link between a platform’s operations and acts of terrorism.
  • Balancing tech innovation and accountability: The decision underscored the ongoing debate about how to balance the benefits of open platforms with the need for accountability in preventing misuse.

Fields v. Twitter, Inc., 200 F. Supp. 3d 964 (N.D. Cal., August 10, 2016).

Immunity denied for website that failed to warn of dangerous activity

Jane Doe sued Internet Brands, the owner of the website Model Mayhem, for negligence after she was lured into a trap by two criminals who used the site to target victims. Plaintiff asked the court to hold Internet Brands liable for failing to warn users about the known threat posed by the criminals. The district court dismissed the case, finding the claim barred by the provision of the Communications Decency Act (CDA) found at 47 U.S.C. 230. However, the Ninth Circuit reversed that decision, holding that the CDA did not shield Internet Brands from liability for failing to warn.

Plaintiff, an aspiring model, joined Model Mayhem, a networking site for modeling professionals. In 2011, Plaintiff was contacted by two individuals who posed as talent scouts and convinced her to travel to Florida for an audition. Once there, Plaintiff was drugged, raped, and recorded for pornography. The lawsuit revealed that Internet Brands had known since 2010 that these criminals were using the site to target victims, but did not warn users.

Plaintiff argued that Internet Brands had a duty to warn users like her about the danger. Defendant argued that the CDA, which protects websites from liability as “publishers” of third-party content, barred the claim. Defendant claimed that issuing a warning would have effectively treated it as a publisher of user-generated content, a role protected under the CDA.

The court disagreed. It found that plaintiff’s claim did not depend on treating defendant as a publisher or speaker of third-party content. Instead, the claim arose from defendant’s alleged failure to act on its knowledge of the rapists’ activities. The court explained that the CDA does not provide blanket immunity for websites, especially when the obligation to warn does not require altering or removing user-generated content.

The Ninth Circuit reversed the district court’s dismissal and sent the case back for further proceedings, stating that the CDA did not block Plaintiff’s negligence claim.

Three reasons why this case matters:

  • Defining CDA Immunity: This decision clarified that the CDA does not protect websites from all legal claims, especially those unrelated to user-generated content.
  • Website Accountability: The case demonstrates that platforms can be held liable for failing to protect users from known risks.
  • Victim Protection: It shows that courts may balance user safety with the legal protections for online platforms.

Doe v. Internet Brands, Inc., 824 F.3d 846 (9th Cir., May 31, 2016)

Facebook’s Terms of Service protect it from liability for offensive fake account

Someone set up a bogus Facebook account and posted, without consent, images and video of Plaintiff engaged in a lewd act. Facebook finally deleted the account, but not until two days had passed and Plaintiff had threatened legal action.

Plaintiff sued anyway, alleging, among other things, intrusion upon seclusion, public disclosure of private facts, and infliction of emotional distress. In his complaint, Plaintiff emphasized language from Facebook’s Terms of Service that prohibited users from posting content or taking any action that “infringes or violates someone else’s rights or otherwise would violate the law.”

Facebook moved to dismiss the claims, making two arguments: (1) that the claims contradicted Facebook’s Terms of Service, and (2) that the claims were barred by the Communications Decency Act at 47 U.S.C. § 230. The court granted the motion to dismiss.

It looked to the following provision from Facebook’s Terms of Service:

Although we provide rules for user conduct, we do not control or direct users’ actions on Facebook and are not responsible for the content or information users transmit or share on Facebook. We are not responsible for any offensive, inappropriate, obscene, unlawful or otherwise objectionable content or information you may encounter on Facebook. We are not responsible for the conduct, whether online or offline, of any user of Facebook.

The court also examined the following language from the Terms of Service:

We try to keep Facebook up, bug-free, and safe, but you use it at your own risk. We are providing Facebook as is without any express or implied warranties including, but not limited to, implied warranties of merchantability, fitness for a particular purpose, and non-infringement. We do not guarantee that Facebook will always be safe, secure or error-free or that Facebook will always function without disruptions, delays or imperfections. Facebook is not responsible for the actions, content, information, or data of third parties, and you release us, our directors, officers, employees, and agents from any claims and damages, known and unknown, arising out of or in any way connected with any claims you have against any such third parties.

The court found that because Plaintiff looked to the Terms of Service to support his claims against Facebook, he could not likewise disavow those portions of the Terms of Service that did not support his case. Because the Terms of Service said, among other things, that Facebook was not responsible for the content its users post, and that a user uses the service at his or her own risk, the court could not place responsibility on Facebook for the offensive content.

Moreover, the court held that the Communications Decency Act shielded Facebook from liability. The CDA immunizes providers of interactive computer services against liability arising from content created by third parties. The court found that Facebook was an interactive computer service as contemplated under the CDA, that the information for which Plaintiff sought to hold Facebook liable was information provided by another information content provider, and that the complaint sought to treat Facebook as the publisher or speaker of that information.

Caraccioli v. Facebook, 2016 WL 859863 (N.D. Cal., March 7, 2016)

About the Author: Evan Brown is a Chicago attorney advising enterprises on important aspects of technology law, including software development, technology and content licensing, and general privacy issues.

Communications Decency Act shields Backpage from liability for violation of federal sex trafficking law


Three Jane Doe plaintiffs, who alleged they were victims of sex trafficking, filed suit against online classified ad provider Backpage.com (“Backpage”), asserting that Backpage violated the federal Trafficking Victims Protection Reauthorization Act (“TVPRA”) by structuring its website to facilitate sex trafficking and implementing rules and processes designed to actually encourage sex trafficking.

The district court dismissed the TVPRA claims for failure to state a claim, holding that the Communications Decency Act, at 47 U.S.C. § 230, provided immunity from the claims. Plaintiffs sought review with the First Circuit. On appeal, the court affirmed the lower court’s dismissal.

Section 230 principally shields website operators from being “treated as the publisher or speaker” of material posted by users of the site. In this case, the court held that plaintiffs’ claims were barred because the TVPRA claims “necessarily require[d] that the defendant be treated as the publisher or speaker of content provided by another.” Since plaintiffs were trafficked by means of the third-party advertisements on Backpage, there would have been no harm to them but for the content of the postings.

The court rejected plaintiffs’ attempts to characterize Backpage’s actions as “an affirmative course of conduct” distinct from the exercise of a website owner’s “traditional publishing or editorial functions.” The choice of which words or phrases could be displayed on the site, the decision not to reduce misinformation by changing its policies, and the decisions about how to structure its website and posting requirements were, in the court’s view, traditional publisher functions entitled to Section 230 protection.

Does v. Backpage.com, LLC, No. 15-1724 (1st Cir., March 14, 2016)

Evan Brown is a Chicago attorney advising enterprises on important aspects of technology law, including software development, technology and content licensing, and general privacy issues.

See also:
Seventh Circuit sides with Backpage in free speech suit against sheriff

Website operator was too involved with development of content to be immune under Section 230

Defendant started up a website to — in her own words — provide a place for others to have a dialogue and post information about their experiences at Plaintiff’s youth drug rehab facilities. Plaintiff found the content of Defendant’s website offensive, and sued for defamation and intentional interference with prospective economic advantage. Defendant filed a motion to strike under California’s Anti-SLAPP law. The court denied the motion.

In denying the Anti-SLAPP motion, the court found, among other things, that Plaintiff had established a probability of prevailing on most of its claims. This chance of prevailing withstood Defendant’s argument that she was shielded from liability by the Communications Decency Act.

This Act provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” 47 U.S.C. § 230(c)(1).

Defendant acknowledged that her defense was relevant only to the extent plaintiff alleged that comments posted by third parties on her website were defamatory.

She quoted Batzel v. Smith, 333 F.3d 1018 (9th Cir. 2003) to assert that “the exclusion of ‘publisher’ liability necessarily precludes liability for exercising the usual prerogative of publishers to choose among proffered material and to edit the material published while retaining its basic form and message.” She argued that she was entitled to Section 230 immunity because she was an exempt publisher — she either simply posted others’ statements or made minor edits to those statements before posting.

The court did not agree with Defendant’s characterization of her publishing activities.

It found that her posts would not lead a visitor to believe that she was quoting third parties. Rather, in the court’s view, Defendant adopted the statements of others and used them to create her own comments on the website. She posted her own articles and summarized the statements of others.

Moreover, Defendant did more than simply post whatever information third parties provided. She elicited statements through two surveys that contained specific questions to gather information about specific issues. The court found this to disqualify Defendant from Section 230 immunity under the holding of Fair Housing Council v. Roommates.com, LLC, 521 F.3d 1157 (9th Cir. 2008) (wherein the website operator was not immune under the Communications Decency Act because it created discriminatory questions and choice of answers).

Diamond Ranch Academy, Inc. v. Filer, 2016 WL 633351 (D. Utah, February 17, 2016)

Evan Brown is a Chicago attorney advising enterprises on important aspects of technology law, including software development, technology and content licensing, and general privacy issues.

Newspaper not liable for alleged defamatory letter to editor published online

The Appellate Court of Illinois has sided with a local newspaper in a defamation lawsuit brought against the paper over a reader’s allegedly defamatory letter to the editor. The court held that the Communications Decency Act (at 47 U.S.C. § 230) “absolved” the newspaper of liability for this appearance of third-party content on the newspaper’s website.

Plaintiff — a lawyer and self-identified civil rights advocate — sent several letters to local businesses claiming those businesses did not have enough handicapped parking spaces. Instead of merely asking the businesses to create those parking spaces, he demanded each one pay him $5,000 or face a lawsuit.

One local resident thought plaintiff’s demands were greedy and extortionate, and wrote a letter to the editor of the local newspaper covering the story. The newspaper posted the letter online. Both the newspaper and the letter’s author found themselves defendants in plaintiff’s defamation lawsuit.

The letter-writer settled with plaintiff, but the newspaper stayed in as a defendant and moved to dismiss, arguing that federal law immunized it from liability for content provided by the third-party letter-writer.

The lower court dismissed the defamation claim against the newspaper, holding that the Communications Decency Act (at 47 U.S.C. § 230) protected the newspaper from liability for the third-party letter-writer’s comments posted on the newspaper’s website.

Plaintiff sought review with the Appellate Court of Illinois. On appeal, the court affirmed the dismissal.

The Communications Decency Act (at 47 U.S.C. § 230(c)(1)) says that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” The appellate court found that the letter-writer was another information content provider that placed comments on the newspaper’s website. Therefore, it held that the Communications Decency Act “absolved” the newspaper from responsibility.

Straw v. Streamwood Chamber of Commerce, 2015 IL App (1st) 143094-U (September 29, 2015)

Evan Brown is an attorney in Chicago helping clients manage issues involving technology and new media.

Washington Supreme Court keeps victims’ lawsuit against Backpage.com moving forward

Plaintiffs – three minor girls – alleged that they were subjected to multiple instances of rape by adults who contacted them through advertisements posted on Backpage.com. Plaintiffs sued the website and its owner alleging various claims.

Defendants moved to dismiss the claims, arguing that Section 230 (47 U.S.C. § 230) shielded the website from liability arising from content posted by the site’s users. The lower court denied the motion to dismiss, finding that the site’s involvement went beyond passive hosting. Plaintiffs had claimed the website’s advertisement posting rules were intentionally designed to aid in evading law enforcement scrutiny, thereby facilitating the illegal trafficking and exploitation of minors. Defendants sought review with the court of appeals, which certified the question to the Washington state Supreme Court.

The Supreme Court affirmed the denial of the motion to dismiss. The court emphasized the necessity for further investigation into the website’s practices to determine the extent of its involvement in the alleged illegal activities. It found that plaintiffs’ allegations suggested that Backpage had specific content requirements and posting rules that, while outwardly appearing to comply with legal standards by prohibiting explicit content, were allegedly crafted in such a manner as to facilitate the concealment of illegal activities, including the sexual exploitation of minors.

J.S. v. Village Voice Media Holdings, LLC, 359 P.3d 714 (Wash., September 3, 2015)
