YouTube not liable for aiding ISIS in Paris attack

The Communications Decency Act provided immunity to Google in a suit brought against it by the family of an American college student killed in the November 2015 attack.

Plaintiffs filed suit against Google (as operator of YouTube), alleging violations of federal laws that prohibit providing material support to terrorists. The claims arose from the November 2015 Paris attack carried out by ISIS. Plaintiffs argued that the YouTube platform, among other things, aided in recruitment and provided ISIS the means to distribute messages about its activities.

Google moved to dismiss the lawsuit, arguing that Section 230 of the Communications Decency Act (47 U.S.C. 230) provided immunity from suit. The court granted the motion to dismiss.

Section 230 Generally

Section 230(c) provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Accordingly, Section 230 precludes liability that treats a website as the publisher or speaker of information users provide on the website, protecting websites from liability for material posted on the website by someone else.

JASTA Did Not Repeal Section 230 Immunity

In response to Google’s arguments in favor of Section 230 immunity, plaintiffs first argued that a recent federal statute – the Justice Against Sponsors of Terrorism Act, or “JASTA” – effectively repealed the immunity conferred on interactive computer services by Section 230. Plaintiffs focused on language in the statute stating that its purpose “is to provide civil litigants with the broadest possible basis, consistent with the Constitution of the United States, to seek relief” against terrorists and those who assist them.

The court rejected plaintiffs’ arguments that JASTA repealed Section 230 immunity. Significantly, the statute did not expressly repeal Section 230’s protections, nor did it do so implicitly by evincing any “clear and manifest” congressional intent to repeal any part of the Communications Decency Act.

Section 230 Need Not Be Applied Outside the United States

Plaintiffs also argued that Section 230 immunity did not arise because the Communications Decency Act should not apply outside the territorial jurisdiction of the United States. According to plaintiffs, Google provided support and resources to ISIS outside the United States (in Europe and the Middle East), ISIS’s use of Google’s resources was outside the United States, and the Paris attacks and plaintiffs’ relative’s death took place outside the United States.

The court rejected this argument, holding that Section 230’s focus is on limiting liability. The application of the statute to achieve that objective must occur where the immunity is needed, namely, at the place of litigation. Because both the potential for liability and the application of immunity arose in the United States, there was no need to apply Section 230 “extraterritorially.”

Immunity Protected Google

Google argued that plaintiffs’ claims sought to treat it as the publisher or speaker of the offending ISIS content, thus satisfying one of the requirements for Section 230 immunity. Plaintiffs countered that their lawsuit did not depend on the characterization of Google as the publisher or speaker of ISIS’s content, because their claims focused on Google’s violations of the federal criminal statutes that bar the provision of material support to terrorists.

But the court found that the conduct Google was accused of — among other things, failing to ensure that ISIS members who had been kicked off the platform could not re-establish accounts — fit within the traditional editorial functions of a website operator. Accordingly, despite plaintiffs’ characterization of their claims, the court found such claims to be an attempt to treat Google as the publisher or speaker of the ISIS videos.

The court similarly rejected plaintiffs’ arguments that Section 230 immunity should not apply because, by appending advertisements to some of the ISIS videos, Google became an “information content provider” itself, and thus responsible for the videos. This argument failed primarily because the content of the advertisements (which themselves were provided by third parties) did not contribute to the unlawfulness of the content of the videos.

Gonzalez v. Google, Inc., — F.Supp.3d —, 2017 WL 4773366 (N.D. Cal., October 23, 2017)

About the Author: Evan Brown is a Chicago technology and intellectual property attorney. Call Evan at (630) 362-7237, send email to ebrown [at] internetcases.com, or follow him on Twitter @internetcases. Read Evan’s other blog, UDRP Tracker, for information about domain name disputes.

Google and YouTube protected by Section 230

The case of Weerahandi v. Shelesh is a classic example of how Section 230 (a provision of the Communications Decency Act (CDA), found at 47 U.S.C. § 230) shields online intermediaries from tort liability arising from content posted by their users.

Background Facts

Plaintiff, a YouTuber, filed a pro se lawsuit for, among other things, defamation, against a number of other YouTubers as well as Google and YouTube. The allegations arose from a 2013 incident in which one of the individual defendants sent what plaintiff believed to be a “false and malicious” DMCA takedown notice to YouTube. One of the defendants later took the contact information plaintiff was required to provide in his counter-notification and allegedly disseminated it to others, who then allegedly published additional defamatory YouTube videos.

Plaintiff also named Google and YouTube as defendants for “failure to remove the videos” and for not taking “corrective action.” These parties moved to dismiss the complaint, claiming immunity under Section 230. The court granted the motion to dismiss.

Section 230’s Protections

Section 230 provides, in pertinent part that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” 47 U.S.C. § 230(c)(1). Section 230 also provides that “[n]o cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.” 47 U.S.C. § 230(e)(3).

The CDA also “proscribes liability in situations where an interactive service provider makes decisions ‘relating to the monitoring, screening, and deletion of content from its network.’” Obado v. Magedson, 612 Fed.Appx. 90, 94–95 (3d Cir. 2015). Courts have recognized that Congress conferred broad immunity upon internet companies by enacting the CDA, because the breadth of the internet precludes such companies from policing content as traditional media have. See Jones v. Dirty World Entm’t Recordings LLC, 755 F.3d 398, 407 (6th Cir. 2014); Batzel v. Smith, 333 F.3d 1018, 1026 (9th Cir. 2003); Zeran v. Am. Online, Inc., 129 F.3d 327, 330 (4th Cir. 1997); DiMeo v. Max, 433 F. Supp. 2d 523, 528 (E.D. Pa. 2006).

How Section 230 Applied Here

In this case, the court found that the CDA barred plaintiff’s claims against Google and YouTube. Both Google and YouTube were considered “interactive computer service[s].” Parker v. Google, Inc., 422 F. Supp. 2d 492, 551 (E.D. Pa. 2006). Plaintiff did not allege that Google or YouTube played any role in producing the allegedly defamatory content. Instead, Plaintiff alleged both websites failed to remove the defamatory content, despite his repeated requests.

Plaintiff did not cite any authority in his opposition to Google and YouTube’s motion, and instead argued that the CDA did not bar claims for the “failure to remove the videos” or to “take corrective action.” The court held that, to the contrary, the CDA expressly protects internet companies from such liability. Under the CDA, plaintiff could not assert a claim against Google or YouTube for decisions “relating to the monitoring, screening, and deletion of content from its network.” Obado, 612 Fed.Appx. at 94–95; 47 U.S.C. § 230(c)(1) (“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”). For these reasons, the court found the CDA barred plaintiff’s claims against Google and YouTube.

Weerahandi v. Shelesh, 2017 WL 4330365 (D.N.J. September 29, 2017)

Quora gets Section 230 victory in the Tenth Circuit

Pro se plaintiff Silver filed suit in federal court in New Mexico against the online question-and-answer website Quora, alleging that statements made by two different individuals concerning his professional services were defamatory. Quora moved to dismiss, arguing that the immunity provisions of the Communications Decency Act, at 47 U.S.C. § 230, shielded it from liability arising from content posted by its users. The district court granted the motion to dismiss. Plaintiff sought review with the Tenth Circuit Court of Appeals. On review, the court affirmed the lower court’s dismissal of the case.

Citing to its previous Section 230 precedent, Ben Ezra, Weinstein, & Co. v. Am. Online Inc., 206 F.3d 980 (10th Cir. 2000), the court held that Quora was a provider of “an interactive computer service,” that its actions forming the basis of alleged liability, namely, in hosting the content, were that of a “publisher or speaker,” and that the content giving rise to the alleged liability was from “another information content provider,” i.e., the users who posted the content.

Silver v. Quora, Inc., 2016 WL 6892146 (10th Cir., November 23, 2016)

Yelp not liable for allegedly defamatory customer reviews

In a recent case having an outcome that should surprise no one, the United States Court of Appeals for the Ninth Circuit has affirmed a lower court’s decision that held Yelp immune from liability under the Communications Decency Act (47 U.S.C. 230 – the “CDA”) over customer reviews that were allegedly defamatory.

Plaintiff sued Yelp for violations of RICO and the Washington Consumer Protection Act, as well as for libel under Washington law. Yelp moved to dismiss for failure to state a claim upon which relief may be granted. The lower court found that plaintiff had failed to allege any facts that plausibly suggested Yelp was responsible for the content, and therefore dismissed the case. Plaintiff sought review with the Ninth Circuit. On appeal, the court affirmed.

The appellate court observed that plaintiff’s complaint, which he filed pro se, “pushed the envelope” of creative pleading. The court observed that plaintiff cryptically – “to the point of opacity” – alleged that Yelp was the one that created and developed the offending content. The court declined to open the door to such “artful skirting” of the Communications Decency Act’s safe harbor provision.

The key question before the court was whether the alleged defamatory reviews were provided by Yelp or by another information content provider. CDA immunity does not extend to situations where the web site itself is responsible for the creation or development of the offending content. The immunity protects providers or users of interactive computer services when the claims being made against them seek to treat them as a publisher or speaker of the information provided by another information content provider.

In this case, the court found that a careful reading of plaintiff’s complaint revealed that he never specifically alleged that Yelp created the content of the allegedly defamatory posts. Rather, plaintiff pled that Yelp adopted them from another website and transformed them into its own stylized promotions. The court found that these “threadbare” allegations of Yelp’s fabrication of allegedly defamatory statements were implausible on their face and were insufficient to avoid immunity under the Communications Decency Act. The court was careful to note that CDA immunity does not extend to content created or developed by an interactive computer service. “But the immunity in the CDA is broad enough to require plaintiffs alleging such a theory to state the facts plausibly suggesting the defendant fabricated content under a third party’s identity.”

The plaintiff had alleged in part that Yelp’s rating system and its use by the author of the allegedly defamatory content resulted in the creation or development of information by Yelp. The court rejected this argument, finding that the rating system did “absolutely nothing to enhance the defamatory sting of the message beyond the words offered by the user.” The court further observed that the star rating system was best characterized as a neutral tool operating on voluntary inputs that did not amount to content development or creation.

Finally, the court addressed plaintiff’s cryptic allegations that Yelp should be held liable for republishing the alleged defamatory content as advertisements or promotions on Google. A footnote in the opinion states that plaintiff was not clear whether the alleged republication was anything more than the passive indexing of Yelp reviews by the Google crawler. The decision’s final outcome, however, does not appear to depend on whether Google indexed that content as Yelp passively stood by or whether Yelp affirmatively directed the content to Google. “Nothing in the text of the CDA indicates that immunity turns on how many times an interactive computer service publishes information provided by another information content provider.” In the same way that Yelp would not be liable for posting user generated content on its web site, it would not be liable for disseminating the same content in essentially the same format to a search engine. “Simply put, proliferation and dissemination of content does not equal creation or development of content.”

Kimzey v. Yelp! Inc., — F.3d —, 2016 WL 4729492 (9th Cir. September 12, 2016)

Communications Decency Act shields Backpage from liability for violation of federal sex trafficking law

Three Jane Doe plaintiffs, who alleged they were victims of sex trafficking, filed suit against online classified ad provider Backpage.com (“Backpage”), asserting that Backpage violated the federal Trafficking Victims Protection Reauthorization Act (“TVPRA”) by structuring its website to facilitate sex trafficking and implementing rules and processes designed to actually encourage sex trafficking.

The district court dismissed the TVPRA claims for failure to state a claim, holding that the Communications Decency Act, at 47 U.S.C. §230, provided immunity from the claims. Plaintiffs sought review with the First Circuit. On appeal, the court affirmed the lower court’s dismissal.

Section 230 principally shields website operators from being “treated as the publisher or speaker” of material posted by users of the site. In this case, the court held that plaintiffs’ claims were barred because the TVPRA claims “necessarily require[d] that the defendant be treated as the publisher or speaker of content provided by another.” Because the plaintiffs were trafficked by means of the third-party advertisements on Backpage, there would have been no harm to them but for the content of the postings.

The court rejected plaintiffs’ attempts to characterize Backpage’s actions as “an affirmative course of conduct” distinct from the exercise of the “traditional publishing or editorial functions” of a website owner. The choice of what words or phrases could be displayed on the site, the decision not to reduce misinformation by changing its policies, and the decisions about how to structure its website and posting requirements were, in the court’s view, traditional publisher functions entitled to Section 230 protection.

Does v. Backpage.com, LLC, No. 15-1724 (1st Cir., March 14, 2016)

See also:
Seventh Circuit sides with Backpage in free speech suit against sheriff

Website operator was too involved with development of content to be immune under Section 230

Defendant started up a website to — in her own words — provide a place for others to have a dialogue and post information about their experiences at Plaintiff’s youth drug rehab facilities. Plaintiff found the content of Defendant’s website offensive, and sued for defamation and intentional interference with prospective economic advantage. Defendant filed a motion to strike under California’s Anti-SLAPP law. The court denied the motion.

In denying the Anti-SLAPP motion, the court found, among other things, that Plaintiff had established a probability of prevailing on most of its claims. This chance of prevailing withstood Defendant’s argument that she was shielded from liability by the Communications Decency Act.

This Act provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” 47 U.S.C. § 230(c)(1).

Defendant acknowledged that her defense was relevant only to the extent that plaintiff was alleging that comments posted by third parties on her website were defamatory.

She quoted Batzel v. Smith, 333 F.3d 1018 (9th Cir. 2003) to assert that “the exclusion of ‘publisher’ liability necessarily precludes liability for exercising the usual prerogative of publishers to choose among proffered material and to edit the material published while retaining its basic form and message.” She argued that she was entitled to Section 230 immunity because she was an exempt publisher — she either simply posted others’ statements or made minor edits to those statements before posting.

The court did not agree with Defendant’s characterization of her publishing activities.

It found that her posts would not lead a visitor to believe that she was quoting third parties. Rather, in the court’s view, Defendant adopted the statements of others and used them to create her own comments on the website. She posted her own articles and summarized the statements of others.

Moreover, Defendant did more than simply post whatever information third parties provided. She elicited statements through two surveys that contained specific questions to gather information about specific issues. The court found this to disqualify Defendant from Section 230 immunity under the holding of Fair Housing Council v. Roommates.com, LLC, 521 F.3d 1157 (9th Cir. 2008) (wherein the website operator was not immune under the Communications Decency Act because it created discriminatory questions and choice of answers).

Diamond Ranch Academy, Inc. v. Filer, 2016 WL 633351 (D. Utah, February 17, 2016)

Newspaper not liable for alleged defamatory letter to editor published online

The Appellate Court of Illinois has sided with a local newspaper in a defamation lawsuit brought against the paper over a reader’s allegedly defamatory letter to the editor. The court held that the Communications Decency Act (at 47 U.S.C. § 230) “absolved” the newspaper of liability for this appearance of third-party content on the newspaper’s website.

Plaintiff — a lawyer and self-identified civil rights advocate — sent several letters to local businesses claiming those businesses did not have enough handicapped parking spaces. Instead of merely asking the businesses to create those parking spaces, he demanded each one pay him $5,000 or face a lawsuit.

One local resident thought plaintiff’s demands were greedy and extortionate, and wrote a letter to the editor of the local newspaper covering the story. The newspaper posted the letter online. Both the newspaper and the letter’s author found themselves as defendants in plaintiff’s defamation lawsuit.

The letter-writer settled with plaintiff, but the newspaper stayed in as a defendant and moved to dismiss, arguing that federal law immunized it from liability for content provided by the third party letter-writer.

The lower court dismissed the defamation claim against the newspaper, holding that the Communications Decency Act (at 47 U.S.C. §230) protected the newspaper from liability for the third party letter-writer’s comments posted on the newspaper’s website.

Plaintiff sought review with the Appellate Court of Illinois. On appeal, the court affirmed the dismissal.

The Communications Decency Act (at 47 U.S.C. § 230(c)(1)) says that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” The appellate court found that the letter-writer was another information content provider that placed comments on the newspaper’s website. Therefore, it held that the Communications Decency Act “absolved” the newspaper from responsibility.

Straw v. Streamwood Chamber of Commerce, 2015 IL App (1st) 143094-U (September 29, 2015)

Washington Supreme Court keeps victims’ lawsuit against Backpage.com moving forward

Plaintiffs – three minor girls – alleged that they were subjected to multiple instances of rape by adults who contacted them through advertisements posted on Backpage.com. Plaintiffs sued the website and its owner alleging various claims.

Defendants moved to dismiss the claims, arguing that Section 230 (47 U.S.C. § 230) shielded the website from liability arising from content posted by the site’s users. The lower court denied the motion to dismiss, finding that the site’s involvement went beyond passive hosting. Plaintiffs had claimed the website’s advertisement posting rules were intentionally designed to aid in evading law enforcement scrutiny, thereby facilitating the illegal trafficking and exploitation of minors. Defendants sought review with the court of appeals, which certified the question to the Washington state Supreme Court.

The Supreme Court affirmed the denial of the motion to dismiss. The court emphasized the necessity for further investigation into the website’s practices to determine the extent of its involvement in the alleged illegal activities. It found that plaintiffs’ allegations suggested that Backpage had specific content requirements and posting rules that, while outwardly appearing to comply with legal standards by prohibiting explicit content, were allegedly crafted in such a manner as to facilitate the concealment of illegal activities, including the sexual exploitation of minors.

J.S. v. Village Voice Media Holdings, LLC, 359 P.3d 714 (Wash., September 3, 2015)

Third Circuit upholds Communications Decency Act immunity for Google, Yahoo and others

Plaintiff filed suit against Google, Yahoo and some unknown (John Doe) defendants for defamation, tortious interference with contract, and negligent and intentional infliction of emotional distress based on various online postings. The district court dismissed the complaint, holding that the Communications Decency Act (47 U.S.C. §230) provided immunity to defendants over the third party content giving rise to the complaint. Section 230 provides, in relevant part, that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Because defendants were not the creators of the information, and the claims attempted to treat them as the publisher or speaker of that content, Section 230 barred the claims.

Kabbaj v. Google, Inc., 2015 WL 534864 (3rd Cir. Feb. 10, 2015)

When is it okay to use social media to make fun of people?

News out of California discusses a Facebook page called 530 Fatties that was created to collect photos of and poke fun at obese people. It’s a rude project, and it sets the context for discussing some intriguing legal and normative issues.

Apparently the site collects photos that are taken in public. One generally doesn’t have a privacy interest in being photographed while in public places. And that seems pretty straightforward if you stop and think about it — you’re in public after all. But should technology change that legal analysis? Mobile devices with good cameras connected to high speed broadband networks make creation, sharing and shaming much easier than it used to be. A population equipped with these means essentially turns all public space into a panopticon. Does that mean the individual should be given more of something-like-privacy when in public? If you think that’s crazy, consider it in light of what Justice Sotomayor wrote in her concurrence in the 2012 case of U.S. v. Jones: “I would ask whether people reasonably expect that their movements will be recorded and aggregated in a manner that enables [one] to ascertain, more or less at will, their political and religious beliefs, sexual habits, and so on.”

Apart from privacy harms, what else is at play here? For the same reasons that the combination of mobile cameras and social media jeopardizes traditional privacy assurances, it can also magnify the emotional harms inflicted on a person. The public shaming that modern technology occasions can inflict deeper wounds because of the greater spatial and temporal reach of the medium. One can now easily distribute a photo or other content to countless individuals, and since the web means the end of forgetting, that content may be around for much longer than the typical human memory.

Against these concerns are the free speech interests of the speaking parties. In the U.S. especially, it’s hardwired into our sensibilities that each of us has great freedom to speak and otherwise express ourselves. The traditional First Amendment analysis will protect speech — even if it offends — unless there is something truly unlawful about it. For example, there is no free speech right to defame, to distribute obscene materials, or to use “fighting words.” Certain forms of harassment also fall into the category of unprotected speech. How should we examine the role that technology plays in moving what would otherwise be playground-like bullying (like calling someone a fatty) into the realm of unlawful speech that can subject one to civil or even criminal liability? Is the impact of technology’s use even a valid issue to discuss?

Finally, we should examine the responsibility of the intermediaries here. A social media platform generally is going to be protected by the Communications Decency Act at 47 USC 230 from liability for third party content. But we should discuss the roles of the intermediary in terms other than pure legal ones. Many social media platforms are proactive in taking down otherwise lawful content that has the tendency to offend. The pervasiveness of social media underscores the power that these platforms have to shape normative values around what is appropriate behavior among individuals. This power is indeed potentially greater than any legal or governmental power to constrain the generation and distribution of content.
