Executive order to clarify Section 230: a summary


Late yesterday President Trump took steps to make good on his promise to regulate online platforms like Twitter and Facebook, releasing a draft executive order to that end. Here is a summary of the key points. The draft order:

  • States that it is the policy of the U.S. to foster clear, nondiscriminatory ground rules promoting free and open debate on the Internet. It is the policy of the U.S. that the scope of Section 230 immunity should be clarified.
  • Argues that a platform becomes a “publisher or speaker” of content, and therefore not subject to Section 230 immunity, when it does not act in good faith to restrict access to content (in accordance with Section 230(c)(2)) that it considers to be “obscene, lewd, lascivious, filthy, excessively violent, harassing or otherwise objectionable.” The executive order argues that Section 230 “does not extend to deceptive or pretextual actions restricting online content or actions inconsistent with an online platform’s terms of service.”
  • Orders the Secretary of Commerce to petition the FCC, requesting that the FCC propose regulations to clarify the conditions around a platform’s “good faith” when restricting access or availability of content. In particular, the requested rules would examine whether the action was, among other things, deceptive, pretextual, inconsistent with the provider’s terms of service, the product of unreasoned explanation, or without meaningful opportunity to be heard.
  • Directs each federal executive department and agency to review its advertising and marketing spending on online platforms. Each is to provide a report in 30 days on: amount spent, which platforms supported, any viewpoint-based restrictions of the platform, assessment whether the platform is appropriate, and statutory authority available to restrict advertising on platforms not deemed appropriate.
  • States that it is the policy of the U.S. that “large social media platforms, such as Twitter and Facebook, as the functional equivalent of a traditional public forum, should not infringe on protected speech”.
  • Re-establishes the White House “Tech Bias Reporting Tool” that allows Americans to report incidents of online censorship. These complaints are to be forwarded to the DoJ and the FTC.
  • Directs the FTC to “consider” taking action against entities covered by Section 230 that restrict speech in ways that do not align with those entities’ public representations about those practices.
  • Directs the FTC to develop a publicly available report describing complaints about activity by Twitter and other “large internet platforms” that may violate the law in ways that implicate the policy that these are public fora and should not infringe on protected speech.
  • Establishes a working group with states’ attorneys general regarding enforcement of state statutes prohibiting online platforms from engaging in unfair and deceptive acts and practices. 
  • This working group is also to collect publicly available information for the creation and monitoring of user watch lists, based on users’ interactions with content and other users (likes, follows, time spent). This working group is also to monitor users based on their activity “off the platform”. (It is not clear whether that means “off the internet” or “elsewhere online”.)

Malware detection software provider gets important victory allowing it to flag unwanted driver installer

Despite a recent Ninth Circuit decision denying immunity to a malware detection provider for targeting a competitor’s software, a court has held that Section 230 protected Malwarebytes from liability for designating a software driver program as a potentially unwanted program.

Plaintiff provided software that works in real time in the background of the operating system to optimize processing and to locate and install missing and outdated software drivers. Defendant provided malware detection software designed to scan consumers’ computers and to report potentially unwanted programs. After defendant’s software categorized plaintiff’s software as a potentially unwanted program, plaintiff sued, asserting a number of business tort claims, including business disparagement, tortious interference and common law unfair competition.

Defendant moved to dismiss under 47 U.S.C. 230(c)(2)(B), which provides that no provider of an interactive computer service shall be held liable on account of any action taken to enable or make available to others the technical means to restrict access to material that the provider deems to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.

The court granted the motion to dismiss, holding that Malwarebytes was immune from suit under Section 230. It differentiated the case from the Ninth Circuit’s recent decision in Enigma Software Group USA, LLC v. Malwarebytes, Inc., 946 F.3d 1040 (9th Cir. 2019), in which the court held that Section 230 immunity did not protect Malwarebytes for designating a competitor’s anti-malware software as “otherwise objectionable”. In this case, the court found that plaintiff’s software did not make it a competitor to defendant. Since the parties were not direct competitors, the limitations on Section 230’s protection did not apply.

The case may bring a sigh of relief to those who, along with Professor Goldman, expressed concern that the Enigma case would make it more difficult for anti-malware providers to offer their services. Though Enigma did limit Section 230 protection for these vendors, this decision shows that Section 230 immunity in this space is not dead.

Asurvio LP v. Malwarebytes, Inc., 2020 WL 1478345 (N.D. Cal. March 26, 2020)


Ninth Circuit: Section 230 barred tortious interference claim

Amazon.com scored a Ninth Circuit win on Section 230 grounds when the court affirmed the lower court’s summary judgment against a pro se plaintiff’s claims for tortious interference with prospective and actual business relations and interference with an economic advantage. The claims apparently arose out of a third party posting a review on Amazon that plaintiff did not like. Citing Barnes v. Yahoo!, Inc., 570 F.3d 1096 (9th Cir. 2009), the court observed that the Communications Decency Act (at Section 230(c)(1)) provides immunity from liability if a claim “inherently requires the court to treat the defendant as the ‘publisher or speaker’ of content provided by another.” Plaintiff had failed to raise a genuine dispute of material fact as to whether Amazon was anything other than a publisher or speaker of content within the meaning of Section 230.

Sen v. Amazon.com, 2020 WL 708701 (9th Cir. February 12, 2020)

Intellectual property exception to CDA 230 immunity did not apply in case against Google

Plaintiff sued Google for false advertising and violations of New Jersey’s Consumer Fraud Act over Google’s provision of AdWords services for other defendants’ website, which plaintiff claimed sold counterfeit versions of plaintiff’s products. Google moved to dismiss these two claims, and the court granted the motion.

On the false advertising issue, the court held that plaintiff had failed to allege the critical element that Google was the party that made the alleged misrepresentations concerning the counterfeit products. 

As for the Consumer Fraud Act claim, the court held that Google enjoyed immunity from such a claim in accordance with the Communications Decency Act at 47 U.S.C. 230(c). 

Specifically, the court found: (1) Google’s services made Google the provider of an interactive computer service, (2) the claim sought to hold Google liable for the publishing of the offending ads, and (3) the offending ads were published by a party other than Google, namely, the purveyor of the allegedly counterfeit goods. CDA immunity applied because all three of these elements were met. 

The court rejected plaintiff’s argument that the New Jersey Consumer Fraud Act was an intellectual property statute and that therefore under Section 230(e)(2), CDA immunity did not apply. With immunity present, the court dismissed the consumer fraud claim. 

InvenTel Products, LLC v. Li, No. 19-9190, 2019 WL 5078807 (D.N.J. October 10, 2019)

About the Author: Evan Brown is a Chicago technology and intellectual property attorney. Call Evan at (630) 362-7237, send email to ebrown@internetcases.com, or follow him on Twitter @internetcases. Read Evan’s other blog, UDRP Tracker, for information about domain name disputes.

Facebook scores Section 230 win over claims brought by Russian page accused of meddling in 2016 U.S. presidential election

Plaintiffs owned and operated a Facebook page that Facebook shut down in 2018 because of concerns the page was associated with Russian interference with the 2016 U.S. presidential election. After getting shut down, plaintiffs sued Facebook alleging a number of claims, including:

  • damages under 42 U.S.C. §1983 for deprivation of a constitutional right by one acting under color of state law;
  • civil rights violations under California law;
  • breach of contract; and
  • breach of implied covenant of good faith and fair dealing.

Facebook moved to dismiss these claims under the Communications Decency Act at 47 U.S.C. §230. The court granted the motion to dismiss.

Section 230 immunizes defendants from liability if:

  • defendant is a provider or user of an interactive computer service;
  • the information for which plaintiff seeks to hold defendant liable is information provided by another information content provider; and
  • plaintiff’s claim seeks to hold defendant liable as the publisher or speaker of that information.

In this case, there was no dispute Facebook met the first two elements, i.e., it is a provider of an interactive computer service and the information (namely, the content of plaintiffs’ page) was provided by a party other than Facebook. The real dispute came under the third element.

Plaintiffs argued that Section 230 should not immunize Facebook because this case did not concern obscenity or any other form of unprotected speech. Instead, plaintiffs argued, the case concerned political speech that strikes at the heart of the First Amendment. The court rejected this argument, holding that immunity under the Communications Decency Act does not contain a political speech exception. The statutory text provides that no “provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” (Emphasis added.) No distinction is made between political speech and non-political speech.

Plaintiffs also argued that granting Facebook immunity would be counter to congressional intent behind the Communications Decency Act. But the court borrowed language from Barnes v. Yahoo!, Inc., 570 F.3d 1096 (9th Cir. 2009) on this point: “Both parties make a lot of sound and fury on the congressional intent of the immunity under section 230, but such noise ultimately signifies nothing. It is the language of the statute that defines and enacts the concerns and aims of Congress; a particular concern does not rewrite the language.” Looking to Fair Hous. Council of San Fernando Valley v. Roommates.Com, LLC, 521 F.3d 1157 (9th Cir. 2008), the court noted that Ninth Circuit case law interpreting the language of the Communications Decency Act has held that “activity that can be boiled down to deciding whether to exclude material that third parties seek to post online is perforce immune under section 230.”

Federal Agency of News LLC v. Facebook, Inc., 2019 WL 3254208 (N.D. Cal. July 20, 2019)

Section 230 protected Twitter from liability for deleting Senate candidate’s accounts

Plaintiff (an Arizona senate candidate) sued Twitter after it suspended four of plaintiff’s accounts. He brought claims for (1) violation of the First Amendment; (2) violation of federal election law; (3) breach of contract; (4) conversion; (5) antitrust; (6) negligent infliction of emotional distress; (7) tortious interference; and (8) promissory estoppel.

Twitter moved to dismiss on multiple grounds, including that Section 230(c)(1) of the Communications Decency Act (“CDA”), 47 U.S.C. § 230, rendered it immune from liability for each of plaintiff’s claims that sought to treat it as a publisher of third-party content.

The CDA protects from liability (1) any provider of an interactive computer service (2) whom a plaintiff seeks to treat as a publisher or speaker (3) of information provided by another information content provider.

The court granted the motion to dismiss, on Section 230 grounds, all of the claims except the antitrust claim (which it dismissed for other reasons). It held that Twitter is a provider of an interactive computer service. And plaintiff sought to treat Twitter as a publisher or speaker by trying to pin liability on it for deleting accounts, which is a quintessential activity of a publisher. The deleted accounts comprised information provided by another information content provider (i.e., not Twitter, but plaintiff himself).

Brittain v. Twitter, 2019 WL 2423375 (N.D. Cal. June 10, 2019)

Section 230 protected Google in lawsuit over blog post

Defendant used Google’s Blogger service to write a post – about plaintiffs’ business practices – that plaintiffs found objectionable. So plaintiffs sued Google in federal court for defamation, tortious interference with a business relationship, and intentional infliction of emotional distress. The lower court dismissed the case on grounds that the Communications Decency Act (at 47 U.S.C. §230) immunized Google from liability for the publication of third party content.

Plaintiffs sought review with the U.S. Court of Appeals for the D.C. Circuit. On appeal, the court affirmed the dismissal. Applying a three-part test the court developed in Klayman v. Zuckerberg, 753 F.3d 1354 (D.C. Cir. 2014) (which in turn applied analysis from the leading case of Zeran v. America Online, Inc., 129 F.3d 327 (4th Cir. 1997)), the court held that Section 230 entitled Google to immunity because: (1) Google was a “provider or user of an interactive computer service,” (2) the relevant blog post contained “information provided by another information content provider,” and (3) the complaint sought to hold Google liable as “the publisher or speaker” of the blog post.

The court rejected plaintiffs’ argument that in establishing and enforcing its Blogger Content Policy, Google influenced and thereby created the content it published. It held that Google’s role was strictly one of “output control” – because Google’s choice was limited to a “yes” or a “no” decision whether to remove the post, its action constituted “the very essence of publishing.” Since Section 230 immunizes online defendants against complaints seeking to hold them liable as the publisher of content, the lower court properly dismissed the action.

Bennett v. Google, LLC, 882 F.3d 1163 (D.C. Cir., February 23, 2018)


Google can, at least for now, disregard Canadian court order requiring deindexing worldwide

U.S. federal court issues preliminary injunction, holding that enforcement of Canadian order requiring Google to remove search results would run afoul of the Communications Decency Act (at 47 U.S.C. 230)

Canadian company Equustek prevailed in litigation in Canada against rival Datalink on claims relating to trade secret misappropriation and unfair competition. After the litigation, Equustek asked Google to remove Datalink search results worldwide. Google initially refused altogether, but after a Canadian court entered an injunction against Datalink, Google removed Datalink results from google.ca. A Canadian court then ordered Google to delist worldwide. Google complied but objected to that order, taking the case all the way up to the Canadian Supreme Court, which affirmed the lower courts’ orders requiring worldwide delisting.

So Google filed suit in federal court in the United States, seeking a declaratory judgment that being required to abide by the Canadian order would, among other things, be contrary to the protections afforded to interactive computer service providers under the Communications Decency Act, at 47 U.S.C. 230.

The court entered the preliminary injunction (i.e., it found in favor of Google pending a final trial on the merits), holding that (1) Google would likely succeed on its claim under the Communications Decency Act, (2) it would suffer irreparable harm in the absence of preliminary relief, (3) the balance of equities weighed in its favor, and (4) an injunction was in the public interest.

Section 230 of the Communications Decency Act immunizes providers of interactive computer services against liability arising from content created by third parties. It states that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” [More info about Section 230]

The court found that there was no question Google is a “provider” of an “interactive computer service.” Also, it found that Datalink—not Google—“provided” the information at issue. And finally, it found that the Canadian order would hold Google liable as the “publisher or speaker” of the information on Datalink’s websites. So the Canadian order treated Google as a publisher, and would impose liability for failing to remove third-party content from its search results. For these reasons, Section 230 applied.

Summarizing the holding, the court observed that:

The Canadian order would eliminate Section 230 immunity for service providers that link to third-party websites. By forcing intermediaries to remove links to third-party material, the Canadian order undermines the policy goals of Section 230 and threatens free speech on the global internet.

The case provides key insight into the evolving legal issues around cross-border enforcement of national court orders and governance of the global internet.

Google, Inc. v. Equustek Solutions, Inc., 2017 WL 5000834 (N.D. Cal. November 2, 2017)

YouTube not liable for aiding ISIS in Paris attack

The Communications Decency Act provided immunity to Google in a suit brought against it by the family of an American college student killed in the November 2015 attack.

Plaintiffs filed suit against Google (as operator of YouTube) alleging violation of federal laws that prohibit providing material support to terrorists, arising from the November 2015 Paris attack that ISIS carried out. Plaintiffs argued that the YouTube platform, among other things, aided in recruitment and provided ISIS the means to distribute messages about its activities.

Google moved to dismiss the lawsuit, arguing that Section 230 of the Communications Decency Act (47 U.S.C. 230) provided immunity from suit. The court granted the motion to dismiss.

Section 230 Generally

Section 230(c) provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Accordingly, Section 230 precludes liability that treats a website as the publisher or speaker of information users provide on the website, protecting websites from liability for material posted on the website by someone else.

JASTA Did Not Repeal Section 230 Immunity

In response to Google’s arguments in favor of Section 230 immunity, plaintiffs first argued that a recent federal statute – the Justice Against Sponsors of Terrorism Act, or “JASTA” – effectively repealed the immunity conferred to interactive computer services by Section 230. Plaintiffs focused on language in the statute that stated that its purpose “is to provide civil litigants with the broadest possible basis, consistent with the Constitution of the United States, to seek relief” against terrorists and those who assist them.

The court rejected plaintiffs’ arguments that JASTA repealed Section 230 immunity. Significantly, the statute did not expressly repeal Section 230’s protections, nor did it do so implicitly by evincing any “clear and manifest” congressional intent to repeal any part of the Communications Decency Act.

Section 230 Need Not Be Applied Outside the United States

Plaintiffs also argued that Section 230 immunity did not arise because the Communications Decency Act should not apply outside the territorial jurisdiction of the United States. According to plaintiffs, Google provided support and resources to ISIS outside the United States (in Europe and the Middle East), ISIS’s use of Google’s resources was outside the United States, and the Paris attacks and plaintiffs’ relative’s death took place outside the United States.

The court rejected this argument, holding that Section 230’s focus is on limiting liability. The application of the statute to achieve that objective must occur where the immunity is needed, namely, at the place of litigation. Since the potential for liability and the application of immunity were occurring in the United States, there was no need to apply Section 230 “extraterritorially”.

Immunity Protected Google

Google argued that plaintiffs’ claims sought to treat it as the publisher or speaker of the offending ISIS content, thus satisfying one of the requirements for Section 230 immunity. Plaintiffs countered that their lawsuit did not depend on the characterization of Google as the publisher or speaker of ISIS’s content, because their claims focused on Google’s violations of the federal criminal statutes that bar the provision of material support to terrorists.

But the court found that the conduct Google was accused of — among other things, failing to ensure that ISIS members who had been kicked off could not re-establish accounts — fit within the traditional editorial functions of a website operator. Accordingly, despite plaintiffs’ characterization of their claims, the court found such claims to be an attempt to treat Google as the publisher or speaker of the ISIS videos.

The court similarly rejected plaintiffs’ arguments that Section 230 immunity should not apply because, by appending advertisements to some of the ISIS videos, Google became an “information content provider” itself, and thus responsible for the videos. This argument failed primarily because the content of the advertisements (which themselves were provided by third parties) did not contribute to the unlawfulness of the content of the videos.

Gonzalez v. Google, Inc., — F.Supp.3d —, 2017 WL 4773366 (N.D. Cal., October 23, 2017)


Google and YouTube protected by Section 230

The case of Weerahandi v. Shelesh is a classic example of how Section 230 (a provision of the Communications Decency Act (CDA), found at 47 USC 230) shielded online intermediaries from alleged tort liability occasioned by their users.

Background Facts

Plaintiff was a YouTuber and filed a pro se lawsuit for, among other things, defamation, against a number of other YouTubers as well as Google and YouTube. The allegations arose from a situation back in 2013 in which one of the individual defendants sent what plaintiff believed to be a “false and malicious” DMCA takedown notice to YouTube. One of the defendants later took the contact information plaintiff had to provide in the counter-notification and allegedly disseminated that information to others who were alleged to have published additional defamatory YouTube videos.

Google and YouTube also got named as defendants for “failure to remove the videos” and for not taking “corrective action”. These parties moved to dismiss the complaint, claiming immunity under Section 230. The court granted the motion to dismiss.

Section 230’s Protections

Section 230 provides, in pertinent part that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” 47 U.S.C. § 230(c)(1). Section 230 also provides that “[n]o cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.” 47 U.S.C. § 230(e)(3).

The CDA also “proscribes liability in situations where an interactive service provider makes decisions ‘relating to the monitoring, screening, and deletion of content from its network.’” Obado v. Magedson, 612 Fed.Appx. 90, 94–95 (3d Cir. 2015). Courts have recognized that Congress conferred broad immunity upon internet companies by enacting the CDA, because the breadth of the internet precludes such companies from policing content as traditional media have. See Jones v. Dirty World Entm’t Recordings LLC, 755 F.3d 398, 407 (6th Cir. 2014); Batzel v. Smith, 333 F.3d 1018, 1026 (9th Cir. 2003); Zeran v. Am. Online, Inc., 129 F.3d 327, 330 (4th Cir. 1997); DiMeo v. Max, 433 F. Supp. 2d 523, 528 (E.D. Pa. 2006).

How Section 230 Applied Here

In this case, the court found that the CDA barred plaintiff’s claims against Google and YouTube. Both Google and YouTube were considered “interactive computer service[s].” Parker v. Google, Inc., 422 F. Supp. 2d 492, 551 (E.D. Pa. 2006). Plaintiff did not allege that Google or YouTube played any role in producing the allegedly defamatory content. Instead, plaintiff alleged both websites failed to remove the defamatory content, despite his repeated requests.

Plaintiff did not cite any authority in his opposition to Google and YouTube’s motion, and instead argued that the CDA did not bar claims for the “failure to remove the videos” or to “take corrective action.” The court held that to the contrary, the CDA expressly protected internet companies from such liability. Under the CDA, plaintiff could not assert a claim against Google or YouTube for decisions “relating to the monitoring, screening, and deletion of content from its network.” Obado, 612 Fed.Appx. at 94–95; 47 U.S.C. § 230(c)(1) (“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”). For these reasons, the court found the CDA barred plaintiff’s claims against Google and YouTube.

Weerahandi v. Shelesh, 2017 WL 4330365 (D.N.J. September 29, 2017)

