Section 230 shields Google from liability for anonymous defamation

Black v. Google Inc., 2010 WL 3746474 (N.D.Cal. September 20, 2010)

Back in August, the U.S. District Court for the Northern District of California dismissed a lawsuit against Google brought by two pro se plaintiffs, holding that the action was barred under the immunity provisions of 47 USC 230. That section says that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Plaintiffs had complained about a comment on Google (probably a review) disparaging their roofing business.

Plaintiffs filed an “objection” to the dismissal, which the court read as a motion to alter or amend under Fed. R. Civ. P. 59. The court denied plaintiffs’ motion.

In their “objection,” plaintiffs claimed — apparently without much support — that Congress did not intend Section 230 to apply in situations involving anonymous speech. The court did not buy this argument.

The court looked to the Ninth Circuit case of Carafano v. Metrosplash as an example of a website operator protected under Section 230 from liability for anonymous content: “To be sure, the website [in Carafano] provided neutral tools, which the anonymous dastard used to publish the libel, but the website did absolutely nothing to encourage the posting of defamatory content.” As in Carafano, Google was a passive conduit and could not be liable for failing to detect and remove the allegedly defamatory content.

Yelp successful in defamation and deceptive acts and practices case

Reit v. Yelp, Inc., — N.Y.S.2d —, 2010 WL 3490167 (September 2, 2010)

Section 230 of the Communications Decency Act shielded site as an interactive computer service; assertions regarding manipulation of reviews were not consumer oriented and therefore not actionable.

As I am sure you know, Yelp! is an interactive website designed to allow the general public to write, post, and view reviews about businesses, including professional ones, as well as restaurants and other establishments.

Lots of people and businesses that are the subject of negative reviews on sites like this get riled up and often end up filing lawsuits. Suits against website operators in cases like this are almost always unsuccessful. The case of Reit v. Yelp from a New York state court was no exception.

Plaintiff dentist sued Yelp and an unknown reviewer for defamation. He also sued Yelp under New York state law for “deceptive acts and practices”. Yelp moved to dismiss both claims. The court granted the motion.

Defamation claim – protection under Section 230

Interactive computer service providers are immunized from liability (i.e., they cannot be held responsible) for content that is provided by third parties. So long as the website is not an “information content provider” itself, any claim made against the website will be preempted by the Communications Decency Act, at 47 U.S.C. 230.

In this case, plaintiff claimed that Yelp selectively removed positive reviews of his dentistry practice after he contacted Yelp to complain about a negative review. He argued that this action made Yelp an information content provider (doing more than “simply selecting material for publication”) and therefore outside the scope of Section 230’s immunity. The court rejected this argument.

It likened the case to an earlier New York decision, Shiamili v. Real Estate Group of New York. In that case, as here, an allegation that a website operator kept and promoted bad content did not raise an inference that the operator had become an information content provider. Postings do not cease to be data provided by a third party merely because the construct and operation of the website might have some influence on their content.

So the court dismissed the defamation claim on grounds of Section 230 immunity.

Alleged deceptive acts and practices were not consumer oriented

The other claim against Yelp — for deceptive acts and practices — was intriguing, though the court did not let it stand. Plaintiff alleged that Yelp’s Business Owner’s Guide says that once a business signs up for advertising with Yelp, an “entirely automated” system screens out reviews that are written by less established users.

The problem with this, plaintiff claimed, was that the process was not automated with the help of algorithms, but was done by humans at Yelp. That divergence between what the Business Owner’s Guide said and Yelp’s actual practices, plaintiff claimed, was consumer-oriented conduct that was materially misleading, in violation of New York’s General Business Law Section 349(a).

This claim failed, however, because the court found that the statements made by Yelp in the Business Owner’s Guide were not consumer-oriented, but were instead addressed to business owners like plaintiff. Because the statements were not consumer-oriented, they did not violate the statute.



Communications Decency Act immunizes hosting provider from defamation liability

Johnson v. Arden, — F.3d —, 2010 WL 3023660 (8th Cir. August 4, 2010)

The Johnsons sell exotic cats. They filed a defamation lawsuit after discovering that some other cat-fanciers said mean things about them on Complaintsboard.com. Among the defendants was the company that hosted Complaintsboard.com – InMotion Hosting.


The district court dismissed the case against the hosting company, finding that the Communications Decency Act at 47 U.S.C. §230 (“Section 230”) immunized the hosting provider from liability. The Johnsons sought review with the Eighth Circuit Court of Appeals. On appeal, the court affirmed the dismissal.

Though Section 230 immunity has been around since 1996, this was the first time the Eighth Circuit had been presented with the question.

Section 230 provides, in relevant part, that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” It also says that “[n]o cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.”

The Johnsons argued that Section 230 did not immunize the hosting company. Instead, they argued, it did just what it says – provides that a party in the position of the hosting company should not be treated as a publisher or speaker of information provided by third parties. The Johnsons argued that the host should be liable in this case regardless of Section 230, because under Missouri law, defendants can be jointly liable when they commit a wrong by concert of action and with common intent and purpose.

The court rejected the Johnsons’ argument, holding that Section 230 bars plaintiffs from making providers legally responsible for information that third parties created and developed. Adopting the Fourth Circuit’s holding in Nemet Chevrolet v. Consumeraffairs.com, the court held that “Congress thus established a general rule that providers of interactive computer services are liable only for speech that is properly attributable to them.”

No evidence in the record showed how the offending posts could be attributed to the hosting provider. It was undisputed that the host did not originate the material that the Johnsons deemed damaging.

Given this failure to show the content originated with the provider, the court found in favor of robust immunity, joining with the majority of other federal circuits that have addressed intermediary liability in the context of Section 230.

Forwarder of defamatory email protected under Section 230

Hung Tan Phan v. Lang Van Pham, — Cal.Rptr.3d —, 2010 WL 658244 (Cal.App. 4 Dist. Feb. 25, 2010)

Defendant, a veteran of the Vietnamese military, forwarded to some other Vietnamese veterans an email that apparently defamed another veteran. He didn’t just forward the email, though. He added some commentary at the beginning, which said (translated from the original Vietnamese):

Everything will come out to the daylight, I invite you and our classmates to read the following comments of Senior Duc (Duc Xuan Nguyen) President of the Federation of Associations of the Republic of Vietnam Navy and Merchant Marine.

The person who was the subject of the defamatory email sued the forwarder. The trial court dismissed the case, holding that the defendant was immune from liability under the Communications Decency Act at 47 U.S.C. 230.

That section gives immunity from suit to users and providers of interactive computer services who are distributing information provided by a third party. More than three years ago, in Barrett v. Rosenthal, the California Supreme Court held that Section 230 immunity applies to one who further distributes the contents of a defamatory email message.

The plaintiff sought review with the California Court of Appeal. The court affirmed.

The court looked to the Roommates.com case, from which it drew the test that a defendant’s own acts must materially contribute to the alleged illegality of the internet message before Section 230 immunity is lost.

In this case, the court held that the introductory remarks did not meet the material contribution test articulated in Roommates.com. The court found that “[a]ll [the defendant] said was: The truth will come out in the end. What will be will be. Whatever.”


How Section 230 is like arson laws when it comes to enjoining website operators

The case of Blockowicz v. Williams, — F.Supp.2d —, 2009 WL 4929111 (N.D. Ill. December 21, 2009), which I posted on last week, is worthy of discussion in that it raises the question of whether website operators like Ripoff Report could get off too easily when they knowingly host harmful third party content. Immunity under 47 U.S.C. 230 is often criticized for going too far in shielding operators. Under Section 230, sites cannot be treated as the publisher or speaker of information provided by third party information content providers. This means that even when the site operator is put on notice of the content, it cannot face, for example, defamation liability for the continued availability of that content.

Don’t get me wrong — the Blockowicz case had nothing to do with Section 230. Although Ben Sheffner is routinely sharp in his legal analysis, I disagree with his assessment that Section 230 was the reason for the court’s decision. In the comments to Ben’s post that I just linked to, Ben gets into conversation with Ripoff Report’s general counsel, who, I believe, correctly notes that the decision was not based on Section 230. Ben argues that had Section 230 not provided immunity, the plaintiffs would have been able to go after Ripoff Report directly, and therefore Section 230 is to blame. That’s kind of like saying if arson were legal, plaintiffs could just go burn down Ripoff Report’s datacenter. But you don’t hear anyone blaming arson laws for this decision.

Even though Section 230 didn’t form the basis of the court’s decision in favor of Ripoff Report, the notion of a website operator “acting in concert” with its users is intriguing. Clearly the policy of Section 230 is to place some distance, legally speaking, between site operator and producer of user-generated content. And the whole idea behind the requirement in copyright law that infringement must arise from a volitional act and not an automatic action of the system is a first cousin to this issue. See, e.g., Religious Tech. Center v. Netcom, 907 F.Supp. 1361, 1370 (N.D. Cal. 1995) (“[T]here should still be some element of volition or causation which is lacking where a defendant’s system is merely used to create a copy by a third party”).

For the web to continue to develop, we are going to need this continued protection of the intermediary. We’re going to see functions of the semantic web appear with more frequency in our everyday online lives. From a practical perspective, there will be even more distance — a continuing divergence between a provider’s will and the nature of the content. So as we get into the technologies that will make the web smarter, and our experience of it more robust and helpful, we’ll need notions of intermediary immunity more and not less.

That notion of an increasing need for intermediary immunity underscores how important it is that intermediaries act responsibly. No doubt people misunderstand the holdings of cases like this one. By refusing to voluntarily take down obviously defamatory material, and challenging a court order to do so, Ripoff Report leaves a bad taste in everyone’s mouth. Sure there’s the First Amendment and all that, but where’s a sense of reasonable decency? Sure there’s the idea that free flowing information supports democracy and all that, but has anyone stopped to think what could happen when the politicians get involved again?


We are fortunate that Congress was as equanimous and future-minded as it was in 1996 when it enacted the immunity provisions of Section 230. But results like the one in the Blockowicz case are going to be misunderstood. There’s a hue and cry already about this decision, in that it appears to leave no recourse. Section 230 wasn’t involved, but it still got the blame. Even the judge was “sympathetic to the [plaintiffs’] plight.”

So maybe we need, real quickly, another decision like the Roommates.com case that reminds us website operators don’t always get a free ride.

Injunction against defamatory content could not reach website owner

Blockowicz v. Williams, — F.Supp.2d —, 2009 WL 4929111 (N.D. Ill. December 21, 2009)

(This is a case from last month that has already gotten some attention in the legal blogosphere, and is worth reporting on here in spite of the already-existing commentary.)

Plaintiffs sued two individual defendants for defamation over content those defendants posted online. The court entered an order of default after the defendants didn’t answer the complaint. The court also issued an injunction against the defendants, requiring them to take down the defamatory material.


When plaintiffs were unable to reach the defendants directly, they asked the websites on which the content was posted — MySpace, Facebook, Complaints Board and Ripoff Report — to remove the material.

All of the sites except Ripoff Report took down the defamatory content. Plaintiffs filed a motion with the court to get Ripoff Report to remove the material. Ripoff Report opposed the motion, arguing that Rule 65 (the federal rule pertaining to injunctions) did not give the court authority to bind Ripoff Report as a non-party. The court sided with Ripoff Report and denied the motion.

Federal Rule of Civil Procedure 65 states that injunctions bind the parties against whom they are issued as well as “other persons who are in active concert or participation with” those parties. In this case, the court looked to the Seventh Circuit opinion of S.E.C. v. Homa, 514 F.3d 674 (7th Cir. 2008) for guidance on the contours of Rule 65’s scope. Under Homa, a non-party can be bound by an injunction if it is “acting in concert” or is “legally identified” (such as an agent or employee) with the enjoined party.

Plaintiffs argued that Ripoff Report was acting in concert with the defamers. Plaintiffs looked to Ripoff Report’s terms of service, by which posters to the site give an exclusive copyright license to and agree to indemnify Ripoff Report. Those terms also state that Ripoff Report will not remove any content for any reason. Plaintiffs read this combination of terms to stand for some sort of arrangement whereby Ripoff Report agreed to be a safe haven for defamatory material.

The court rejected this argument, finding there was no evidence in the record that Ripoff Report intended to protect defamers. Moreover, there was no evidence that Ripoff Report had communicated with the defendants in any way since the entry of a permanent injunction, or otherwise worked to violate the earlier court order requiring defendants to remove the materials.



Expedited electronic discovery includes subpoena to ISP and imaging of defendants’ hard drives

Allcare Dental Management, LLC v. Zrinyi, No. 08-407, 2008 WL 4649131 (D. Idaho October 20, 2008)

Plaintiffs filed a defamation lawsuit against some known defendants as well as some anonymous John Doe defendants in federal court over statements posted to Complaintsboard.com. The plaintiffs did not know the names or contact information of the Doe defendants, so they needed to get that information from the Does’ Internet service provider. But the ISP would not turn that information over without a subpoena because of the restrictions of the Cable Communications Policy Act, 47 U.S.C. § 521 et seq. [More on the CCPA.]

Under Federal Rule of Civil Procedure 26(d)(1), a party generally may not seek discovery in a case until the parties have had a Rule 26(f) conference to discuss such things as discovery. Because of the Rule 26(d)(1) requirement, the plaintiffs found themselves in a catch-22 of sorts: how could they know with whom to have the Rule 26(f) conference if they did not know the defendants’ identities?

So the plaintiffs filed a motion with the court to allow a subpoena to issue to the ISP prior to the Rule 26(f) conference. Finding that there was good cause for the expedited discovery, the court granted the motion. It found that the subpoena was needed to ascertain the identities of the unknown defendants. [More on Doe subpoenas.] Furthermore, it was important to act sooner rather than later, because ISPs retain data for only a limited time.

The plaintiffs also contended that the known defendants would likely delete relevant information from their computer hard drives before the parties could engage in the ordinary process of discovery. So the plaintiffs’ motion also sought an order requiring the known defendants to turn over their computers to have their hard drives copied.

The court granted this part of the motion as well, ordering the known defendants to turn their computers over to the plaintiffs’ retained forensics professional immediately. The forensics professional was to make the copies of the hard drives and place those copies with the court clerk, not to be accessed or reviewed until stipulation of the parties or further order from the court.

No CDA immunity for letting co-defendant use computer to post material

Capital Corp. Merchant Banking, Inc. v. Corporate Colocation, Inc., No. 07-1626, 2008 WL 4058014 (M.D.Fla., August 27, 2008)

Professor Goldman points us to a recent decision in a case where the plaintiff alleged that one of the individual defendants “allowed [a co-defendant] to use ‘a computer registered in her name’ to make . . . defamatory statements.” The defendants filed a 12(b)(6) motion to dismiss, arguing that the Communications Decency Act (CDA) at 47 U.S.C. 230 barred the claims. The court denied the motion.

With little analysis, the court cited to the 9th Circuit’s Roommates.com decision, holding that “[t]he CDA provides immunity for the removal of content, not the creation of the content.” While that is not an incorrect statement, it is troublesome in this context inasmuch as it tells half the story.

Yes, 47 U.S.C. 230(c) does provide protection to “Good Samaritan” operators of interactive computer services who remove offensive content. The user whose content has been removed would not have a cause of action against the operator who took down the content in good faith. See 47 U.S.C. 230(c)(2).

But 47 U.S.C. 230(c)(1) provides that no provider of an interactive computer service shall be treated as a publisher or speaker of any information provided by a third party. Courts have usually held that when a defamation plaintiff brings a claim against the operator of the computer service used to post defamatory content (who was not responsible for creating the content), such a claim is barred, as the plaintiff would not be able to satisfy the publication element of a defamation prima facie case.

Maybe in this situation the court found that the defendant who let a co-defendant use her computer did not meet the definition of a service provider as contemplated by the CDA. But it would have been nice to see that analysis written down, rather than having to merely surmise or speculate.
