Website operators not liable for third party comments

Spreadbury v. Bitterroot Public Library, 2012 WL 734163 (D. Montana, March 6, 2012)

Plaintiff was upset at some local government officials, and ended up getting arrested for allegedly trespassing at the public library. Local newspapers covered the story, including on their websites. Some online commenters said mean things about plaintiff, so plaintiff sued a whole slew of defendants, including the newspapers (as website operators).

The court threw out the claims over the online comments. It held that the Communications Decency Act at 47 U.S.C. 230 immunized the website operators from liability over the third party content.

Plaintiff argued that the websites were not protected by Section 230 because they were not “providers of interactive computer services” of the same ilk as AOL and Yahoo. The court soundly rejected that argument. It found that the websites provided a “neutral tool” and offered a “simple generic prompt” for subscribers to comment about articles. The website operators did not develop or select the comments, require or encourage readers to make defamatory statements, or edit comments to make them defamatory.

Six interesting technology law issues raised in the Facebook IPO

Patent trolls, open source, do not track, SOPA, PIPA and much, much more: Facebook’s IPO filing has a real zoo of issues.

The securities laws require that companies going public identify risk factors that could adversely affect the company’s stock. Facebook’s S-1 filing, which it sent to the SEC today, identified almost 40 such factors. A number of these risks are examples of technology law issues that almost any internet company would face, particularly companies whose product is the users.

(1) Advertising regulation. In providing detail about the nature of this risk, Facebook mentions “adverse legal developments relating to advertising, including legislative and regulatory developments” and “the impact of new technologies that could block or obscure the display of our ads and other commercial content.” Facebook is likely concerned about the various technological and legal restrictions on online behavioral advertising, whether in the form of mandatory opportunities for users to opt out of data collection or the more aggressive “do not track” idea. The value of the advertising is of course tied to its effectiveness, and any technological, regulatory or legislative measures to enhance user privacy are a risk to Facebook’s revenue.

(2) Data security. No one knows exactly how much information Facebook has about its users. Not only does it have all the content uploaded by its 845 million users, it has the information that could be gleaned from the staggering 100 billion friendships among those users. A data breach puts Facebook at risk of a PR backlash, regulatory investigations from the FTC, and civil liability to its users for negligence and other causes of action. But Facebook would not be left without remedy, having in its arsenal civil actions under the Computer Fraud and Abuse Act and the Stored Communications Act (among other laws) against the perpetrators. It is also likely the federal government would step in to enforce the criminal provisions of these acts as well.

(3) Changing laws. The section of the S-1 discussing this risk factor provides a laundry list of the various issues that online businesses face. Among them: user privacy, rights of publicity, data protection, intellectual property, electronic contracts, competition, protection of minors, consumer protection, taxation, and online payment services. Facebook is understandably concerned that changes to any of these areas of the law, anywhere in the world, could make doing business more expensive or, even worse, make parts of the service unlawful. Though not mentioned by name here, SOPA, PIPA, and do-not-track legislation are clearly on Facebook’s mind when it notes that “there have been a number of recent legislative proposals in the United States . . . that would impose new obligations in areas such as privacy and liability for copyright infringement by third parties.”

(4) Intellectual property protection. The company begins its discussion of this risk with a few obvious observations, namely, how the company may be adversely affected if it is unable to secure trademark, copyright or patent registration for its various intellectual property assets. Later in the disclosure, though, Facebook says some really interesting things about open source:

As a result of our open source contributions and the use of open source in our products, we may license or be required to license innovations that turn out to be material to our business and may also be exposed to increased litigation risk. If the protection of our proprietary rights is inadequate to prevent unauthorized use or appropriation by third parties, the value of our brand and other intangible assets may be diminished and competitors may be able to more effectively mimic our service and methods of operations.

(5) Patent troll lawsuits. Facebook notes that internet and technology companies “frequently enter into litigation based on allegations of infringement, misappropriation, or other violations of intellectual property or other rights.” But it goes on to give special attention to those “non-practicing entities” (read: patent trolls) “that own patents and other intellectual property rights,” which “often attempt to aggressively assert their rights in order to extract value from technology companies.” Facebook believes that as its profile continues to rise, especially in the glory of its IPO, it will increasingly become the target of patent trolls. For now it does not seem worried: “[W]e do not believe that the final outcome of intellectual property claims that we currently face will have a material adverse effect on our business.” Instead, those endeavors are a drain on resources: “[D]efending patent and other intellectual property claims is costly and can impose a significant burden on management and employees….” And there is also the risk that these lawsuits might turn out badly, and Facebook would have to pay judgments, get licenses, or develop workarounds.

(6) Tort liability for user-generated content. Facebook acknowledges that it faces, and will face, claims relating to information that is published or made available on the site by its users, including claims concerning defamation, intellectual property rights, rights of publicity and privacy, and personal injury torts. Though it does not specifically mention the robust immunity from liability over third party content provided by 47 U.S.C. 230, Facebook indicates a certain confidence in the protections afforded by U.S. law from tort liability. It is the international scene that gives Facebook concern here: “This risk is enhanced in certain jurisdictions outside the United States where our protection from liability for third-party actions may be unclear and where we may be less protected under local laws than we are in the United States.”

You have to hand it to the teams of professionals who have put together Facebook’s IPO filing. I suppose the billions of dollars at stake can serve as a motivation for thoroughness. In any event, the well-articulated discussion of these risks in the S-1 is an interesting read, and can serve to guide the many lesser-valued companies out there.

Video: my appearance on the news talking about isanyoneup.com

Last night I appeared in a piece that aired on the 9 o’clock news here in Chicago, talking about the legal issues surrounding isanyoneup.com. (That site is definitely NSFW and I’m not linking to it because it doesn’t deserve the page rank help.) The site presents some interesting legal questions, like whether and to what extent it is shielded by Section 230 of the Communications Decency Act for the harm that arises from the content it publishes (I don’t think it is shielded completely). The site also engages in some pretty blatant copyright infringement, and does not enjoy safe harbor protection under the Digital Millennium Copyright Act.

Here’s the video:

Amazon and other booksellers off the hook for sale of Obama drug use book

Section 230 of the Communications Decency Act shields Amazon, Barnes & Noble and Books-A-Million from some, but not all claims brought over promotion and sale of scandalous book about presidential candidate.

Parisi v. Sinclair, — F.Supp.2d —, 2011 WL 1206193 (D.D.C. March 31, 2011)

In 2008, Larry Sinclair made the ultra-scandalous claim that he had done drugs and engaged in sexual activity with then-presidential candidate Barack Obama. Daniel Parisi, owner of the infamous Whitehouse.com website, challenged Sinclair to take a polygraph test.

Not satisfied with the attention his outlandish claims had garnered, Sinclair self-published a book detailing his alleged misadventures. The book was available through print-on-demand provider Lightning Source.

Amazon, Barnes & Noble, and Books-A-Million (“BAM”) each offered Sinclair’s book for sale through their respective websites. (Barnes & Noble and BAM did not sell the book at their brick and mortar stores.) Each company’s website promoted the book using the following sentence:

You’ll read how the Obama campaign used internet porn king Dan Parisi and Ph.D. fraud Edward I. Gelb to conduct a rigged polygraph exam in an attempt to make the Sinclair story go away.

Parisi and his Whitehouse Network sued for, among other things, defamation and false light invasion of privacy. BAM moved to dismiss pursuant to Rule 12(b)(6) while Amazon and Barnes & Noble moved for summary judgment. The court granted the booksellers’ motions.

Section 230 applied because booksellers were not information content providers

The booksellers’ primary argument was that Section 230 of the Communications Decency Act shielded them from liability for plaintiffs’ claims concerning the promotional sentence. The court found in defendants’ favor on this point.

Section 230 provides in relevant part that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” The major issue in this case was whether the online booksellers had provided the information comprising the promotional sentence. The court found that neither the pleadings (as to BAM) nor the evidence (as to Amazon and Barnes & Noble) credibly supported the assertion that the booksellers created or developed the promotional sentence.

But not so fast, Section 230, on some of those other claims!

The court’s treatment of Section 230 in relation to plaintiffs’ false light claim and the claims relating to the actual sale of the book was even more intriguing.

Plaintiffs argued that their false light claim was essentially a right of publicity claim. And Section 230(e)(2) says that immunity does not apply to claims pertaining to intellectual property. There is some confusion as to whether this exception to immunity applies only to federal intellectual property claims or to both federal and state IP claims. On one hand, Perfect 10, Inc. v. CCBill says that only federal intellectual property claims are excepted from immunity (which would mean that state law IP claims would be barred by Section 230). On the other hand, cases like Atlantic Recording Corp. v. Project Playlist, Doe v. Friendfinder Network and Universal Communication System v. Lycos suggest that both state and federal IP claims should withstand a Section 230 challenge.

In this case, the court indicated that it would have sided with the cases that provide for both federal and state claims making it past Section 230: “I am not inclined to extend the scope of the CDA immunity as far as the Ninth Circuit. . . . ”

But ultimately the court did not need to take sides as to the scope of Section 230(e)(2), as it found the use of plaintiff Parisi’s name fit into the newsworthiness privilege. One cannot successfully assert a misappropriation claim when his name or likeness is used in a newsworthy publication unless the use has “no real relationship” to the subject matter of the publication.

The court also seemed to constrain Section 230 immunity as it related to the online booksellers’ liability for selling the actual book. (Remember, the discussion above, in which the court found immunity to apply, dealt with the promotional sentence.) The court rejected defendants’ arguments that the reasoning of Gentry v. eBay should protect them. In Gentry, eBay was afforded immunity from violation of a warranty statute. But it merely provided the forum for the sale of goods, unlike the online booksellers in this case, which were the distributors of the actual allegedly defamatory book.

Even though Section 230 did not serve to protect BAM, Barnes & Noble and Amazon from liability for defamation arising from sales of the book, the court dismissed the defamation claim because of the lack of a showing that the booksellers acted with actual malice. It was undisputed that the plaintiffs were limited-purpose public figures. Persons with that status must show that the defendant acted with actual malice. That standard was not met here.

Section 230 shields Google from liability for anonymous defamation

Black v. Google Inc., 2010 WL 3746474 (N.D.Cal. September 20, 2010)

Back in August, the U.S. District Court for the Northern District of California dismissed a lawsuit against Google brought by two pro se plaintiffs, holding that the action was barred under the immunity provisions of 47 USC 230. That section says that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Plaintiffs had complained about a comment on Google (probably a review) disparaging their roofing business.

Plaintiffs filed an “objection” to the dismissal, which the court read as a motion to alter or amend under Fed. R. Civ. P. 59. The court denied plaintiffs’ motion.

In their “objection,” plaintiffs claimed — apparently without much support — that Congress did not intend Section 230 to apply in situations involving anonymous speech. The court did not buy this argument.

The court looked to the Ninth Circuit case of Carafano v. Metrosplash as an example of a website operator protected under Section 230 from liability for anonymous content: “To be sure, the website [in Carafano] provided neutral tools, which the anonymous dastard used to publish the libel, but the website did absolutely nothing to encourage the posting of defamatory content.” As in Carafano, Google was a passive conduit and could not be liable for failing to detect and remove the allegedly defamatory content.

Yelp successful in defamation and deceptive acts and practices case

Reit v. Yelp, Inc., — N.Y.S.2d —, 2010 WL 3490167 (September 2, 2010)

Section 230 of the Communications Decency Act shielded site as interactive computer service; assertions regarding manipulation of reviews were not consumer oriented and therefore not actionable.

As I am sure you know, Yelp! is an interactive website designed to allow the general public to write, post, and view reviews about businesses, including professional ones, as well as restaurants and other establishments.

Lots of people and businesses that are the subject of negative reviews on sites like this get riled up and often end up filing lawsuits. Suits against website operators in cases like this are almost always unsuccessful. The case of Reit v. Yelp from a New York state court was no exception.

Plaintiff dentist sued Yelp and an unknown reviewer for defamation. He also sued Yelp under New York state law for “deceptive acts and practices”. Yelp moved to dismiss both claims. The court granted the motion.

Defamation claim – protection under Section 230

Interactive computer service providers are immunized from liability (i.e., they cannot be held responsible) for content that is provided by third parties. So long as the website is not an “information content provider” itself, any claim made against the website will be preempted by the Communications Decency Act, at 47 U.S.C. 230.

In this case, plaintiff claimed that Yelp selectively removed positive reviews of his dentistry practice after he contacted Yelp to complain about a negative review. He argued that this action made Yelp an information content provider (doing more than “simply selecting material for publication”) and therefore outside the scope of Section 230’s immunity. The court rejected this argument.

It likened the case to an earlier New York decision called Shiamili v. Real Estate Group of New York. In that case, as in this one, an allegation that a website operator kept and promoted bad content did not raise an inference that the operator had become an information content provider. The postings do not cease to be data provided by a third party merely because the construct and operation of the website might have some influence on the content of the postings.

So the court dismissed the defamation claim on grounds of Section 230 immunity.

Alleged deceptive acts and practices were not consumer oriented

The other claim against Yelp — for deceptive acts and practices — was intriguing, though the court did not let it stand. Plaintiff alleged that Yelp’s Business Owner’s Guide says that once a business signs up for advertising with Yelp, an “entirely automated” system screens out reviews that are written by less established users.

The problem with this, plaintiff claimed, was that the process was not automated with the help of algorithms, but was done by humans at Yelp. That divergence between what the Business Owner’s Guide said and Yelp’s actual practices, plaintiff claimed, was consumer-oriented conduct that was materially misleading, in violation of New York’s General Business Law Section 349(a).

This claim failed, however, because the court found that the statements made by Yelp in the Business Owner’s Guide were not consumer-oriented, but were addressed to business owners like plaintiff. Because the statements were not consumer-oriented, they did not violate the statute.


Communications Decency Act immunizes hosting provider from defamation liability

Johnson v. Arden, — F.3d —, 2010 WL 3023660 (8th Cir. August 4, 2010)

The Johnsons sell exotic cats. They filed a defamation lawsuit after discovering that some other cat-fanciers said mean things about them on Complaintsboard.com. Among the defendants was the company that hosted Complaintsboard.com – InMotion Hosting.

The district court dismissed the case against the hosting company, finding that the Communications Decency Act at 47 U.S.C. §230 (“Section 230”) immunized the hosting provider from liability. The Johnsons sought review with the Eighth Circuit Court of Appeals. On appeal, the court affirmed the dismissal.

Though Section 230 immunity has been around since 1996, this was the first time the Eighth Circuit had been presented with the question.

Section 230 provides, in relevant part, that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” It also says that “[n]o cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.”

The Johnsons argued that Section 230 did not immunize the hosting company. Instead, they argued, it did just what it says – provides that a party in the position of the hosting company should not be treated as a publisher or speaker of information provided by third parties. The Johnsons argued that the host should be liable in this case regardless of Section 230, because under Missouri law, defendants can be jointly liable when they commit a wrong by concert of action and with common intent and purpose.

The court rejected the Johnsons’ argument, holding that Section 230 bars plaintiffs from making providers legally responsible for information that third parties created and developed. Adopting the Fourth Circuit’s holding in Nemet Chevrolet v. Consumeraffairs.com, the court held that “Congress thus established a general rule that providers of interactive computer services are liable only for speech that is properly attributable to them.”

No evidence in the record showed how the offending posts could be attributed to the hosting provider. It was undisputed that the host did not originate the material that the Johnsons deemed damaging.

Given this failure to show the content originated with the provider, the court found in favor of robust immunity, joining with the majority of other federal circuits that have addressed intermediary liability in the context of Section 230.

Forwarder of defamatory email protected under Section 230

Hung Tan Phan v. Lang Van Pham, — Cal.Rptr.3d —, 2010 WL 658244 (Cal.App. 4 Dist. Feb. 25, 2010)

Defendant, a veteran of the Vietnamese military, forwarded to some other Vietnamese veterans an email that apparently defamed another veteran. He didn’t just forward the email, though. He added some commentary at the beginning, which said (translated from the original Vietnamese):

Everything will come out to the daylight, I invite you and our classmates to read the following comments of Senior Duc (Duc Xuan Nguyen) President of the Federation of Associations of the Republic of Vietnam Navy and Merchant Marine.

The person who was the subject of the defamatory email sued the forwarder. The trial court dismissed the case, holding that the defendant was immune from liability under the Communications Decency Act at 47 U.S.C. 230.

That section gives immunity from suit to users and providers of interactive computer services who are distributing information provided by a third party. More than three years ago, in Barrett v. Rosenthal, the California Supreme Court held that Section 230 immunity applies to one who further distributes the contents of a defamatory email message.

The plaintiff sought review with the California Court of Appeal. The court affirmed.

The court looked to the Roommates.com case, to which it attributed a test that requires a defendant’s own acts to materially contribute to the illegality of the internet message for Section 230 immunity to be lost.

In this case, the court held that the introductory remarks did not meet the material contribution test articulated in Roommates.com. The court found that “[a]ll [the defendant] said was: The truth will come out in the end. What will be will be. Whatever.”

How Section 230 is like arson laws when it comes to enjoining website operators

The case of Blockowicz v. Williams, — F.Supp.2d —, 2009 WL 4929111 (N.D. Ill. December 21, 2009), which I posted on last week is worthy of discussion in that it raises the question of whether website operators like Ripoff Report could get off too easily when they knowingly host harmful third party content. Immunity under 47 U.S.C. 230 is often criticized for going too far in shielding operators. Under Section 230, sites cannot be treated as the publisher or speaker of information provided by third party information content providers. This means that even when the site operator is put on notice of the content, it cannot face, for example, defamation liability for the continued availability of that content.

Don’t get me wrong — the Blockowicz case had nothing to do with Section 230. Although Ben Sheffner is routinely sharp in his legal analysis, I disagree with his assessment that Section 230 was the reason for the court’s decision. In the comments to Ben’s post that I just linked to, Ben gets into conversation with Ripoff Report’s general counsel, whom I believe correctly notes that the decision was not based on Section 230. Ben argues that had Section 230 not provided immunity, the plaintiffs would have been able to go after Ripoff Report directly, and therefore Section 230 is to blame. That’s kind of like saying if arson were legal, plaintiffs could just go burn down Ripoff Report’s datacenter. But you don’t hear anyone blaming arson laws for this decision.

Even though Section 230 didn’t form the basis of the court’s decision in favor of Ripoff Report, the notion of a website operator “acting in concert” with its users is intriguing. Clearly the policy of Section 230 is to place some distance, legally speaking, between site operator and producer of user-generated content. And the whole idea behind the requirement in copyright law that infringement must arise from a volitional act and not an automatic action of the system is a first cousin to this issue. See, e.g., Religious Tech. Center v. Netcom, 907 F.Supp. 1361, 1370 (N.D. Cal. 1995) (“[T]here should still be some element of volition or causation which is lacking where a defendant’s system is merely used to create a copy by a third party”).

For the web to continue to develop, we are going to need this continued protection of the intermediary. We’re going to see functions of the semantic web appear with more frequency in our everyday online lives. From a practical perspective, there will be even more distance — a continuing divergence between a provider’s will and the nature of the content. So as we get into the technologies that will make the web smarter, and our experience of it more robust and helpful, we’ll need notions of intermediary immunity more and not less.

That notion of an increasing need for intermediary immunity underscores how important it is that intermediaries act responsibly. No doubt people misunderstand the holdings of cases like this one. By refusing to voluntarily take down obviously defamatory material, and challenging a court order to do so, Ripoff Report puts a bad taste in everyone’s mouth. Sure there’s the First Amendment and all that, but where’s a sense of reasonable decency? Sure there’s the idea that free flowing information supports democracy and all that, but has anyone stopped to think what could happen when the politicians get involved again?

Do not taunt Happy Fun Ball

We are fortunate that Congress was as equanimous and future-minded as it was in 1996 when it enacted the immunity provisions of Section 230. But results like the one in the Blockowicz case are going to be misunderstood. There’s a hue and cry already about this decision, in that it appears to leave no recourse. Section 230 wasn’t involved, but it still got the blame. Even the judge was “sympathetic to the [plaintiffs’] plight.”

So maybe we need, real quickly, another decision like the Roommates.com case, that reminds us that website operators don’t always get a free ride.

Injunction against defamatory content could not reach website owner

Blockowicz v. Williams, — F.Supp.2d —, 2009 WL 4929111 (N.D. Ill. December 21, 2009)

(This is a case from last month that has already gotten some attention in the legal blogosphere, and is worth reporting on here in spite of the already-existing commentary.)

Plaintiffs sued two individual defendants for defamation over content those defendants posted online. The court entered an order of default after the defendants didn’t answer the complaint. The court also issued an injunction against the defendants, requiring them to take down the defamatory material.

When plaintiffs were unable to reach the defendants directly, they asked the websites on which the content was posted — MySpace, Facebook, Complaints Board and Ripoff Report — to remove the material.

All of the sites except Ripoff Report took down the defamatory content. Plaintiffs filed a motion with the court to get Ripoff Report to remove the material. Ripoff Report opposed the motion, arguing that Rule 65 (the federal rule pertaining to injunctions) did not give the court authority to bind Ripoff Report as a non-party. The court sided with Ripoff Report and denied the motion.

Federal Rule of Civil Procedure 65 states that injunctions bind the parties against whom they are issued as well as “other persons who are in active concert or participation with” those parties. In this case, the court looked to the Seventh Circuit opinion of S.E.C. v. Homa, 514 F.3d 674 (7th Cir. 2008) for guidance on the contours of Rule 65’s scope. Under Homa, a non-party can be bound by an injunction if it is “acting in concert” or is “legally identified” (like as an agent or employee) with the enjoined party.

Plaintiffs argued that Ripoff Report was acting in concert with the defamers. Plaintiffs looked to Ripoff Report’s terms of service, by which posters to the site give an exclusive copyright license to and agree to indemnify Ripoff Report. Those terms also state that Ripoff Report will not remove any content for any reason. Plaintiffs read this combination of terms to stand for some sort of arrangement whereby Ripoff Report agreed to be a safe haven for defamatory material.

The court rejected this argument, finding there was no evidence in the record that Ripoff Report intended to protect defamers. Moreover, there was no evidence that Ripoff Report had communicated with the defendants in any way since the entry of a permanent injunction, or otherwise worked to violate the earlier court order requiring defendants to remove the materials.

