No Section 230 immunity for healthcare software provider

Company could be liable for modifications made to its software that provided abbreviated third-party warnings for prescription drugs.

Cases dealing with the Communications Decency Act often involve websites. See, for example, the recent decision from the Sixth Circuit involving thedirty.com, and earlier cases about Roommates.com and Amazon. But this case presented a rather novel proposed application of Section 230 immunity. The question was whether a provider of software that facilitated the delivery of prescription monographs (including warning information) could claim immunity. It’s unusual for Section 230 to show up in a products liability/personal injury action, but that is how it happened here.

Plaintiff suffered blindness and other injuries allegedly from taking medication she says she would not have taken had it been accompanied by certain warnings. She sued several defendants, including a software company that provided the technology whereby warnings drafted by third parties were provided to pharmacy retailers.

Defendant software company moved to dismiss on several grounds, including immunity under the Communications Decency Act, 47 U.S.C. 230. The trial court denied the motion to dismiss and defendant sought review. On appeal, the court affirmed the denial of the motion to dismiss, holding that Section 230 immunity did not apply.

At the request of the retailer that sold plaintiff her medicine, defendant software company modified its software to provide only abbreviated product warnings. Plaintiff’s claims against defendant arose from that modification.

Defendant argued that Section 230 immunity should protect it because defendant played no role in deciding the content of the product warnings. Instead, defendant was an independent provider of software that distributed drug information to pharmacy customers. Its software enabled pharmacies to access a third party’s database of product warnings. Defendant did not author the warnings but instead provided the information under an authorization in a data license agreement. Defendant thus functioned as a pass-through entity, distributing warnings prepared by third parties to retailers selling prescription drugs; those warnings were then printed and given to the individual customer when a prescription was filled.

The court found unpersuasive defendant’s claim that Section 230 immunized it from liability for providing electronic access to third party warnings. Section 230 provides, in relevant part, that (1) “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider” and (2) “[n]o cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.”

It held that plaintiff’s claim against defendant did not arise from defendant’s role as the software or service provider that enabled the retailer to access the third-party drafted warnings. Instead, the court found that plaintiff’s claim arose from defendant’s modification of its software to allow the retailer to distribute abbreviated drug monographs that automatically omitted warnings of serious risks. The appellate court agreed with the trial court, which found that “this is not a case in which a defendant merely distributed information from a third party author or publisher.” Instead, in the court’s view, by modifying the software so that only abbreviated warnings would appear, defendant participated in creating or modifying the content.

Hardin v. PDX, Inc., 2014 WL 2768863 (Cal. App. 1st Dist. June 19, 2014)

Sixth Circuit holds thedirty.com entitled to Section 230 immunity

Plaintiff Jones (a high school teacher and Cincinnati Bengals cheerleader) sued the website thedirty.com and its operator for defamation over a number of third party posts that said mean things about plaintiff. Defendants moved for summary judgment, arguing that the Communications Decency Act — 47 USC § 230(c)(1) — afforded them immunity from liability for the content created by third parties. Articulating a “goofy legal standard,” the district court denied the motion, and the case was tried twice. The first trial ended in a mistrial, and the second time the jury found in favor of plaintiff.

Defendants sought review with the Sixth Circuit Court of Appeals on the issue of whether the district court erred in denying defendants’ motion for judgment as a matter of law by holding that the CDA did not bar plaintiff’s state tort claims. On appeal, the court reversed the district court and ordered that judgment as a matter of law be entered in defendants’ favor.

Section 230(c)(1) provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” At its core, § 230 grants immunity to defendant service providers in lawsuits seeking to hold the service provider liable for its exercise of a publisher’s traditional editorial functions—such as deciding whether to publish, withdraw, postpone or alter content.

But the grant of immunity is not without limits. It applies only to the extent that an interactive computer service provider is not also the information content provider of the content at issue. A defendant is not entitled to protection from claims based on the publication of information if the defendant is responsible, in whole or in part, for the creation or development of the information.

The district court held that “a website owner who intentionally encourages illegal or actionable third-party postings to which he adds his own comments ratifying or adopting the posts becomes a ‘creator’ or ‘developer’ of that content and is not entitled to immunity.” Thus, the district court concluded that “[d]efendants, when they re-published the matters in evidence, had the same duties and liabilities for re-publishing libelous material as the author of such materials.”

The appellate court held that the district court’s test for what constitutes “creation” or “development” was too broad. Instead, the court looked to the Ninth Circuit’s decision in Fair Hous. Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157 (9th Cir. 2008) and adopted the material contribution test from that opinion:

[W]e interpret the term “development” as referring not merely to augmenting the content generally, but to materially contributing to its alleged unlawfulness. In other words, a website helps to develop unlawful content, and thus falls within the exception to section 230, if it contributes materially to the alleged illegality of the conduct.

In the Sixth Circuit’s language, “[A] material contribution to the alleged illegality of the content does not mean merely taking action that is necessary to the display of allegedly illegal content. Rather, it means being responsible for what makes the displayed content allegedly unlawful.”

In this case, the defendants did not author the statements at issue. But they did select the statements for publication. The court held that defendants did not materially contribute to the defamatory content of the statements simply because those posts were selected for publication. Moreover, the website did not require users to post illegal or actionable content as a condition of use. The website’s content submission form simply instructed users generally to submit content. The court found the tool to be neutral (both in orientation and design) as to what third parties submit. Accordingly, the website design did not constitute a material contribution to any defamatory speech that was uploaded.

Jones v. Dirty World, No. 13-5946 (6th Cir. June 16, 2014)


Website operators not liable for third party comments

Spreadbury v. Bitterroot Public Library, 2012 WL 734163 (D. Montana, March 6, 2012)

Plaintiff was upset at some local government officials, and ended up getting arrested for allegedly trespassing at the public library. Local newspapers covered the story, including on their websites. Some online commenters said mean things about plaintiff, so plaintiff sued a whole slew of defendants, including the newspapers (as website operators).

The court threw out the claims over the online comments. It held that the Communications Decency Act at 47 U.S.C. 230 immunized the website operators from liability over the third party content.

Plaintiff argued that the websites were not protected by Section 230 because they were not “providers of interactive computer services” of the same ilk as AOL and Yahoo. The court soundly rejected that argument. It found that the websites provided a “neutral tool” and offered a “simple generic prompt” for subscribers to comment about articles. The website operators did not develop or select the comments, require or encourage readers to make defamatory statements, or edit comments to make them defamatory.

School district has to stop filtering web content

PFLAG v. Camdenton R–III School Dist., 2012 WL 510877 (W.D.Mo. Feb. 16, 2012)

Several website publishers that provide supportive resources directed at lesbian, gay, bisexual, and transgender (LGBT) youth filed a First Amendment lawsuit against a school district over the district’s use of internet filtering software. Plaintiffs asked the court for an injunction against the district’s alleged practice of preventing students’ access to websites that expressed a positive viewpoint toward LGBT individuals.

The court granted a preliminary injunction. It found that by using URL Blacklist software, the district (despite its assertions to the contrary) engaged in intentional viewpoint discrimination, in violation of the website publishers’ First Amendment rights. The URL Blacklist software — which relied in large part on dmoz.org — classified positive materials about LGBT issues within the software’s “sexuality” filter, and it put LGBT-negative materials under “religion,” which were not blocked.

It found that the plaintiffs had a fair chance of success on the merits of their First Amendment claims. The school district had claimed it was simply trying to comply with a federal law that required the blocking of content harmful to minors. But the court found that the chosen method of filtering was not narrowly tailored to meet that interest.

One may wonder whether Section 230 of the Communications Decency Act could have protected the school district in this lawsuit. After all, 47 U.S.C. 230(c)(2)(A) provides that:

No provider or user of an interactive computer service shall be held liable on account of—

(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected. . . . (Emphasis added.)

Section 230 would probably not have been much help, because the plaintiffs were seeking injunctive relief, not money damages. An old case called Mainstream Loudoun v. Bd. of Trustees of Loudoun, 24 F. Supp. 2d 552 (E.D. Va. 1998) tells us that:

[Section] 230 provides immunity from actions for damages; it does not, however, immunize [a] defendant from an action for declaratory and injunctive relief. . . . If Congress had intended the statute to insulate Internet providers from both liability and declaratory and injunctive relief, it would have said so.

One could understand the undesirability of applying Section 230 to protect filtering of this sort even without the Mainstream Loudoun holding. If Section 230 completely immunized government-operated interactive computer service providers, allowing them to engage freely in viewpoint-based filtering, free speech would suffer in obvious ways. And it would be unfortunate to subject Section 230 to this kind of analysis, whereby it would face the severe risk of being unconstitutional as applied.

Video: my appearance on the news talking about isanyoneup.com

Last night I appeared in a piece that aired on the 9 o’clock news here in Chicago, talking about the legal issues surrounding isanyoneup.com. (That site is definitely NSFW and I’m not linking to it because it doesn’t deserve the page rank help.) The site presents some interesting legal questions, like whether and to what extent it is shielded by Section 230 of the Communications Decency Act for the harm that arises from the content it publishes (I don’t think it is shielded completely). The site also engages in some pretty blatant copyright infringement, and does not enjoy safe harbor protection under the Digital Millennium Copyright Act.


Amazon and other booksellers off the hook for sale of Obama drug use book

Section 230 of the Communications Decency Act shields Amazon, Barnes & Noble and Books-A-Million from some, but not all claims brought over promotion and sale of scandalous book about presidential candidate.

Parisi v. Sinclair, — F.Supp.2d —, 2011 WL 1206193 (D.D.C. March 31, 2011)

In 2008, Larry Sinclair made the ultra-scandalous claim that he had done drugs and engaged in sexual activity with then-presidential candidate Barack Obama. Daniel Parisi, owner of the infamous Whitehouse.com website, challenged Sinclair to take a polygraph test.

Not satisfied with the attention his outlandish claims had garnered, Sinclair self-published a book detailing his alleged misadventures. The book was available through print-on-demand provider Lightning Source.

Amazon, Barnes & Noble, and Books-A-Million (“BAM”) each offered Sinclair’s book for sale through their respective websites. (Barnes & Noble and BAM did not sell the book at their brick and mortar stores.) Each company’s website promoted the book using the following sentence:

You’ll read how the Obama campaign used internet porn king Dan Parisi and Ph.D. fraud Edward I. Gelb to conduct a rigged polygraph exam in an attempt to make the Sinclair story go away.

Parisi and his Whitehouse Network sued for, among other things, defamation and false light invasion of privacy. BAM moved to dismiss pursuant to Rule 12(b)(6) while Amazon and Barnes & Noble moved for summary judgment. The court granted the booksellers’ motions.

Section 230 applied because booksellers were not information content providers

The booksellers’ primary argument was that Section 230 of the Communications Decency Act shielded them from liability for plaintiffs’ claims concerning the promotional sentence. The court found in defendants’ favor on this point.

Section 230 provides in relevant part that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” The major issue in this case was whether the online booksellers had provided the information comprising the promotional sentence. The court found that neither the pleadings (as to BAM) nor the evidence (as to Amazon and Barnes & Noble) credibly disputed the booksellers’ position that they did not create or develop the promotional sentence.

But not so fast, Section 230, on some of those other claims!

The court’s treatment of Section 230 in relation to plaintiffs’ false light claim and the claims relating to the actual sale of the book was even more intriguing.

Plaintiffs argued that their false light claim was essentially a right of publicity claim. And Section 230(e)(2) says that immunity does not apply to claims pertaining to intellectual property. There is some confusion as to whether this exception to immunity applies only to federal intellectual property claims or to both federal and state IP claims. On one hand, Perfect 10, Inc. v. CCBill says that only federal intellectual property claims are excepted from immunity (which would mean that state law IP claims would be barred by Section 230). On the other hand, cases like Atlantic Recording Corp. v. Project Playlist, Doe v. Friendfinder Network and Universal Communication Systems v. Lycos suggest that both state and federal IP claims should withstand a Section 230 challenge.

In this case, the court indicated that it would have sided with the cases that provide for both federal and state claims making it past Section 230: “I am not inclined to extend the scope of the CDA immunity as far as the Ninth Circuit. . . . ”

But ultimately the court did not need to take sides as to the scope of Section 230(e)(2), as it found the use of plaintiff Parisi’s name fit into the newsworthiness privilege. One cannot successfully assert a misappropriation claim when his name or likeness is used in a newsworthy publication unless the use has “no real relationship” to the subject matter of the publication.

The court also seemed to constrain Section 230 immunity as it related to the online booksellers’ liability for selling the actual book. (Remember, the discussion above, in which the court found immunity to apply, dealt with the promotional sentence.) The court rejected defendants’ arguments that the reasoning of Gentry v. eBay should protect them. In Gentry, eBay was afforded immunity from violation of a warranty statute. But it merely provided the forum for the sale of goods, unlike the online booksellers in this case, which were the distributors of the actual allegedly defamatory book.

Even though Section 230 did not serve to protect BAM, Barnes & Noble and Amazon from liability for defamation arising from sales of the book, the court dismissed the defamation claim because of the lack of a showing that the booksellers acted with actual malice. It was undisputed that the plaintiffs were limited-purpose public figures. Persons with that status must show that the defendant acted with actual malice. That standard was not met here.

Yahoo not liable for blocking marketing email

Section 230 of Communications Decency Act (47 U.S.C. 230) shields Yahoo’s spam filtering efforts

Holomaxx v. Yahoo, 2011 WL 865794 (N.D.Cal. March 11, 2011)

Plaintiff provides email marketing services for its clients. It sends out millions of emails a day, many of those to recipients having Yahoo email addresses. Yahoo used its spam filtering technology to block many of the messages plaintiff was trying to send to Yahoo account users. So plaintiff sued Yahoo, alleging various causes of action such as intentional interference with prospective business advantage.

Yahoo moved to dismiss, arguing, among other things, that it was immune from liability under Section 230(c)(2) of the Communications Decency Act. The court granted the motion to dismiss.

Section 230(c)(2) provides, in relevant part, that “[n]o provider or user of an interactive computer service shall be held liable on account of … any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.”

Plaintiff argued that immunity should not apply here because Yahoo acted in bad faith by using “faulty filtering technology and techniques,” motivated “by profit derived from blocking both good and bad e-mails.” But the court found no factual basis to support plaintiff’s allegations that Yahoo used “cheap and ineffective technologies to avoid the expense of appropriately tracking and eliminating only spam email.”

The court rejected another of plaintiff’s arguments against applying Section 230, namely, that Yahoo should not be afforded blanket immunity for blocking legitimate business emails. Looking to the cases of Goddard v. Google and National Numismatic Certification v. eBay, plaintiff argued that the court should apply the canon of statutory construction known as ejusdem generis to find that legitimate business email should not be treated the same as the more nefarious types of content enumerated in Section 230(c)(2). (Content that is, for example, obscene, lewd, lascivious, filthy, excessively violent, harassing).

On this point the court looked to the sheer volume of the purported spam to conclude that Yahoo acted within Section 230’s protection when it blocked the messages: plaintiff acknowledged that it sent approximately six million emails per day through Yahoo’s servers and that at least 0.1% of those emails either were sent to invalid addresses or resulted in user opt-out. On an annual basis, that amounted to more than two million invalid or unwanted emails.
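If you want to check the court’s math, here is a quick back-of-the-envelope calculation (the six-million-per-day volume and the 0.1% figure come from the opinion; the 365-day year is my assumption):

# Rough sanity check of the volume figures discussed in Holomaxx v. Yahoo.
# Daily volume and 0.1% rate come from the opinion; the 365-day year is assumed.
emails_per_day = 6_000_000
bad_rate = 0.001                               # at least 0.1% invalid or opted-out
bad_per_day = emails_per_day * bad_rate        # 6,000 invalid or unwanted emails per day
bad_per_year = bad_per_day * 365               # 2,190,000 per year, i.e. "more than two million"
print(f"{bad_per_day:,.0f} per day; {bad_per_year:,.0f} per year")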

Section 230 shields Google from liability for anonymous defamation

Black v. Google Inc., 2010 WL 3746474 (N.D.Cal. September 20, 2010)

Back in August, the U.S. District Court for the Northern District of California dismissed a lawsuit against Google brought by two pro se plaintiffs, holding that the action was barred under the immunity provisions of 47 USC 230. That section says that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Plaintiffs had complained about a comment on Google (probably a review) disparaging their roofing business.

Plaintiffs filed an “objection” to the dismissal, which the court read as a motion to alter or amend under Fed. R. Civ. P. 59. The court denied plaintiffs’ motion.

In their “objection,” plaintiffs claimed — apparently without much support — that Congress did not intend Section 230 to apply in situations involving anonymous speech. The court did not buy this argument.

The court looked to the Ninth Circuit case of Carafano v. Metrosplash as an example of a website operator protected under Section 230 from liability for anonymous content: “To be sure, the website [in Carafano] provided neutral tools, which the anonymous dastard used to publish the libel, but the website did absolutely nothing to encourage the posting of defamatory content.” As in Carafano, Google was a passive conduit and could not be liable for failing to detect and remove the allegedly defamatory content.

Communications Decency Act immunizes hosting provider from defamation liability

Johnson v. Arden, — F.3d —, 2010 WL 3023660 (8th Cir. August 4, 2010)

The Johnsons sell exotic cats. They filed a defamation lawsuit after discovering that some other cat-fanciers said mean things about them on Complaintsboard.com. Among the defendants was the company that hosted Complaintsboard.com – InMotion Hosting.


The district court dismissed the case against the hosting company, finding that the Communications Decency Act at 47 U.S.C. §230 (“Section 230”) immunized the hosting provider from liability. The Johnsons sought review with the Eighth Circuit Court of Appeals. On appeal, the court affirmed the dismissal.

Though Section 230 immunity has been around since 1996, this was the first time the Eighth Circuit had been presented with the question.

Section 230 provides, in relevant part, that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” It also says that “[n]o cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.”

The Johnsons argued that Section 230 did not immunize the hosting company. Instead, they argued, it did just what it says – provides that a party in the position of the hosting company should not be treated as a publisher or speaker of information provided by third parties. The Johnsons argued that the host should be liable in this case regardless of Section 230, because under Missouri law, defendants can be jointly liable when they commit a wrong by concert of action and with common intent and purpose.

The court rejected the Johnsons’ argument, holding that Section 230 bars plaintiffs from making providers legally responsible for information that third parties created and developed. Adopting the Fourth Circuit’s holding in Nemet Chevrolet v. Consumeraffairs.com, the court held that “Congress thus established a general rule that providers of interactive computer services are liable only for speech that is properly attributable to them.”

No evidence in the record showed how the offending posts could be attributed to the hosting provider. It was undisputed that the host did not originate the material that the Johnsons deemed damaging.

Given this failure to show the content originated with the provider, the court found in favor of robust immunity, joining with the majority of other federal circuits that have addressed intermediary liability in the context of Section 230.

How Section 230 is like arson laws when it comes to enjoining website operators

The case of Blockowicz v. Williams, — F.Supp.2d —, 2009 WL 4929111 (N.D. Ill. December 21, 2009), which I posted on last week, is worthy of discussion in that it raises the question of whether website operators like Ripoff Report could get off too easily when they knowingly host harmful third party content. Immunity under 47 U.S.C. 230 is often criticized for going too far in shielding operators. Under Section 230, sites cannot be treated as the publisher or speaker of information provided by third party information content providers. This means that even when the site operator is put on notice of the content, it cannot face, for example, defamation liability for the continued availability of that content.

Don’t get me wrong: the Blockowicz case had nothing to do with Section 230. Although Ben Sheffner is routinely sharp in his legal analysis, I disagree with his assessment that Section 230 was the reason for the court’s decision. In the comments to Ben’s post that I just linked to, Ben gets into a conversation with Ripoff Report’s general counsel, who I believe correctly notes that the decision was not based on Section 230. Ben argues that had Section 230 not provided immunity, the plaintiffs would have been able to go after Ripoff Report directly, and therefore Section 230 is to blame. That’s kind of like saying that if arson were legal, plaintiffs could just go burn down Ripoff Report’s datacenter. But you don’t hear anyone blaming arson laws for this decision.

Even though Section 230 didn’t form the basis of the court’s decision in favor of Ripoff Report, the notion of a website operator “acting in concert” with its users is intriguing. Clearly the policy of Section 230 is to place some distance, legally speaking, between site operator and producer of user-generated content. And the whole idea behind the requirement in copyright law that infringement must arise from a volitional act and not an automatic action of the system is a first cousin to this issue. See, e.g., Religious Tech. Center v. Netcom, 907 F.Supp. 1361, 1370 (N.D. Cal. 1995) (“[T]here should still be some element of volition or causation which is lacking where a defendant’s system is merely used to create a copy by a third party”).

For the web to continue to develop, we are going to need this continued protection of the intermediary. We’re going to see functions of the semantic web appear with more frequency in our everyday online lives. From a practical perspective, there will be even more distance — a continuing divergence between a provider’s will and the nature of the content. So as we get into the technologies that will make the web smarter, and our experience of it more robust and helpful, we’ll need notions of intermediary immunity more and not less.

That notion of an increasing need for intermediary immunity underscores how important it is that intermediaries act responsibly. No doubt people misunderstand the holdings of cases like this one. By refusing to voluntarily take down obviously defamatory material, and challenging a court order to do so, Ripoff Report puts a bad taste in everyone’s mouth. Sure there’s the First Amendment and all that, but where’s a sense of reasonable decency? Sure there’s the idea that free flowing information supports democracy and all that, but has anyone stopped to think what could happen when the politicians get involved again?

Do not taunt Happy Fun Ball

We are fortunate that Congress was as equanimous and future-minded as it was in 1996 when it enacted the immunity provisions of Section 230. But results like the one in the Blockowicz case are going to be misunderstood. There’s a hue and cry already about this decision, in that it appears to leave no recourse. Section 230 wasn’t involved, but it still got the blame. Even the judge was “sympathetic to the [plaintiffs’] plight.”

So maybe we need, real quickly, another decision like the Roommates.com case, that reminds us that website operators don’t always get a free ride.
