Interactive websites supported exercise of personal jurisdiction

Plaintiff sued a California corporation in federal court in Utah. Defendant moved to dismiss, asserting, among other things, lack of personal jurisdiction. The court denied the motion.

The court found that it had specific personal jurisdiction over defendant. Plaintiff provided evidence that defendant ran a number of highly interactive websites, including at least two online stores. Defendant provided visitors with a shopping cart feature that allowed them to select multiple products for purchase. Visitors to defendant’s sites could purchase items over the websites using Google Checkout or a number of major credit cards. Defendant offered to sell products into Utah through its multiple internet stores. In short, defendant purposefully used its websites to reach a large number of potential buyers, including those in Utah, and benefited from that exposure.

Citing to Dedvukaj v. Maloney, 447 F.Supp.2d 813 (E.D.Mich. 2006), the court observed that “[s]ellers cannot expect to avail themselves of the benefits of the internet-created world market that they purposefully exploit and profit from without accepting the concomitant legal responsibilities that such an expanded market may bring with it.”

A.L. Enterprises Inc. v. Sebron, 2008 WL 4356958 (D. Utah, September 17, 2008) 

 

Subpoena to university in P2P case must give time to notify parents

UMG Recordings, Inc. v. Doe, No. 08-3999, 2008 WL 4104207 (N.D.Cal. September 4, 2008)

Plaintiff record companies, using Media Sentry, found the IP address of a John Doe file-sharing defendant, and filed suit against Doe in federal court for copyright infringement. As in any case where a defendant is known only by his or her IP address, the record companies needed some discovery to ascertain the name and physical address matching that IP address. But the federal rules of procedure say that without a court order, a party cannot seek discovery until the parties have conferred pursuant to Fed. R. Civ. P. 26(f).

So the record companies sought a court order allowing them to issue a subpoena to Doe’s Internet service provider prior to the Rule 26(f) conference. The court granted the order, but with a caveat.

The evidence showed that Doe was a student at the University of California, Santa Cruz. Under the Family Educational Rights and Privacy Act at 20 U.S.C. § 1232g, a college generally cannot disclose “any personally identifiable information in education records other than directory information.” There’s an exception to that rule when the college is answering a lawfully issued subpoena, provided that “parents and the students are notified of all such … subpoenas in advance of the compliance therewith by the educational institution or agency.”

The court granted the record companies’ motion for leave to serve the subpoena prior to the Rule 26(f) conference, but required that the subpoena’s return date “be reasonably calculated to permit the University to notify John Doe and John Doe’s parents if it chooses prior to responding to the subpoena.”

No CDA immunity for letting co-defendant use computer to post material

Capital Corp. Merchant Banking, Inc. v. Corporate Colocation, Inc., No. 07-1626, 2008 WL 4058014 (M.D.Fla., August 27, 2008)

Professor Goldman points us to a recent decision in a case where the plaintiff alleged that one of the individual defendants “allowed [a co-defendant] to use ‘a computer registered in her name’ to make . . . defamatory statements.” The defendants filed a 12(b)(6) motion to dismiss, arguing that the Communications Decency Act (CDA) at 47 U.S.C. 230 barred the claims. The court denied the motion.

With little analysis, the court cited to the 9th Circuit’s Roommates.com decision, holding that “[t]he CDA provides immunity for the removal of content, not the creation of the content.” While that is not an incorrect statement, it is troublesome in this context inasmuch as it tells half the story.

Yes, 47 U.S.C. 230(c) does provide protection to “Good Samaritan” operators of interactive computer services who remove offensive content. The user whose content has been removed would not have a cause of action against the operator who took down the content in good faith. See 47 U.S.C. 230(c)(2).

But 47 U.S.C. 230(c)(1) provides that no provider of an interactive computer service shall be treated as a publisher or speaker of any information provided by a third party. Courts have usually held that when a defamation plaintiff brings a claim against the operator of the computer service used to post defamatory content (who was not responsible for creating the content), such a claim is barred, as the plaintiff would not be able to satisfy the publication element of a defamation prima facie case.

Maybe in this situation the court found that the defendant who let a co-defendant use her computer did not meet the definition of a service provider as contemplated by the CDA. But it would have been nice to see that analysis written down, rather than having to merely surmise or speculate.

Veoh eligible for DMCA Safe Harbor

[Brian Beckham is a contributor to Internet Cases and can be contacted at brian.beckham [at] gmail dot com.]

Io Group, Inc. v. Veoh Networks, Inc., 2008 WL 4065872 (N.D.Cal. Aug. 27, 2008)

The U.S. District Court for the Northern District of California ruled that Veoh’s hosting of user-provided content is protected by the DMCA safe harbor provision, and that Veoh has no duty to police for potential copyright infringement on behalf of third parties; rather, it must act to remove infringing content when put on notice.

IO produces adult films; Veoh hosts, inter alia, its own “Internet TV channels” and user-posted content (much like YouTube). In June 2006, IO discovered clips from ten of its copyrighted films, ranging from 6 seconds to 40 minutes in length, hosted on Veoh. Rather than sending Veoh a “DMCA Notice & Takedown” letter, IO filed the instant copyright infringement suit. (Coincidentally, Veoh had already removed all adult content, including IO’s, sua sponte before the suit was filed.) Had Veoh received such a notice, so the story goes, it would have removed the content and terminated the posting individual’s account.

When a user submits a video for posting, Veoh’s system extracts certain metadata (e.g., file format and length), assigns a file number, extracts several still images (seen on the site as an icon), and converts the video to Flash. Prior to posting, Veoh’s employees randomly spot check videos for compliance with Veoh’s policies (e.g., that the content does not infringe third-party copyrights). On at least one occasion, such a spot check revealed infringing content (an unreleased movie), which was not posted.
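For the technically inclined, here is a minimal sketch, in Python, of the kind of automated intake pipeline the court describes. It is purely illustrative: the function and field names, the stand-in outputs, and the five percent spot-check rate are assumptions made up for this example, not details of Veoh’s actual system.

    # Purely illustrative model of the automated steps described by the court;
    # nothing here reflects Veoh's actual code or policies.
    import itertools
    import random

    _file_numbers = itertools.count(1)   # system-assigned file numbers
    _published = {}                      # stand-in for the hosted library

    def ingest_video(filename, length_seconds):
        """Extract metadata, assign a file number, pull still images,
        convert to Flash, and (sometimes) spot check before publishing."""
        metadata = {
            "format": filename.rsplit(".", 1)[-1],  # e.g., file format
            "length": length_seconds,               # e.g., duration
        }
        file_number = next(_file_numbers)
        icon = f"{file_number}_thumb.jpg"           # stand-in for extracted stills
        flash_copy = f"{file_number}.flv"           # stand-in for Flash conversion

        # Random pre-posting spot check for policy compliance; the 5%
        # sampling rate is an assumption for illustration only.
        if random.random() < 0.05 and _violates_policy(metadata):
            return None                             # flagged content is not posted

        _published[file_number] = {"meta": metadata, "icon": icon, "flash": flash_copy}
        return file_number

    def _violates_policy(metadata):
        # Placeholder for the human review step; always passes in this sketch.
        return False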

Veoh moved for summary judgment under the DMCA’s Safe Harbors, which “provide protection from liability for: (1) transitory digital network communications; (2) system caching; (3) information residing on systems or networks at the direction of users; and (4) information location tools.” Ellison v. Robertson, 357 F.3d 1072, 1076-77 (9th Cir. 2004). Finding that Veoh is a Service Provider under the DMCA, the Court had little trouble concluding that it qualified for the Safe Harbors. IO admitted that Veoh “(a) has adopted and informed account holders of its repeat infringer policy and (b) accommodates, and does not interfere with, ‘standard technical measures’ used to protect copyrighted works,” but took issue with the manner in which Veoh implemented its repeat infringer policy.

Veoh clearly established that it had a functioning DMCA Notice & Takedown system:

  • Veoh has identified its designated Copyright Agent to receive notification of claimed violations and included information about how and where to send notices of claimed infringement.
  • Veoh often responds to infringement notices the same day they are received.
  • When Veoh receives notice of infringement, after a first warning, the account is terminated and all content provided by that user disabled.
  • Veoh terminates access to other identical infringing files and permanently blocks them from being uploaded again.
  • Veoh has terminated over 1,000 users for copyright infringement.

The Court held that Veoh did not have a duty to investigate whether terminated users were re-appearing under pseudonyms, but that as long as it continued to effectively address alleged infringements, it continued to qualify for the DMCA Safe Harbors; moreover, it did not have to track users’ IP addresses to readily identify possibly fraudulent new user accounts.

The Court further noted that: “In essence, a service provider [Veoh] is eligible for safe harbor under section 512(c) if it (1) does not know of infringement; or (2) acts expeditiously to remove or disable access to the material when it (a) has actual knowledge, (b) is aware of facts or circumstances from which infringing activity is apparent, or (c) has received DMCA-compliant notice; and (3) either does not have the right and ability to control the infringing activity, or – if it does – that it does not receive a financial benefit directly attributable to the infringing activity.”

The Court found that (1) there was no question that Veoh did not know of the alleged infringement, since IO did not file a DMCA notice; (2) it acted expeditiously to remove user-posted infringing content; (3) it did not have actual knowledge of infringement; (4) it was not aware of infringing activity; and (5) it did not have the right and ability to control the infringing activity (the Court did not address any financial benefit).

In sum: the Court “[did] not find that the DMCA was intended to have Veoh shoulder the entire burden of policing third-party copyrights on its website (at the cost of losing its business if it cannot). Rather, the issue [was] whether Veoh [took] appropriate steps to deal with [alleged] copyright infringement.”

There is much speculation as to how, if at all, this case will affect the Viacom / YouTube case. YouTube praised the decision, while Viacom noted the differences. Each case turns on its own facts, but to the extent there are similarities, this decision is wind in YouTube’s sails.

Case is: Io Group, Inc. (Plaintiff) v. Veoh Networks, Inc. (Defendant)

Slamming Wikipedia’s reliability not enough in immigration case

Badasa v. Mukasey, — F.3d —, 2008 WL 3981817 (8th Cir. Aug. 29, 2008)

Illegal alien Badasa sought asylum in the United States. To establish her identity, she submitted to the Immigration Judge a “laissez-passer” issued by the Ethiopian government. Opposing the application for asylum, the Department of Homeland Security submitted a number of items, including a Wikipedia article, to show that a laissez-passer is merely a document issued for a one-time purpose based on information provided by the applicant. The Immigration Judge was not convinced that the laissez-passer established Badasa’s identity, and denied the application for asylum.

Badasa appealed to the Board of Immigration Appeals, which agreed that asylum should be denied. It soundly criticized Wikipedia as an unreliable source for establishing the meaning of the document at issue, but found there was enough other evidence to support the Immigration Judge’s conclusion that Badasa had failed to establish her identity. The Board of Immigration Appeals, however, failed to discuss this other evidence, thereby running afoul of the administrative law textbook case of SEC v. Chenery Corp., 318 U.S. 80 (1943).

So the Eighth Circuit sent the case back to the Board of Immigration Appeals to make additional findings. The court observed that the Board of Immigration Appeals found that “Badasa was not prejudiced by the [Immigration Judge’s] reliance on Wikipedia, but [the Board of Immigration Appeals] made no independent determination that Badasa failed to establish her identity.” In short, the Board of Immigration Appeals had focused only on why the use of Wikipedia made the case less “solid,” and never addressed whether the other evidence connected with the laissez-passer was solid enough on its own to establish identity.
