Exploiting blockchain software defect supports unjust enrichment claim

Most court cases involving blockchain have to do with securities regulation or some other business aspect of what the parties are doing. The case of Shin v. ICON Foundation, however, deals with the technology side of blockchain. The U.S. District Court for the Northern District of California recently issued an opinion having to do with how the law should handle a person who exploits a software flaw to quickly (and, as other members of the community claim, unfairly) generate tokens.

Exploiting software flaw to generate tokens

Mark Shin was a member of the ICON Community – a group that includes users who create and transact in the ICX cryptocurrency. The ICON Network hosts a delegated proof-of-stake blockchain protocol. The process by which delegates are selected for the environment’s governance involves ICX users “staking” tokens. As an incentive to participate in the process, ICX holders receive rewards that can be redeemed for more ICX. The system does not give rewards, however, when a user “unstakes” his or her tokens.

When a new version of the ICON Network software was released, Shin discovered that he was immediately awarded one ICX token each time he unstaked a token. Exploiting this software defect, he staked and unstaked tokens until he generated new ICX valued at the time at approximately $9 million.
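The alleged defect can be illustrated with a simplified sketch. This is purely hypothetical toy code – the class and method names are invented for illustration and are not drawn from the actual ICON codebase – but it shows the shape of the flaw: a reward credited on unstaking, rather than accruing only while tokens remain staked, lets a user mint tokens by looping through stake/unstake cycles.

```python
class Wallet:
    """Toy model of a staking wallet; not the actual ICON implementation."""

    def __init__(self, balance):
        self.balance = balance  # liquid tokens
        self.staked = 0         # tokens currently staked

    def stake(self, amount):
        self.balance -= amount
        self.staked += amount

    def unstake_buggy(self, amount):
        self.staked -= amount
        self.balance += amount
        # The defect: one reward token is credited immediately upon
        # unstaking, so each stake/unstake round trip is pure profit.
        self.balance += 1

# Repeating the cycle generates new tokens from nothing.
w = Wallet(balance=10)
for _ in range(1000):
    w.stake(1)
    w.unstake_buggy(1)
# w.balance is now 1010 – a net gain of 1000 tokens from the bug alone
```

In a correct implementation, rewards would accrue as a function of time staked and would stop (not trigger) on unstaking, so the round trip would be value-neutral.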

Bring in the lawyers

Other members of the community did not take kindly to Shin’s conduct, and took steps to mitigate the effect. Shin filed suit for conversion and trespass to chattel. And the members of the cryptocurrency community filed a counterclaim, asserting a number of theories against Shin, including a claim for unjust enrichment. Shin moved to dismiss the unjust enrichment counterclaim, arguing that it failed to state a claim upon which relief could be granted. In general, unjust enrichment occurs when a person has been unjustly conferred a benefit, including through fraud or mistake. Under California law (which applied in this case), the elements of unjust enrichment are (1) receipt of a benefit, and (2) unjust retention of the benefit at the expense of another.

Moving toward trial

In this case, the court disagreed with Shin’s arguments. It held that the members of the community had sufficiently pled a claim for unjust enrichment. It’s important to note that this opinion does not mean that Shin is liable for unjust enrichment – it only means that the facts as alleged, if they are proven true, support a viable legal claim. In other words, the opinion confirms that the law recognizes that Shin’s alleged conduct would be unjust enrichment. We will have to see whether Shin is actually found liable for unjust enrichment, either at the summary judgment stage or at trial.

Examining the elements of unjust enrichment, the court found that the alleged benefit to Shin was clear, and that the community members had adequately pled that Shin unjustly retained this benefit. The allegations supported the theory that Shin materially diluted the value of the tokens held by other members of the community, and that he “arrogated value to himself from the other members.” According to the members of the community, if Shin had not engaged in the alleged conduct, the present-day value of ICX would be even higher. (It will be interesting to see how that will be proven – perhaps one more knowledgeable than this author in crypto can weigh in.)

Shin v. ICON Foundation, 2021 WL 6117508 (N.D. Cal., December 27, 2021)

Copyright ownership transfers must be in writing

If you are hiring an independent contractor to create copyrightable subject matter, and you want to own the copyright in the resulting work product, be sure to have that contractor sign a written contract that specifically states that copyright ownership is being transferred. Even if you have paid the contractor for the work, and you both intend that ownership be transferred, the contractor will still own the copyright in the deliverables unless there is a writing, signed by the contractor, to the contrary. This is a key concern if your contractor has created subject matter that will be critical to your business – software, graphics, text, photos, any kind of protectable digital asset. If you do not secure ownership, the contractor may later object that you are using the works differently than was intended at the time of the contract, and claim infringement. Or the contractor could grant a license in the same work to another party, even one of your competitors.

The Copyright Act contains a couple of provisions that relate to this issue. The first one pertains to the definition of “work made for hire”. If an employee creates copyrightable subject matter within the scope of his or her employment, that is a work made for hire, and the employer owns the copyright. But note how that relates to employers and employees. Contractors are in a different category. There are other kinds of works that are “ordered or commissioned” that can be considered works made for hire, even if created by an independent contractor. But in any event, the Copyright Act says that these are works made for hire only “if the parties expressly agree in a written instrument signed by them that the work shall be considered a work made for hire.”

Let’s say you have not established that the contractor’s work is a “work made for hire”. You could still have the contractor assign his or her rights in the deliverables. Again, the Copyright Act requires this to be in writing. You cannot just agree on a handshake that ownership of copyright has been transferred. The statute provides that “[a] transfer of copyright ownership, other than by operation of law, is not valid unless an instrument of conveyance, or a note or memorandum of the transfer, is in writing and signed by the owner of the rights conveyed or such owner’s duly authorized agent.” Note that the contractor – the one making the assignment – has to sign the written document.

Paying attention to these issues on the front end of hiring an independent contractor will help ensure clear rights in the future, avoid future tangles and disagreements, and ultimately save time and money by avoiding costly dispute resolution.

Evan Brown is a technology and intellectual property attorney in Chicago. Twitter: @internetcases

How do we attribute value to an NFT?

How do we attribute value to an NFT? We can analyze this question from a number of perspectives. To start, we could draw a line of demarcation between categories of possessions that exist physically and those that exist intangibly. One intuitively understands how tangible things get value. This is often tied to the item’s usefulness. For example, a car has value because it transports. A knife is useful for cutting. A pen enables writing. We move out one level of abstraction and see that pieces of physical money (coins and bills) have value in how they are used to transact in goods and services.

Empty intangibility?

The value of intangible possessions requires more abstract thinking, but the reasonable person has no difficulty in grasping how that valuation works. One may possess a right or privilege even though he can’t hold it in his hands. Another may possess a digital good – think about premium skins in video games – but she cannot physically touch it. Understanding the value in these kinds of intangible things is not too challenging. So we can reject the notion that an NFT’s intangible nature means it has no value. But where do we go from here in exploring where an NFT’s value is derived?

Ain’t she a beauty?

Consider a digital work of art (as a .jpg, for example) that is transacted as an NFT. The underlying work of art – whether seen as bits on a screen or printed out as ink-on-paper – can hold the viewer in aesthetic arrest. But the NFT itself – data within the blockchain – does not so stimulate the human soul. It seems, therefore, that we have eliminated the notion that beauty or some similar concept gives value to an NFT.

There’s something about Mona Lisa

But let’s not yet move away from thinking about characteristics of works of art in seeking to answer our question about attributing value. The ability to hold its viewer in aesthetic arrest is only one way a physical piece of art can have value. Consider the Mona Lisa – the actual physical painting hanging on a wall this very moment at the Louvre in Paris. I could fly to Paris, take the Metro to the Louvre, buy a ticket and make my way to the hall where the Mona Lisa is displayed. I could look at it there and behold its beauty. But I could also behold that beauty by doing a Google Image search on my computer in the basement. Or I could go to Target and buy a Mona Lisa print to hang on my own wall. The fact that I could enjoy the Mona Lisa without going to Paris shows that the ability to induce aesthetic arrest does not come only from the original. One can get the same thrill from seeing a copy. Yet there remains a thrill one can get only by beholding the actual physical painting in Paris. There’s a “something more” arising from seeing the actual materials assembled as they were by da Vinci’s own hand. There is a value in being in the presence of and perceiving the actual corporeal stuff that da Vinci saw and manipulated.
 
There is only one original Mona Lisa. It is irreplaceable. That is, it is non-fungible. The original Mona Lisa in Paris connects us to da Vinci in time and space. When we are in the presence of the original Mona Lisa, we are in the presence of the actual stuff (wood, pigment) that da Vinci handled. That ability to give the viewer an experience of presence – one more than mere aesthetic arrest – contributes substantially to the painting’s value. So it must be, then, that these particular molecules comprising the original Mona Lisa, and one’s being in proximity to them, are what give the original painting a special value? Well, no.

Love at the molecular level

The actual molecules comprising the Mona Lisa – the carbon in the poplar board, the material in the pigment, etc. – are, in their chemical structure and behavior, indistinguishable from all the other molecules of their kind in the universe. The Mona Lisa’s molecules are not special in themselves, but instead are valuable because they were worked in accordance with da Vinci’s intention.

Being intentional

Here we may have reached a good place from which to jump back over to NFTs. We know that intangibleness does not disqualify NFTs from having value. And we know that NFTs do not induce an aesthetic experience. But they do carry some specialness due to their uniqueness. In a certain respect, Jack Dorsey’s NFT of the Very First Tweet carries the same flavor of specialness as the Mona Lisa in the Louvre, even though – unlike the Mona Lisa – the Very First Tweet NFT does not portray beauty. And, unlike the original Mona Lisa, the Very First Tweet NFT, being intangible, does not contain any particular molecules that Dorsey put there (because of course the bits stored that embody the tweet or the NFT are not tied to a particular memory substrate that he dealt with back in March 2006). But what does remain is the fact that the content of the Very First Tweet has now become inextricably (even if only symbolically) linked to the NFT because of Dorsey’s intention.
 
We have now arrived at a point where we can at least preliminarily posit some statements articulating how an NFT gets value: Value attaches to an NFT because of the uniqueness of its digital structure having come into existence as the particular effect of an act of its author’s intention. More simply: One may want an NFT because there is something abstractly intriguing about its creation and existence, even though there is nothing one can touch that corresponds with that intrigue.
 
Evan Brown is a technology and intellectual property attorney in Chicago. Follow him on Twitter at @internetcases. This content originally appeared at evan.law.

Online retailer’s use of photo of products it did not sell was not an unfair or deceptive act

Defendant online guitar retailer used on its website a photo of premium guitar necks – products that the online retailer did not sell. Plaintiff – the purveyor of the premium guitar necks shown in the photo – sued defendant under the New Hampshire consumer protection act, which makes unfair or deceptive acts in trade or commerce unlawful. The case went to trial. The court found in favor of defendant.

The court found three of plaintiff’s key witnesses not credible. Each of them had some sort of personal relationship with the plaintiff that in the court’s view tainted their testimony. One of them testified in a questionable way – he testified remotely via videoconferencing software and was “clearly reading from notes or a script during his direct examination.” And “[r]ather than looking directly into the camera when he answered questions, he consistently fixed his gaze on the left portion of his computer screen each time he began his answer.”

The photo played a minor part in defendant’s website. It was relatively small in comparison to the rest of the material in which it appeared. It took up approximately a third of an online document’s width and was not much bigger than a thumbnail-sized image. The image quality was low – the guitar necks were blurry and it was difficult to tell whether anything was written on them, such as a logo.

The court found that plaintiff failed to prove that consumers would have understood defendant’s use of the photo to assert an affiliation between defendant and plaintiff. In the court’s view, even if plaintiff had proven that the photo asserted such an affiliation, plaintiff failed to prove that defendant acted with the intent required for the applicable statutory violation.

D’Pergo Custom Guitars, Inc. v. Sweetwater Sound, Inc., 2021 WL 3038640 (D.N.H., July 19, 2021)

See also: Alienware goes after “free” computer offer

DMCA anticircumvention case over copied YouTube videos moves forward

Defendant fired plaintiff over two videos advocating for COVID-19 workplace protections that plaintiff posted on YouTube. Around the time of the termination, the employer allegedly used a smartphone to record the videos in question while they were being played on a computer screen. Defendant then allegedly further copied, distributed and performed these videos in connection with legal proceedings involving plaintiff, without plaintiff’s consent.

Plaintiff sued his former employer for copyright infringement. And because YouTube technology provides technological protection measures to prevent unauthorized copying of videos, plaintiff sued under Section 1201 of the Copyright Act – one of the anticircumvention provisions of the Digital Millennium Copyright Act (“DMCA”).

No fair use (yet)

Defendant argued its conduct was fair use of the videos. It asserted it submitted the videos in response to plaintiff’s OSHA complaint and in support of a no-trespass order. But the court refused to make a fair use determination at the motion to dismiss stage, since no facts supporting fair use could be found in the complaint.

DMCA circumvention

The court also allowed plaintiff’s DMCA circumvention claim to move forward.

Section 1201(a)(1)(A) of the DMCA states “[n]o person shall circumvent a technological measure that effectively controls access to a work protected under [the Copyright Act].” “Circumvent,” as used in §1201, “means to descramble a scrambled work, to decrypt an encrypted work, or otherwise to avoid, bypass, remove, deactivate, or impair a technological measure, without the authority of the copyright owner[.]” §1201(a)(3)(A). “[A] technological measure ‘effectively controls access to a work’ if the measure, in the ordinary course of its operation, requires the application of information, or a process or a treatment, with the authority of the copyright owner, to gain access to the work.” §1201(a)(3)(B).

Defendant argued the court should throw out the DMCA circumvention claim because plaintiff did not identify a specific technological measure that defendant allegedly circumvented. The court rejected that argument, however, saying that that much specificity was not required to survive a motion to dismiss.

Defendant also argued the DMCA claim should fail because the statute prohibits “circumvention,” which is different from copying, and the complained-of conduct was simply copying. The court viewed the law differently, however, citing Chamberlain Group, Inc. v. Skylink Technologies, Inc., 381 F.3d 1178 (Fed. Cir. 2004), where the court held that while infringement and circumvention are distinct, an act of infringement can also involve an act of unauthorized circumvention.

The facts of this case may cause one to consider cases such as R. Christopher Goodwin & Assoc., Inc. v. Search, Inc., 2019 WL 5576834 (E.D. La., October 29, 2019) and wonder whether circumvention has really occurred. It does not appear defendant in this case did anything to disable measures that would have prevented it from viewing the videos. Presumably the streamed videos were available to anyone who could visit YouTube. And the act of creating the copies did not even touch on any of the protection measures YouTube put in place. There was no cracking or descrambling – just the capturing of the video as it passed through the air (at that moment being analog) from the computer screen to the camera of the smartphone. Perhaps there is an analog hole defense here?

Edland v. Basin Electric Power Cooperative, 2021 WL 3080225 (D.S.D. July 21, 2021)

No contract formed via URL to terms and conditions in hard copy advertisement

Online terms of service found at URL in hard copy advertisement were not enforceable.

Plaintiff visited a Subway restaurant. One of the Subway employees referred plaintiff to an in-store, hard-copy advertisement. On the advertisement, Subway offered to send special offers to plaintiff if she texted a keyword to a short code. Plaintiff sent the text message to Subway, and Subway began responding, including by sending her, via text message, a hyperlink to an electronic coupon.

Later, plaintiff wanted to stop receiving the messages, so she requested that the messages cease. But they kept arriving. Plaintiff then sued under the Telephone Consumer Protection Act (“TCPA”). Subway moved to compel arbitration, arguing that a contract was formed because the printed in-store advertisement that contained the keyword and short code to text included a reference to and URL for “terms and conditions”. Those terms and conditions required plaintiff to settle the dispute by arbitration.

The lower court denied the motion to compel arbitration. Subway sought review with the Second Circuit Court of Appeals. On appeal, the court affirmed the denial of the motion to compel arbitration, finding that plaintiff was not bound by the terms and conditions.

The appellate court held that plaintiff was not on notice of the terms and conditions, which contained the arbitration clause, because Subway failed to demonstrate that such terms and conditions would be clear and conspicuous to a reasonable person in plaintiff’s position. More specifically, the court held that the following facts showed plaintiff was not on reasonable notice of the terms:

  • Subway failed to provide evidence regarding the size of the advertisement at issue, or the print size contained within that advertisement;
  • the reference to “terms and conditions” was buried on the advertisement in a paragraph that was printed in significantly smaller font relative to the other text on the advertisement, and the reference itself was surrounded by a substantial amount of unrelated information;
  • the advertisement only vaguely referenced “terms and conditions,” and did not state that a consumer would be agreeing to those terms if she sent a text message to Subway’s short code, nor did it otherwise direct the consumer to such terms;
  • access to the terms and conditions on the Subway website required plaintiff to type the URL provided on the hard-copy print advertisement into an internet browser on her cell phone or some other device with internet browsing capabilities; and
  • once linked to the Subway website, the heading stated that it contained “terms of use for this website,” thus potentially suggesting to a reasonable person (searching for conditions of the promotional offer) that the website did not contain any terms or conditions beyond those relevant to the use of the website.

This combination of barriers led the court to conclude that the terms and conditions were not reasonably conspicuous under the totality of the circumstances and, thus, a reasonable person would not realize she was being bound to such terms and conditions by texting Subway in order to begin receiving promotional offers.

Soliman v. Subway Franchisee Advertising Fund Trust, Ltd., — F.3d —, 2021 WL 2324549 (2nd Cir. June 8, 2021)

Related: Court finds clickwrap independent contractor agreement enforceable

What must social media platforms do to comply with Florida’s Senate Bill 7072?

The media has been covering Florida’s new law (Senate Bill 7072) targeting social media platforms, which Governor DeSantis signed today. The law is relatively complex and imposes a number of new obligations on social media platforms. The law is likely to face First Amendment challenges. And then there’s the Section 230 problem. In any event, through all the political noise surrounding the law’s passage, it is worth taking a careful look at what the statute actually says.

Findings

The bill starts with some findings of the legislature that give context to what the law is about. There are some interesting findings that reflect an evolved view of the internet and social media as being critical spaces for information exchange, in the nature of public utilities and common carriers:

  • Social media platforms have transformed into the new public town square.
  • Social media platforms have become as important for conveying public opinion as public utilities are for supporting modern society.
  • Social media platforms hold a unique place in preserving First Amendment protections for all Floridians and should be treated similarly to common carriers.
  • The state has a substantial interest in protecting its residents from inconsistent and unfair actions by social media.

Important definitions

The statute gives some precise and interesting definitions to important terms:

A “social media platform” is any information service, system, Internet search engine, or access software provider that (1) provides or enables computer access by multiple users to a computer server, including an Internet platform or a social media site; (2) operates as a sole proprietorship, partnership, limited liability company, corporation, association, or other legal entity; (3) does business in Florida; and (4) has either annual gross revenues in excess of $100 million or at least 100 million monthly individual platform participants globally.

Interestingly, the statute appears to clarify that Disney World and other major players are not among what constitutes a “social media platform”:

The term does not include any information service, system, Internet search engine, or access software provider operated by a company that owns and operates a theme park or entertainment complex as defined elsewhere in Florida law.

Some other definitions:

To “censor” is for a social media platform to delete, regulate, restrict, edit, alter, inhibit the publication or republication of, suspend a right to post, remove, or post an addendum to any content or material posted by a user. The term also includes actions to inhibit the ability of a user to be viewable by or to interact with another user of the social media platform.

“Deplatforming” means the action or practice by a social media platform to permanently delete or ban a user or to temporarily delete or ban a user from the social media platform for more than 14 days.

A “shadow ban” is action by a social media platform, through any means, whether the action is determined by a natural person or an algorithm, to limit or eliminate the exposure of a user or content or material posted by a user to other users of the social media platform. This term includes acts of shadow banning by a social media platform which are not readily apparent to a user.

“Post-prioritization” means action by a social media platform to place, feature, or prioritize certain content or material ahead of, below, or in a more or less prominent position than others in a newsfeed, a feed, a view, or in search results. The term does not include post-prioritization of content and material of a third party, including other users, based on payments by that third party, to the social media platform. 

Protections for political candidates

The first substantive part of the statute seeks to protect political candidates from being taken offline:

A social media platform may not willfully deplatform a candidate for office who is known by the social media platform to be a candidate, beginning on the date of qualification and ending on the date of the election or the date the candidate ceases to be a candidate. A social media platform must provide each user a method by which the user may be identified as a qualified candidate and which provides sufficient information to allow the social media platform to confirm the user’s qualification by reviewing the website of the Division of Elections or the website of the local supervisor of elections.

If the Florida Election Commission finds that a social media platform violates the above provision, it can fine the platform $250,000 per day for a candidate for statewide office, and $25,000 per day for a candidate for other offices. 

Social media platforms’ required activity

The statute spells out certain things that social media platforms must and must not do. For example, social media platforms:

  • Must publish the standards, including detailed definitions, it uses or has used for determining how to censor, deplatform, and shadow ban.
  • Must apply censorship, deplatforming, and shadow banning standards in a consistent manner among its users on the platform.
  • May not censor or shadow ban a user’s content or material or deplatform a user from the social media platform without notifying the user who posted or attempted to post the content or material (unless the content is obscene). (This notice must be in writing and must be delivered via electronic mail or direct electronic notification to the user within 7 days after the censoring action. It must include a thorough rationale explaining the reason that the social media platform censored the user. It must also include a precise and thorough explanation of how the social media platform became aware of the censored content or material, including a thorough explanation of the algorithms used, if any, to identify or flag the user’s content or material as objectionable.)
  • Must, if a user is deplatformed, allow that user to access or retrieve all of the user’s information, content, material, and data for at least 60 days after the user receives the required notice.
  • Must provide a mechanism that allows a user to request the number of other individual platform participants who were provided or shown the user’s content or posts and provide, upon request, a user with the number of other individual platform participants who were provided or shown content or posts.
  • Must categorize algorithms used for post-prioritization and shadow banning, and must allow a user to opt out of post-prioritization and shadow banning algorithm categories to allow sequential or chronological posts and content.
  • Must provide users with an annual notice on the use of algorithms for post-prioritization and shadow banning and reoffer annually the opt-out opportunity provided in the statute. 
  • May not apply or use post-prioritization or shadow banning algorithms for content and material posted by or about a user who is known by the social media platform to be a political candidate as defined under the law, beginning on the date of qualification and ending on the date of the election or the date the candidate ceases to be a candidate.
  • Must provide each user a method by which the user may be identified as a qualified candidate and which provides sufficient information to allow the social media platform to confirm the user’s qualification by reviewing the website of Florida’s Division of Elections or the website of the local supervisor of elections.
  • May not take any action to censor, deplatform, or shadow ban a journalistic enterprise based on the content of its publication or broadcast (unless the content is obscene as defined under Florida law). 

What happens if there is a violation?

A social media platform that violates the statute could face legal action from the government, or from private citizens who sue under the statute. 

Government action:

If the Florida Department of Legal Affairs, by its own inquiry or as a result of a complaint, suspects that a social media platform’s violation of the statute is imminent, occurring, or has occurred, it may investigate the suspected violation. Based on its investigation, the department may bring a civil or administrative action. The department can send subpoenas to learn about the algorithms related to any alleged violation.

The ability for a private individual to bring an action under the statute is not as broad as the government’s ability to enforce the law. A private individual can only sue if:

  • the social media platform fails to apply censorship, deplatforming, and shadow banning standards in a consistent manner among its users on the platform, or
  • if the social media platform censors or shadow bans a user’s content or material or deplatforms the user from the social media platform without the required notice.

Remedies

The court may award the following remedies to the user who proves a violation of the statute:

  • Up to $100,000 in statutory damages per proven claim.
  • Actual damages.
  • If aggravating factors are present, punitive damages.
  • Other forms of equitable relief, including injunctive relief.
  • If the user was deplatformed in circumstances where the social media platform failed to apply censorship, deplatforming, and shadow banning standards in a consistent manner among its users on the platform, the user can recover its costs and reasonable attorney fees.

When does the law take effect?

July 1, 2021. 

 

Does Section 230 apply to claims under the Fair Credit Reporting Act?

Plaintiffs sued defendants, claiming that defendants violated the Fair Credit Reporting Act (FCRA) by including inaccurate criminal information on background check reports defendants produced and sold. Defendants moved for a judgment on the pleadings (a form of a motion to dismiss), arguing that 47 U.S.C. §230 provided immunity to defendants. Specifically, defendants argued that they were an interactive computer service, and that plaintiffs’ claims treated defendants as the publisher of third party content. The court agreed with defendants and granted defendants’ motion.

Defendants’ website

Defendants operate the website found at publicdata.com. The website allows customers to search through various databases available via the site. Defendants can pull this information into a report. Plaintiffs asserted that defendants purchase, collect, and assemble public record information into reports, which employers then buy from defendants via the website.

The FCRA claims

The FCRA places a number of requirements on “consumer reporting agencies,” and plaintiffs asserted that defendants did not meet these requirements. Each of the three plaintiffs – who wish to represent an entire class of plaintiffs – claim that reports obtained from prospective employers contained inaccurate information about criminal charges against plaintiffs, and resulted in plaintiffs not getting jobs they sought.

Section 230’s exceptions did not apply

The court began by noting that none of Section 230’s exceptions (i.e., situations where immunity does not apply) precluded immunity from applying to an FCRA claim. Congress enumerated five exceptions to immunity, expressly stating that Section 230 cannot have any effect on any “[f]ederal criminal statute,” “intellectual property law,” “[s]tate law that is consistent with this section,” “the Electronic Communications Privacy Act,” or “sex trafficking law.” Applying the canon of statutory construction of expressio unius est exclusio alterius, the court determined that where Congress explicitly enumerates certain exceptions to a general prohibition, additional exceptions are not to be implied, in the absence of evidence of a contrary legislative intent. The court held that since Congress plainly chose five exceptions to Section 230 immunity, and did not include the FCRA among them, by its plain language, Section 230 can apply to FCRA claims.

Immunity applied

Citing the well-known Fourth Circuit case of Nemet Chevrolet, Ltd. v. Consumeraffairs.com, Inc., 591 F.3d 250 (4th Cir. 2009), the court looked to the three requirements to successfully assert Section 230 immunity: (1) the defendant is a provider or user of an interactive computer service; (2) the information at issue was provided by another information content provider; and (3) the plaintiff’s claim treats the defendant as the publisher or speaker of that information. In this case, all three elements were met.

Finding that the defendants’ website was an interactive computer service, the court observed that Section 230 immunity covers information that the defendant does not create as an information content provider, and that such immunity is not lost when the interactive service provider pays a third party for the content at issue, and essentially becomes a “distributor” of the content.

On the second element, the court found that plaintiffs clearly stated that defendants did not create the content, but that they obtained it “from vendors, state agencies, and courthouses.” It was those entities that created the records defendants uploaded to their website and collected into reports.

And in the court’s mind there was no doubt that defendants did not create the content. Plaintiffs admitted in their complaint that the convictions and other information included on defendants’ reports were derived from other information content providers, such as courts and other repositories of this information. Although plaintiffs alleged that defendants manipulated and sorted the content into a background check report, there was no explicit allegation that defendants materially contributed to or created the content themselves.

Henderson v. The Source For Public Data, 2021 WL 2003550 (E.D. Va. May 19, 2021)

Murdered Uber passenger’s mom can keep her case in court and out of arbitration

An Uber driver murdered plaintiff’s son. So plaintiff – the Uber user’s mom – sued Uber for wrongful death. The lower court threw out the case, saying that the Uber terms and conditions required the matter to go to arbitration. Plaintiff sought review with the Georgia Court of Appeals. On review, the court reversed and sent the case back to the lower court.

The appellate court found that it was improper to dismiss the case because it was not clear that plaintiff’s son – the one killed by the Uber driver – actually agreed to the Uber terms and conditions that contained the provision requiring arbitration.

First, there was a dispute as to whether he even saw the link to the terms and conditions when he signed up for Uber in 2016. That’s because he was using an Android phone, and plaintiff alleged the on-screen keyboard within the app may have covered up the link to the terms and conditions.

Second, the court noted that even though Uber submitted evidence that it emailed updated terms and conditions to plaintiff’s son, and that he continued using Uber thereafter (which would have bound him to the terms), it was unclear whether the email was ever sent to plaintiff’s son. If he never saw those terms, they would not apply, and therefore arbitration would not be proper.

Thornton v. Uber Technologies, Inc., 2021 WL 1960199 (Ct. App. Ga. May 17, 2021)

Section 230 did not protect Snapchat from negligence liability in car crash lawsuit over Speed Filter

The tragic facts

Landen Brown was using Snapchat’s Speed Filter in 2017 when the car in which he was riding with two other young men crashed after reaching speeds above 120 miles per hour. The Speed Filter documented how fast the car was traveling. The crash killed Landen and the other two occupants.

Section 230

The parents of two of the passengers sued Snap, Inc. (the purveyor of Snapchat), claiming that the app was negligently designed. The parents alleged, among other things, that Snap should have known that users believed they would be rewarded within the app for using the filter to record a speed above 100 miles per hour. The negligence claim was based in part on the notion that Snap did not remove or restrict access to Snapchat while traveling at dangerous speeds.

Immune, then not

The lower court dismissed the case, holding that 47 U.S.C. §230 protected Snapchat from liability. Plaintiffs sought review with the Ninth Circuit. On appeal, the court reversed, finding that Section 230 immunity did not apply to the negligent design claim.

Section 230’s role in the case

Section 230 provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” In this case, the court held that the parents’ complaint did not seek to hold Snap liable for its conduct as a publisher or speaker. The negligent design lawsuit treated Snap as a products manufacturer, accusing it of negligently designing a product (Snapchat) with a defect (the interplay between Snapchat’s reward system and the Speed Filter). Thus, the duty that Snap allegedly violated sprang from its distinct capacity as a product designer. Simply stated, in the court’s view, Snap’s alleged duty in this case had nothing to do with its editing, monitoring, or removing of the content that its users generate through Snapchat.

Lemmon v. Snap, Inc. — F.3d —, 2021 WL 1743576 (9th Cir. May 4, 2021)
