Biometric privacy statute does not violate First Amendment

Biometric identifiers extracted from a photo are not public in the same way the photo itself is

Plaintiffs filed a class action lawsuit against a facial recognition technology company and related individual defendants, asserting violations of the Illinois Biometric Information Privacy Act (“BIPA”). Plaintiffs alleged that defendants covertly scraped over three billion photographs of faces from the internet and then used artificial intelligence algorithms to scan the face geometry of each individual depicted to harvest the individuals’ unique biometric identifiers and corresponding biometric information. One of the defendants then created a searchable database containing this biometric information and data that enabled users of its proprietary platform to identify unknown individuals by uploading a photograph to the database. Accordingly, plaintiffs alleged that defendants collected, captured, or otherwise obtained their biometric data without notice and consent, and thereafter, sold or otherwise profited from their biometric information, all in violation of BIPA.

Unconstitutional restriction on public information?

Defendants moved to dismiss the BIPA claim on a number of grounds, including an argument that BIPA violated defendants’ First Amendment rights. More specifically, defendants maintained that the capture and analysis of faceprints from public images was protected speech, and thus, BIPA was unconstitutional because it inhibited the ability to collect and analyze public information. Plaintiffs, however, asserted that the capturing of faceprints and the action of extracting private biometric identifiers from the faceprints was unprotected conduct. The court sided with plaintiffs and rejected defendants’ argument.

The court held that defendants’ argument oversimplified plaintiffs’ allegations. Although defendants captured public photographs from the internet, they then harvested an individual’s unique biometric identifiers and information – which are not public information – without the individual’s consent. Put differently, plaintiffs asserted that defendants’ business model was based not merely on collecting public photographs from the internet, developing some source code, and republishing information via a search engine, but on the additional conduct of harvesting nonpublic, personal biometric data. And, as plaintiffs further alleged, unlike fingerprints, facial biometrics are readily observable and present a grave and immediate danger to privacy, individual autonomy, and liberty.

An intermediate approach to biometric privacy

Accordingly, the court viewed defendants’ conduct as involving both speech and nonspeech elements. Looking to the test set out in the Supreme Court case of United States v. O’Brien, 391 U.S. 367 (1968), the court observed that when such “elements are combined in the same course of conduct, a sufficiently important governmental interest in regulating the nonspeech element can justify incidental limitations on First Amendment freedoms.” The court applied the intermediate scrutiny standard set out in O’Brien, under which a regulation does not violate the First Amendment if (1) it is within the power of the government to enact, (2) it furthers an important governmental interest, (3) the governmental interest is unrelated to the suppression of free expression, and (4) any incidental restriction on speech is no greater than is necessary to further that interest.

The first element was easy to dispense with because the parties did not argue that the Illinois General Assembly lacked the power to enact BIPA. On the second element, the court found that the General Assembly enacted BIPA to protect Illinois residents’ highly sensitive biometric information from unauthorized collection and disclosure. Regarding the third element, the court noted that BIPA, including its exceptions, does not restrict a particular viewpoint, nor does it target public discussion of an entire topic. And on the fourth O’Brien element, the court found BIPA to be narrowly tailored by legitimately protecting Illinois residents’ highly sensitive biometric information and data, yet allowing residents to share their biometric information through its consent provision. And BIPA is not overbroad, in the court’s view, because it does not prohibit a substantial amount of protected speech.

In re Clearview AI, Inc., Consumer Privacy Litigation, 2022 WL 444135 (N.D. Illinois, February 14, 2022)

Is Indiana’s revenge porn law constitutional?

Stained glass window at Pokagon State Park in Angola, Indiana, near where the underlying events in this case took place.

In 2019, Indiana joined a number of other states and enacted a statute that makes it a crime for a person to distribute an “intimate image” when he or she knows or reasonably should know that an individual depicted in the image does not consent to the distribution. In March 2020, defendant sent a video of himself receiving oral sex to his ex-girlfriend via Snapchat. After being charged under the statute, defendant moved to dismiss, arguing in part that the statute violates both the Indiana and U.S. constitutions. The trial court agreed and dismissed the case. But the state appealed to the Indiana Supreme Court.

What part of the Indiana constitution applied?

The court’s analysis under the Indiana constitution is particularly interesting. Indiana’s constitutional protection in this area reads quite differently from the language of the First Amendment.

Article 1, Section 9 of the Indiana constitution reads as follows:

No law shall be passed, restraining the free interchange of thought and opinion, or restricting the right to speak, write, or print, freely, on any subject whatever: but for the abuse of that right, every person shall be responsible.

The court first had to evaluate whether videos – and in particular the video at issue – were covered by the applicable Indiana constitutional provision. “Our encounters with Article 1, Section 9 have always involved words, thus invoking the ‘right to speak’ clause.” The court held that the video content was protected under the “free interchange” clause of the state’s constitution. “We understand the free interchange clause to encompass the communication of any thought or opinion, on any topic, through ‘every conceivable mode of expression.’” And the court quickly ascertained that being prosecuted for the distribution of the video was a “direct and substantial burden” on defendant’s right to self-expression.

Abuse of rights?

But defendant’s expressive activity in this case – though within his right to free interchange as expressed in the constitution – was an abuse of that right. Looking through the lens of the natural rights philosophy that informed the drafting of the Indiana constitution, the court cited to previous authority (Whittington v. State, 669 N.E.2d 1363 (Ind. 1996)) that explained how “individuals possess ‘inalienable’ freedom to do as they will, but they have collectively delegated to government a quantum of that freedom in order to advance everyone’s ‘peace, safety, and well-being.'” Thus, the court observed that the purpose of state power is “to foster an atmosphere in which individuals can fully enjoy that measure of freedom they have not delegated to government.”

Citing to State v. Gerhardt, 145 Ind. 439 (Ind. 1896), the court evaluated how “[t]he State may exercise its police power to promote the health, safety, comfort, morals, and welfare of the public.” And citing to other authority, the court noted that “courts defer to legislative decisions about when to exercise the police power and typically require only that they be rational.” So the question became whether – approached from the standpoint of rationality – the statute’s restriction on the right to self-expression was appropriate to promote the health, safety, comfort, morals and welfare of the public.

Rationality favored public protection

“Under our rationality inquiry, we have no trouble concluding the impingement created by the statute is vastly outweighed by the public health, welfare, and safety served.” In reaching this conclusion, the court examined, among other things, the tremendous harms of revenge porn – including its connection to domestic violence and psychological injury. Accordingly, the court found the statute did not violate the Indiana constitution.

The court also found that the statute did not violate the First Amendment of the U.S. Constitution. It held that the statute is content-based and therefore subject to strict scrutiny. Even under this standard, the court found that the statute served a compelling government interest and was narrowly tailored to achieve that interest.

State v. Katz, 2022 WL 152487 (Ind., January 18, 2022)

Can a party recover statutory damages under the Stored Communications Act without proving actual damages?

The Stored Communications Act (18 USC 2701 et seq.) is among the most powerful tools relating to email privacy. It is a federal statute that prohibits, in certain circumstances, one from intentionally accessing without authorization, or exceeding authorized access to, a facility through which an electronic communication service is provided. The statute provides criminal penalties and an aggrieved party can bring a civil suit for damages in certain cases.

The statute contains a provision that addresses the amount of money damages a successful plaintiff can recover. Section 2707(c) provides the following:

Damages. The court may assess as damages in a civil action under this section the sum of the actual damages suffered by the plaintiff and any profits made by the violator as a result of the violation, but in no case shall a person entitled to recover receive less than the sum of $1,000. If the violation is willful or intentional, the court may assess punitive damages. In the case of a successful action to enforce liability under this section, the court may assess the costs of the action, together with reasonable attorney fees determined by the court.

Note the phrase “but in no case shall a person entitled to recover receive less than the sum of $1,000.” Does that mean every plaintiff who successfully proves the defendant’s liability is entitled to at least $1,000, regardless of whether any actual damage occurred? The Fifth Circuit Court of Appeals recently addressed that question in the case of Domain Protection, L.L.C. v. Sea Wasp, L.L.C. It held that one must show at least some actual damages before being entitled to the minimum of $1,000.

The court looked to the Supreme Court’s approach in addressing nearly identical language in another statute, where the Court concluded that “person entitled to recover” refers back to the party that suffers “actual damages.” Doe v. Chao, 540 U.S. 614, 620 (2004). And it noted that two other circuits have held that this reasoning applies to the same terms in the Stored Communications Act: Vista Mktg., LLC v. Burkett, 812 F.3d 954, 964–75 (11th Cir. 2016), and Van Alstyne v. Elec. Scriptorium, Ltd., 560 F.3d 199, 204–208 (4th Cir. 2009). The court “endorse[d] the reasoning of those opinions and [saw] no need to repeat it.”

Domain Protection, L.L.C. v. Sea Wasp, L.L.C., — F.4th —, 2022 WL 123408 (5th Cir. January 13, 2022)

How Portland has not demonstrated long-term commitment to a ban on facial recognition technologies


Portland, Oregon yesterday passed a ban on facial recognition technology. Officials cited two primary reasons for the ban. First, current facial recognition technologies less accurately identify people who are not young, white and/or male. Second, everyone should have some sense of anonymity and privacy when in public places.

Should the facial recognition ban focus on disparate impact?

Do Portland’s efforts to “improve people’s lives, with a specific focus on communities of color and communities with disabilities” demonstrate an effective long-term commitment to keeping invasive facial recognition technology at bay? Such a focus implies that when facial recognition technologies get better and less biased, they should then be deployed full scale, because then everyone will be harmed equally.

That’s one of the problems with looking to ban a technology based on its nascent state and accompanying imperfect implementation. Given the choice between arguing (1) that a technology is being harmfully implemented now, and (2) that the technology, no matter how perfect it is, infringes some fundamental human right, I’d go with number (2) every time.

We will find ourselves halfway down the slippery slope

We know the accuracy of this technology will increase with the development of better cameras, smarter algorithms and more data. When that happens, if you are still seeking to argue against its harmful effects on fundamental rights such as anonymity and privacy, you will already have slid halfway down the slope. With your previous “best” argument made moot, your argument now – an appeal to fundamental rights – will have less impact.

So maybe we should focus on the real issue – the fundamental right of anonymity and privacy for everyone – rather than leading with a social justice argument. Otherwise, once that primary argument has become moot, the rationale will be a liability.

About the author

Evan Brown is a technology and intellectual property attorney in Chicago. Follow him on Twitter at @internetcases. Phone: (630) 362-7237. Email: ebrown@internetcases.com.

Should revenge porn victims be allowed to proceed anonymously in court?

Plaintiff and her twin sister sued her ex-boyfriend and an unknown John Doe, accusing them of copyright infringement and other torts such as invasion of privacy. They claimed that defendants posted intimate and nude photos of plaintiffs online without their consent, and that defendants had posted one of the plaintiffs’ names and other information on social media in connection with the photos.

Arguing that they had a substantial privacy right that outweighed the customary and constitutionally-embedded presumption of openness in judicial proceedings, plaintiffs asked the court for permission to proceed anonymously. But the court denied the motion.

Plaintiffs’ privacy arguments

Plaintiffs had primarily argued that proceeding under their real names would require them to disclose information of the utmost intimacy, and that if they were required to attach their names to the litigation, there would be a public record connecting their names to the harm and exploitation they had suffered, which could result in even more people viewing the very images that were stolen and disseminated without their consent.

Court: the harm had already been done

The court rejected these arguments. It observed that the photographs had been published on the internet for approximately seven years and had been sent to people plaintiffs knew. Plaintiffs admitted that one of them could be identified in some of the photographs because her face and a distinctive tattoo were visible. And John Doe had already published that plaintiff’s contact information, which resulted in her being inundated with phone calls, text messages, emails, and Instagram, Facebook, and Twitter messages.

So in the court’s mind, it appeared that that plaintiff’s identity was already known or discoverable. In addition, that plaintiff had obtained copyright registrations for many of the photographs, and those registrations were public documents that clearly identified her by name.

As for the twin sister, although her identity had not been similarly made public, the court found that “no great stretch [was] required to identify her through public records as [the other plaintiff’s] twin sister.”

Consequently, the court was not persuaded that plaintiffs’ privacy interests outweighed the public’s right of access in judicial proceedings.

M.C. v. Geiger, 2018 WL 6503582 (M.D.Fla. Dec. 11, 2018)

Facebook did not violate HIPAA by using data showing users browsed healthcare-related websites

Plaintiffs sued Facebook and other entities, including the American Cancer Society, alleging that Facebook violated numerous federal and state laws by collecting and using plaintiffs’ browsing data from various healthcare-related websites. The district court dismissed the action and plaintiffs sought review with the Ninth Circuit. On appeal, the court affirmed the dismissal.

The appellate court held that the district court properly determined that plaintiffs consented to Facebook’s data tracking and collection practices.

Plaintiffs consented to Facebook’s terms

It noted that in determining consent, courts consider whether the circumstances, considered as a whole, demonstrate that a reasonable person understood that an action would be carried out so that their acquiescence demonstrates knowing authorization.

In this case, plaintiffs did not dispute that their acceptance of Facebook’s Terms and Policies constituted a valid contract. Those Terms and Policies contained numerous disclosures related to information collection on third-party websites, including:

  • “We collect information when you visit or use third-party websites and apps that use our Services …. This includes information about the websites and apps you visit, your use of our Services on those websites and apps, as well as information the developer or publisher of the app or website provides to you or us,” and
  • “[W]e use all of the information we have about you to show you relevant ads.”

The court found that a reasonable person viewing those disclosures would understand that Facebook maintained the practices of (a) collecting its users’ data from third-party sites and (b) later using the data for advertising purposes. This was “knowing authorization”.

“But it’s health-related data”

The court rejected plaintiffs’ claim that – though they gave general consent to Facebook’s data tracking and collection practices – they did not consent to the collection of health-related data due to its “qualitatively different” and “sensitive” nature.

The court did not agree that the collected data was so different or sensitive. The data showed only that plaintiffs searched and viewed publicly available health information that could not, in and of itself, reveal details of an individual’s health status or medical history.

This notion supported the court’s conclusion that the use of the information did not violate the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”) or its California counterpart.

The court held that information available on publicly accessible websites stands in stark contrast to the personally identifiable patient records and medical histories protected by HIPAA and other statutes — information that unequivocally provides a window into an individual’s personal medical history.

Smith v. Facebook, Inc., 2018 WL 6432974 (9th Cir. Dec. 6, 2018)

Court labels copyright plaintiff as a troll and shuts down efforts to ID anonymous infringer

When a copyright plaintiff does not know who a particular alleged infringer is, it must first send a subpoena to the ISP that controls the IP address used to commit the alleged infringement. But the rules of procedure do not allow a party to send subpoenas until after the Rule 26(f) conference – a meeting between the plaintiff and defendant (or their lawyers) to discuss the case. A plaintiff cannot have a 26(f) conference if the defendant has not been served with the complaint, and the complaint cannot be served unless the defendant’s identity is known.

So you can see the conundrum. To break out of this not-knowing, plaintiffs in situations like this will ask the court’s help through a motion for leave to take early discovery. That way the plaintiff can learn who the defendant is, serve the complaint, and move the case forward.

In the recent case of Strike 3 Holdings v. Doe, Judge Royce Lamberth of the U.S. District Court for the District of Columbia put a stop to the efforts of a plaintiff that it called a copyright troll right to its face (or at least right in the text of the opinion). The court denied Strike 3’s motion for leave to take early discovery to learn the identity of an unknown BitTorrent user accused of downloading pornography.

The court held that the plaintiff’s request was not specific enough, and the privacy interests of the unknown defendant, together with the social harm of being wrongfully accused of obtaining “particularly prurient pornography” were not outweighed by the trollish plaintiff’s need for the information.

Key to the court’s ruling was the idea that a subpoena in circumstances like this must be able to actually identify a defendant who could be sued. The court noted, however, that

Strike 3 could not withstand a 12(b)(6) motion in this case without resorting to far more intensive discovery machinations sufficiently establishing defendant did the infringing—examining physical evidence (at least the computers, smartphones, and tablets of anyone in the owner’s house, as well as any neighbor or houseguest who shared the Internet), and perhaps even interrogatories, document requests, or depositions. Strike 3’s requested subpoena thus will not—and may never—identify a defendant who could be sued.

The opinion is an entertaining read and conveys the judge’s clear frustration with copyright troll plaintiffs. Below are some of the more memorable quips.

Regarding the flaws of using IP addresses to identify people:

[Plaintiff’s] method [of identifying infringers] is famously flawed: virtual private networks and onion routing spoof IP addresses (for good and ill); routers and other devices are unsecured; malware cracks passwords and opens backdoors; multiple people (family, roommates, guests, neighbors, etc.) share the same IP address; a geolocation service might randomly assign addresses to some general location if it cannot more specifically identify another.

Regarding the public shame of being accused of infringing porn:

… But in many cases, the method is enough to force the Internet service provider (ISP) to unmask the IP address’s subscriber. And once the ISP outs the subscriber, permitting them to be served as the defendant, any future Google search of their name will turn up associations with the websites Vixen, Blacked, Tushy, and Blacked Raw. The first two are awkward enough, but the latter two cater to even more singular tastes.

How trolls are quick to flee:

Indeed, the copyright troll’s success rate comes not from the Copyright Act, but from the law of large numbers. … These serial litigants drop cases at the first sign of resistance, preying on low-hanging fruit and staying one step ahead of any coordinated defense. They don’t seem to care about whether defendant actually did the infringing, or about developing the law. If a Billy Goat Gruff moves to confront a copyright troll in court, the troll cuts and runs back under its bridge. Perhaps the trolls fear a court disrupting their rinse-wash-and-repeat approach: file a deluge of complaints; ask the court to compel disclosure of the account holders; settle as many claims as possible; abandon the rest.

It’s pretty much extortion:

Armed with hundreds of cut-and-pasted complaints and boilerplate discovery motions, Strike 3 floods this courthouse (and others around the country) with lawsuits smacking of extortion. It treats this Court not as a citadel of justice, but as an ATM. Its feigned desire for legal process masks what it really seeks: for the Court to oversee a high-tech shakedown. This Court declines.

The court’s decision to deny discovery is anything but the rubber stamp approach so many judges in these kinds of cases over the past several years have been accused of employing.

Strike 3 Holdings v. Doe, 2018 WL 6027046 (D.D.C. November 16, 2018)

Court takes into consideration defendant’s privacy in BitTorrent copyright infringement case

Frequent copyright plaintiff Strike 3 Holdings filed a motion with the U.S. District Court for the District of Minnesota seeking an order allowing Strike 3 to send a subpoena to Comcast, to learn who owns the account used to allegedly infringe copyright. The Federal Rules of Civil Procedure created a bootstrapping problem or, as the court called it, a Catch-22, for Strike 3 – it was not able to confer with the unknown Doe defendant as required by Rule 26(f) because it could not identify the defendant, but it could not identify defendant without discovery from Comcast.

The court granted Strike 3’s motion for leave to take early discovery, finding that good cause existed for granting the request, and noting:

  • Strike 3 had stated an actionable claim for copyright infringement,
  • The discovery request was specific,
  • There were no alternative means to ascertain defendant’s name and address,
  • Strike 3 had to know defendant’s name and address in order to serve the summons and complaint, and
  • Defendant’s expectation of privacy in his or her name and address was outweighed by Strike 3’s right to use the judicial process to pursue a plausible claim of copyright infringement.

On the last point, the court observed that the privacy interest was outweighed especially given that the court could craft a limited protective order under Federal Rule of Civil Procedure 26(c) to protect an innocent ISP subscriber and to account for the sensitive and personal nature of the subject matter of the lawsuit.

Strike 3 Holdings, LLC v. Doe, 2018 WL 2278110 (D.Minn. May 18, 2018)

No privacy violation for disclosing information otherwise available on member-only website

Plaintiff sued several defendants related to her past work as a government employee. She sought to amend her pleadings to add claims for violation of the Fourth Amendment and the federal Stored Communications Act. She claimed that defendants wrongfully disclosed private medical information about her. The court denied her motion to amend the pleadings to add the Fourth Amendment and Stored Communications Act claims because such amendments would have been futile.

Specifically, the court found there to be no violation because she had no reasonable expectation of privacy in the information allegedly disclosed. She had made that information available on a website. Though viewing the information required signing up for an account, plaintiff had not set up the website to make the information available only to those she invited to view it. The court relied on several cases from earlier in the decade addressing the privacy of social media content, among them Rosario v. Clark Cty. Sch. Dist., 2013 WL 3679375 (D. Nev. July 3, 2013), which held that one has no reasonable expectation of privacy in his or her tweets. In that case, the court held that even though the social media user maintained a private account, his tweets still amounted to the dissemination of information to the public.

Burke v. New Mexico, 2018 WL 2134030 (D.N.M. May 9, 2018)

Puzzling privacy analysis in decision to unmask anonymous accused copyright infringers

Plaintiff porn company sued an unknown BitTorrent user (identified as John Doe), alleging that defendant had downloaded and distributed more than 20 of plaintiff’s films. Plaintiff asked the court for leave to serve a subpoena on Optimum Online – the ISP associated with defendant’s IP address – prior to the Rule 26(f) conference. (As we have recently discussed, leave of court is required to start discovery before the Rule 26(f) conference, but a plaintiff cannot have that conference unless it knows who the defendant is.) Plaintiff already knew defendant’s IP address. It needed to serve the subpoena on the ISP to learn defendant’s real name and physical address so it could serve him with the complaint.

The court went through a well-established test to determine that good cause existed for allowing the expedited discovery. Drawing heavily on the case of Sony Music Entm’t, Inc. v. Does 1-40, 326 F. Supp. 2d 556 (S.D.N.Y. 2004), the court evaluated:

  • (1) the concreteness of the plaintiff’s showing of a prima facie claim of copyright infringement,
  • (2) the specificity of the discovery request,
  • (3) the absence of alternative means to obtain the subpoenaed information,
  • (4) the need for the subpoenaed information to advance the claim, and
  • (5) the objecting party’s expectation of privacy.

The court’s conclusions were not surprising on any of these elements. But its discussion of the fifth point, namely, the defendant’s expectation of privacy, was puzzling, and the court may have missed an important point.

It looked to the recent case involving Dread Pirate Roberts and Silk Road, namely, United States v. Ulbricht, 858 F.3d 71 (2d Cir. 2017). Leaning on Ulbricht, the court concluded that defendant had no reasonable expectation of privacy in the sought-after information (name and physical address) because there is no expectation of privacy in “subscriber information provided to an internet provider,” such as an IP address, and such information has been “voluntarily conveyed to third parties.”

While the court did not misquote Ulbricht, one is left to wonder why it would use that case to support discovery of the unknown subscriber’s name and physical address. At issue in Ulbricht was whether the government violated Dread Pirate Roberts’s Fourth Amendment rights when it obtained the IP address he was using. In this case, however, the plaintiff already knew the IP address from its forensic investigations. The sought-after information here was the name and physical address, not the IP address he used.

So looking to Ulbricht to say that the Doe defendant had no expectation of privacy in his IP address does nothing to shed light on the expectation of privacy, if any, he should have had in his real name and physical address.

The court’s decision ultimately is not incorrect, but it did not need to consult Ulbricht. As in the Sony Music case from which it drew the five-part analysis, and in many other similar expedited discovery cases, the court could have simply found there was no reasonable expectation of privacy in the sought-after information, because the ISP’s terms of service put the subscriber on notice that the ISP would turn over the information to third parties in certain circumstances like those arising in this case.

Strike 3 Holdings, LLC v. Doe, 2017 WL 5001474 (D.Conn., November 1, 2017)
