Beauty and the Biometrics: Federal court in Illinois tosses biometric data case brought against cosmetics giant


A federal judge recently dismissed a class action lawsuit against The Estée Lauder Companies and one of its affiliates, in which plaintiffs alleged that the companies violated the Illinois Biometric Information Privacy Act (BIPA).

Background of the Case

Plaintiffs, representing a proposed class, accused defendants of three distinct violations of BIPA. The dispute centered on a virtual try-on tool that one of the defendants had licensed to Estée Lauder, which enabled customers to virtually test cosmetic products on brand websites. Plaintiffs claimed that they were not adequately informed about the capture and use of their biometric data, including facial mapping and facial geometry, and argued that defendants failed to provide clear consent and privacy policies regarding biometric data.

What BIPA Says

The law governs private entities’ collection, use, and storage of biometric identifiers and information. Plaintiffs contended that defendants did not comply with these requirements, specifically by failing to obtain written consent and to establish proper retention and destruction policies for biometric data.

What the Court Said

The court’s decision to dismiss the case hinged on plaintiffs’ failure to plausibly allege that defendants used the biometric data in a manner that could identify individuals. The court referenced similar cases in which claims were dismissed because there were no plausible allegations connecting the collection of biometric data with the capability to identify individuals.

Here, the court found that plaintiffs did not provide sufficient factual allegations to establish that defendants could identify individuals using the facial scans, comparing other cases in which claims were dismissed or allowed to proceed based on the presence or absence of such allegations. The case was dismissed without prejudice, and plaintiffs were given the opportunity to file an amended complaint by a specified date.

What It Means

This decision highlights the importance of clear legal standards for the use of biometric data and the challenges plaintiffs face in pleading violations of BIPA. It also underscores the need for companies to be transparent and compliant with privacy laws when implementing innovative technologies.

Castelaz v. The Estee Lauder Companies, Inc. et al., 2024 WL 136872 (N.D. Illinois, January 10, 2024)

See also:

Biometric privacy statute does not violate First Amendment

Biometric identifiers extracted from a photo are not public in the same way the photo itself is


Plaintiffs filed a class action lawsuit against a facial recognition technology company and related individual defendants, asserting violations of the Illinois Biometric Information Privacy Act (“BIPA”). Plaintiffs alleged that defendants covertly scraped over three billion photographs of faces from the internet and then used artificial intelligence algorithms to scan the face geometry of each individual depicted to harvest the individuals’ unique biometric identifiers and corresponding biometric information. One of the defendants then created a searchable database containing this biometric information and data that enabled users of its proprietary platform to identify unknown individuals by uploading a photograph to the database. Accordingly, plaintiffs alleged that defendants collected, captured, or otherwise obtained their biometric data without notice and consent, and thereafter, sold or otherwise profited from their biometric information, all in violation of BIPA.

Unconstitutional restriction on public information?

Defendants moved to dismiss the BIPA claim on a number of grounds, including an argument that BIPA violated their First Amendment rights. More specifically, defendants maintained that the capture and analysis of faceprints from public images was protected speech, and thus BIPA was unconstitutional because it inhibited the ability to collect and analyze public information. Plaintiffs, however, asserted that capturing faceprints and extracting private biometric identifiers from those faceprints were unprotected conduct. The court sided with plaintiffs and rejected defendants’ argument.

The court held that defendants’ argument oversimplified plaintiffs’ allegations. Although defendants captured public photographs from the internet, they then harvested individuals’ unique biometric identifiers and information, which are not public information, without the individuals’ consent. Put differently, plaintiffs asserted that defendants’ business model was not based merely on the collection of public photographs from the internet, some source code, and the republishing of information via a search engine, but on the additional conduct of harvesting nonpublic, personal biometric data. And, as plaintiffs further alleged, unlike fingerprints, facial biometrics are readily observable and present a grave and immediate danger to privacy, individual autonomy, and liberty.

An intermediate approach to biometric privacy

Accordingly, the court viewed defendants’ conduct as involving both speech and nonspeech elements. Looking to the test set out in the Supreme Court case of United States v. O’Brien, 391 U.S. 367 (1968), the court observed that when such “elements are combined in the same course of conduct, a sufficiently important governmental interest in regulating the nonspeech element can justify incidental limitations on First Amendment freedoms.” The court applied the intermediate scrutiny standard set out in O’Brien, under which a regulation does not violate the First Amendment if (1) it is within the power of the government to enact, (2) it furthers an important government interest, (3) the governmental interest is unrelated to the suppression of free expression, and (4) any incidental restriction on speech is no greater than is necessary to further the government interest.

The first element was easy to dispense with because the parties did not dispute that the Illinois General Assembly had the power to enact BIPA. On the second element, the court found that the General Assembly enacted BIPA to protect Illinois residents’ highly sensitive biometric information from unauthorized collection and disclosure. Regarding the third element, the court noted that BIPA, including its exceptions, does not restrict a particular viewpoint, nor does it target public discussion of an entire topic. And on the fourth O’Brien element, the court found BIPA to be narrowly tailored: it legitimately protects Illinois residents’ highly sensitive biometric information and data, yet allows residents to share that information through its consent provisions. And BIPA is not overbroad, in the court’s view, because it does not prohibit a substantial amount of protected speech.

In re Clearview AI, Inc., Consumer Privacy Litigation, 2022 WL 444135 (N.D. Illinois, February 14, 2022)
