Clearview AI scraped 60 billion facial images from the internet, built a facial recognition database, and sold access to law enforcement. The resulting class action settled for $51.75 million, with class members receiving a 23% equity stake in the company because Clearview could not afford to pay cash. That case involved mass surveillance at an extraordinary scale. But the statute it was brought under, the Illinois Biometric Information Privacy Act (BIPA), applies to any company that collects, stores, or uses biometric identifiers, including the SaaS startup whose product processes a single fingerprint scan or voiceprint.
Every other case in this series involves data that can be changed. You can get a new credit card number. You can change your address. You can update your password. Biometric identifiers cannot be changed. Your fingerprint, your facial geometry, your voiceprint, and your iris pattern are permanent. If they are compromised, they are compromised forever. That is the premise underlying BIPA, and it is the reason biometric data carries more legal risk per violation than any other category of personal information a SaaS product might process.
What BIPA Requires
BIPA was enacted in Illinois in 2008, making it one of the earliest biometric privacy statutes in the United States. It regulates the collection, storage, use, and dissemination of biometric identifiers and biometric information. Biometric identifiers under BIPA include fingerprints, voiceprints, facial geometry, iris scans, and retina scans.
Before collecting a biometric identifier, a company must provide the individual with a written notice describing the specific purpose and duration of the collection. The company must obtain a written release from the individual. The company must publish a publicly available written policy establishing a retention schedule and guidelines for permanent destruction of the data.
BIPA also prohibits selling, leasing, trading, or otherwise profiting from biometric identifiers. And it prohibits disclosing biometric identifiers to third parties without the individual’s consent.
The enforcement mechanism is what makes BIPA uniquely dangerous. Unlike most privacy statutes, BIPA includes a private right of action. Any person aggrieved by a violation can sue. Statutory damages are $1,000 per negligent violation and $5,000 per intentional or reckless violation. No proof of actual harm is required. The Illinois Supreme Court confirmed in Rosenbach v. Six Flags (2019) that a violation of BIPA’s notice and consent requirements is itself a concrete injury sufficient to confer standing.
In 2024, the Illinois legislature passed SB 2979 amending BIPA to limit damages to one violation per person per method of collection, regardless of how many times the biometric data was captured. This was a response to the Illinois Supreme Court’s decision in Cothron v. White Castle, which had allowed separate damages for each instance of collection. The amendment reduced exposure for defendants with repeated scans (like time clock systems), but the per-violation statutory damages for the initial collection remain intact.
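The difference the amendment makes is easiest to see as arithmetic. The sketch below uses a hypothetical fingerprint time clock scenario (the employee count, scan count, and negligence assumption are all illustrative, not drawn from any real case):

```python
# Hypothetical exposure estimate for an employer whose fingerprint time
# clock collected scans without BIPA-compliant notice and consent.
# Numbers illustrate the statutory damages mechanics only.

EMPLOYEES = 200
SCANS_PER_EMPLOYEE = 500      # e.g. roughly two scans a day for a year
NEGLIGENT_DAMAGES = 1_000     # per negligent violation under BIPA

# Under Cothron v. White Castle (2023), each scan could count as a
# separate violation:
pre_amendment = EMPLOYEES * SCANS_PER_EMPLOYEE * NEGLIGENT_DAMAGES

# Under the 2024 SB 2979 amendment, damages accrue once per person per
# method of collection, regardless of how many scans occurred:
post_amendment = EMPLOYEES * 1 * NEGLIGENT_DAMAGES

print(f"Pre-amendment exposure:  ${pre_amendment:,}")   # $100,000,000
print(f"Post-amendment exposure: ${post_amendment:,}")  # $200,000
```

Even post-amendment, a modest employee roster produces six-figure statutory exposure before any proof of actual harm.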
The Clearview Settlement
Clearview AI built a database of more than 60 billion facial images by scraping publicly available photographs from social media platforms, news websites, and other online sources. Its facial recognition software enabled law enforcement agencies and private companies to upload a photo and receive potential identity matches. At no point did Clearview obtain consent from the individuals whose images it collected.
In 2020, following a New York Times investigation that brought Clearview’s practices to public attention, consumers filed multiple lawsuits. Eleven actions were consolidated into a multidistrict litigation in the Northern District of Illinois. The complaint alleged violations of BIPA and privacy statutes from California, New York, and Virginia, with seven of the sixteen causes of action arising under BIPA.
In June 2024, the parties reached a settlement. On March 20, 2025, Judge Sharon Johnson Coleman granted final approval. The terms are unusual. Clearview lacked the liquid assets to fund a traditional cash settlement. Instead, class members received a collective 23% equity stake in the company, valued at approximately $51.75 million based on a company valuation of $225 million. Class members also have the right to claim 17% of Clearview’s revenues over the following two years, or the class can sell its equity stake. Approximately $20 million of the settlement value is allocated to attorneys’ fees.
The settlement drew significant criticism. Twenty-two state attorneys general and the District of Columbia opposed the deal. Clearview is not required to delete the biometric data it collected, stop scraping new data, or allow individuals to opt out of future collection (beyond an existing opt-out mechanism established in a separate 2022 settlement with the ACLU, which applies only in Illinois). Privacy advocates argued the settlement provides no meaningful deterrent.
For SaaS founders, the Clearview settlement is relevant not because of its scale (most startups will never scrape billions of images) but because of the statute it was brought under. BIPA applies with equal force to a SaaS company that processes a single employee’s fingerprint without proper notice and consent.
The Numbers That Should Get Your Attention
BIPA litigation is not slowing down. In 2025 alone, over 107 new BIPA class action lawsuits were filed in Illinois. More than 1,500 BIPA cases have been filed since 2018. The largest settlements include:
- Facebook (now Meta): $650 million (2020) for collecting facial geometry from photo tagging without consent.
- Meta/Texas: $1.4 billion (2024) in a separate Texas state action under the Texas Capture or Use of Biometric Identifier Act (CUBI) for unauthorized biometric data collection through Facebook and Instagram.
- Google/Texas: $1.375 billion (2025) under CUBI for collecting voiceprints and facial geometry through Google Photos, Google Assistant, and Nest Hub Max without consent.
- Clearview AI: $51.75 million (2025) in the equity-based settlement described above.
Beyond the headline settlements, BIPA litigation hits companies at every scale. Accu-Time settled for $1.5 million over fingerprint time clock systems. Speedway settled for $12.1 million. The typical BIPA defendant is not a surveillance company. It is an employer that deployed a fingerprint time clock without obtaining written consent, or a SaaS vendor whose product collected biometric data without the required disclosures.
BIPA has also become a staple of mass arbitration. Plaintiffs’ firms recruit claimants online, file thousands of individual arbitration demands against a single company, and leverage the accumulated arbitration fees to force settlement. This trend accelerated through 2025 and is expected to continue into 2026.
Why This Matters for SaaS Founders
If your product collects, processes, stores, or transmits biometric identifiers, BIPA applies to you if any of your users or their end users are in Illinois. The statute does not have a geographic exemption for companies headquartered elsewhere. If an Illinois resident’s biometric data enters your system, you are subject to BIPA.
The categories of SaaS products that touch biometric data are broader than most founders realize.
Facial recognition and identity verification. If your product verifies user identity through facial matching, you are collecting facial geometry. This includes login authentication, KYC (know your customer) flows, and age verification systems.
Voice authentication and voice AI. A voiceprint is a mathematical representation of the unique characteristics of an individual’s voice: pitch, tone, cadence, frequency patterns, and other acoustic properties that distinguish one speaker from another. It is the audio equivalent of a fingerprint. A voiceprint is not the same as a voice recording. A recording captures what someone said. A voiceprint captures how they said it, extracting the physiological and behavioral characteristics that make their voice identifiable.
The distinction matters because not every product that processes audio creates a voiceprint. If your product records a customer service call and stores the audio file, that is a voice recording. If your product analyzes voice characteristics to authenticate a user’s identity, detect speaker changes in a conversation, or match a speaker across multiple interactions, it is extracting voiceprint data, and that is a biometric identifier under BIPA.
The practical line for SaaS founders: if your product processes voice data and the output could be used to identify a specific individual by their voice (as opposed to transcribing what they said), you are likely creating or processing a biometric identifier. This includes speaker verification systems, voice-based authentication, AI models that learn to recognize individual speakers, and any feature that distinguishes between speakers in a multi-party conversation based on vocal characteristics. This is directly relevant to the AI voice assistant products discussed earlier in this series. If your product processes voice calls and retains voiceprint data, you have a BIPA obligation on top of the CIPA exposure covered in the first post.
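The practical line above can be sketched as a rough triage helper. This is an engineering heuristic, not a legal test, and every feature name and flag below is illustrative: if a feature's output can identify who is speaking, or the system retains speaker embeddings, flag it for BIPA review; if the output is limited to what was said, it is likely a recording or transcript.

```python
# Rough triage heuristic (a sketch, not a legal test): flag voice
# features whose output could identify a specific speaker. All feature
# names and dict keys here are hypothetical.

def likely_biometric(feature: dict) -> bool:
    """Return True if the feature should get BIPA review."""
    # Pure transcription with no retained acoustic embeddings captures
    # WHAT was said, not HOW -- likely not a voiceprint.
    if feature.get("transcript_only") and not feature.get("retains_embeddings"):
        return False
    # Speaker identification or retained speaker embeddings capture the
    # characteristics that make a voice identifiable.
    return bool(feature.get("identifies_speaker") or feature.get("retains_embeddings"))

features = [
    {"name": "call recording storage", "transcript_only": False},
    {"name": "speech-to-text", "transcript_only": True},
    {"name": "speaker verification login", "identifies_speaker": True},
    {"name": "diarization with speaker matching", "retains_embeddings": True},
]
flagged = [f["name"] for f in features if likely_biometric(f)]
print(flagged)  # only the speaker-identifying features are flagged
```

Note that stored audio alone does not trip the heuristic: a recording becomes a voiceprint problem only when speaker-identifying characteristics are extracted or retained.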
Fingerprint and biometric access systems. If your product integrates with fingerprint scanners, palm readers, or other biometric access devices (common in workforce management, physical security, and attendance tracking), you are processing biometric identifiers.
AI training on biometric data. If your AI model is trained on data that includes facial images, voice recordings, or other biometric inputs, the collection and use of that training data is subject to BIPA. This is the Clearview pattern applied to your product: using biometric data for purposes (model training) that the individual did not consent to.
What Your Legal Stack Needs to Address
Determine whether your product processes biometric identifiers. This is the threshold question. If it does, every subsequent obligation flows from it. Review your product’s data flows and identify whether facial geometry, voiceprints, fingerprints, iris scans, or retina scans enter your system at any point, whether from your users, their end users, or their employees.
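The threshold audit can be organized as a simple data-flow inventory. The sketch below tags each flow against the statute's list of identifiers; the flows themselves are hypothetical examples, not a complete taxonomy:

```python
# Sketch of the threshold audit: enumerate each data flow and check it
# against BIPA's enumerated biometric identifiers. The example flows
# are hypothetical.

BIOMETRIC_IDENTIFIERS = {
    "facial_geometry", "voiceprint", "fingerprint", "iris_scan", "retina_scan",
}

data_flows = [
    {"flow": "KYC selfie match", "data": {"facial_geometry", "photo"}},
    {"flow": "support call transcript", "data": {"audio", "text"}},
    {"flow": "time clock integration", "data": {"fingerprint", "timestamp"}},
]

# A flow is in scope if it carries any enumerated identifier.
in_scope = [f["flow"] for f in data_flows if f["data"] & BIOMETRIC_IDENTIFIERS]
print(in_scope)  # flows that trigger the obligations below
```

Anything in the in-scope list inherits every obligation that follows: notice, written release, published retention policy, and purpose limitation.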
Written notice and written consent before collection. BIPA requires both. The notice must describe the specific purpose of the collection and the length of time the data will be stored. The consent must be a written release. Clickwrap agreements can satisfy the written release requirement if properly structured, but the notice must be specific to biometric data, not buried in a general privacy policy.
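One way to make those requirements operational is to model the consent record so that collection is structurally impossible without it. The field names below are assumptions about what such a record might capture, not a prescribed schema:

```python
# Sketch of a BIPA consent record (field names are assumptions): the
# notice must state a specific purpose and storage duration, and a
# written release must exist before any collection occurs.

from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass(frozen=True)
class BiometricConsent:
    subject_id: str
    purpose: str                  # specific purpose, e.g. "attendance tracking"
    retention_days: int           # storage duration stated in the notice
    notice_version: str           # which notice text the subject saw
    release_signed_at: datetime   # timestamp of the written release

def may_collect(consent: Optional[BiometricConsent], now: datetime) -> bool:
    """Collection is permitted only if a written release predates it."""
    return consent is not None and consent.release_signed_at <= now

consent = BiometricConsent(
    subject_id="emp-042",
    purpose="attendance tracking",
    retention_days=365,
    notice_version="2025-01",
    release_signed_at=datetime(2025, 1, 15, tzinfo=timezone.utc),
)
print(may_collect(consent, datetime(2025, 2, 1, tzinfo=timezone.utc)))  # True
print(may_collect(None, datetime(2025, 2, 1, tzinfo=timezone.utc)))     # False
```

Gating the collection path on the consent record, rather than checking consent after the fact, is what prevents the "collected first, consented later" pattern that drives most BIPA claims.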
Publish a retention and destruction policy. BIPA requires a publicly available written policy establishing a retention schedule and guidelines for permanently destroying biometric data. Most SaaS companies that process biometric data do not have this policy published. If you are subject to BIPA, you need one.
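The retention schedule only matters if something enforces it. BIPA requires destruction when the purpose of collection is satisfied or within three years of the individual's last interaction, whichever comes first; the sketch below models only the simpler published-schedule piece, and the 365-day period is illustrative:

```python
# Sketch of a retention-schedule check against a published policy.
# The 365-day period is illustrative; BIPA also imposes an outer limit
# of three years from the individual's last interaction.

from datetime import date, timedelta

RETENTION = timedelta(days=365)  # from the published retention policy

def destruction_due(collected_on: date) -> date:
    """Date by which the biometric record must be permanently destroyed."""
    return collected_on + RETENTION

def overdue(collected_on: date, today: date) -> bool:
    """True if the record should already have been destroyed."""
    return today > destruction_due(collected_on)

print(destruction_due(date(2024, 3, 1)))            # 2025-03-01
print(overdue(date(2024, 3, 1), date(2025, 6, 1)))  # True
```

A scheduled job that runs this check and actually deletes overdue records is the difference between a published policy and a compliance exhibit for the plaintiff.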
Do not use biometric data for purposes beyond what you disclosed. If your notice says you collect fingerprints for attendance tracking, you cannot use that data for product analytics, AI training, or any other purpose without separate consent. Scope creep in biometric data use is how BIPA claims originate.
Your DPA and vendor agreements need biometric-specific provisions. If your customer provides you with biometric data belonging to their employees or end users, your DPA needs to address BIPA compliance specifically: who is responsible for obtaining the required notice and consent, what purposes the data may be used for, what retention and destruction obligations apply, and what happens if BIPA is violated. Standard DPA language covering “personal data” generically does not satisfy BIPA’s specific requirements.
Audit your subprocessors for biometric data extraction and retention. This is the risk most founders miss. If your product sends audio to a third-party provider for transcription, speech-to-text, or voice analysis, you need to know whether that provider extracts or retains voiceprint data as part of its processing. Many speech-to-text APIs process acoustic features that constitute voiceprint characteristics, even if the primary output is a text transcript. Some providers retain audio or derived acoustic data for model improvement unless you explicitly opt out. If your subprocessor creates or retains biometric identifiers from data you send them, their data practices become your BIPA liability. You collected the data from the end user. You transmitted it to the subprocessor. The end user’s BIPA claim runs against you, not against your API provider.
The fix is concrete: review every subprocessor that touches audio, image, or video data. Confirm in writing whether the subprocessor extracts, retains, or uses biometric identifiers (voiceprints, facial geometry, fingerprints) from the data you send. If it does, ensure that the subprocessor’s data processing terms restrict use to the purposes you disclosed to the end user, prohibit retention beyond the processing window, and require deletion on the same schedule your published retention policy commits to. If the subprocessor cannot or will not make those commitments, it should not be processing biometric data on your behalf.
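The review described above reduces to a checklist. In the sketch below, the vendor entries and field names are hypothetical; a subprocessor passes only if it either creates no biometric identifiers from your data, or has confirmed all three restrictions in writing:

```python
# Sketch of the subprocessor review as a checklist. Vendor entries and
# field names are hypothetical.

def subprocessor_ok(vendor: dict) -> bool:
    if not vendor.get("extracts_or_retains_biometrics"):
        return True  # no biometric identifiers created from your data
    # If it does extract or retain biometrics, all three written
    # commitments from the section above must be in place.
    return all((
        vendor.get("use_limited_to_disclosed_purpose", False),
        vendor.get("no_retention_beyond_processing", False),
        vendor.get("deletion_matches_your_schedule", False),
    ))

vendors = [
    {"name": "transcription-api",
     "extracts_or_retains_biometrics": True,
     "use_limited_to_disclosed_purpose": True,
     "no_retention_beyond_processing": False,  # retains audio for training
     "deletion_matches_your_schedule": True},
    {"name": "email-provider",
     "extracts_or_retains_biometrics": False},
]
failing = [v["name"] for v in vendors if not subprocessor_ok(v)]
print(failing)  # vendors that should not handle biometric data for you
```

The failing vendor here fails on retention for model training, which is exactly the default many audio APIs ship with unless you opt out.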
Insurance coverage is uncertain. Recent court decisions have limited insurance coverage for BIPA violations. General liability policies may or may not cover BIPA claims depending on how the policy defines “personal injury” and whether the court treats a BIPA violation as a covered “publication” of private information. If your product processes biometric data, confirm your coverage position with your broker before you need it, not after a claim is filed.
The Through-Line
This series has covered wiretapping claims against AI voice assistants, fabricated consent in medical records, BAA breaches by health-tech vendors, algorithmic discrimination by hiring tools, FTC enforcement against overstated AI marketing, and platform owners using computer fraud statutes to block AI agents.
The Clearview case adds the final layer: biometric data as the highest-risk category of personal information a SaaS product can process. BIPA’s combination of a private right of action, per-violation statutory damages, no proof of harm requirement, and the permanent, unchangeable nature of biometric identifiers creates a liability profile unlike any other privacy statute. The settlements prove it: $650 million for Facebook, $1.4 billion for Meta in Texas, $1.375 billion for Google in Texas, $51.75 million for Clearview.
For SaaS founders, the lesson is straightforward. If your product touches biometric data, you need to know it, disclose it, get consent for it, limit your use of it, and build your contracts around it. BIPA does not care whether you are a surveillance company or a seed-stage startup. The statute applies to the data, not to the size of the company processing it.
This is the seventh post in the AI Privacy Litigation series. Previous: Amazon v. Perplexity: When the Platform Sues Your AI Agent.
For the contractual framework on data processing terms and subprocessor management, see DPA Anatomy and Subprocessor Management in the DPA series.
No Boiler provides self-service legal document generation and educational content. This material and our service are not a substitute for legal advice. Please have a qualified attorney review any documents before relying on them. No Boiler is not a law firm, and communications with us do not create an attorney-client relationship or carry any expectation of confidentiality. Use of our platform and content is governed by our Terms of Service and Privacy Policy.