AI Compliance & Regulation · Guide

Illinois BIPA: The Strictest Biometric Privacy Law in America and What It Means for AI

February 12, 2026 · 13 min read · Michael Lansdowne Hauge
Updated March 15, 2026
For: CISO · Legal/Compliance · CHRO · IT Manager

Illinois BIPA is the most protective biometric privacy law in the US, with a private right of action and penalties up to $5,000 per violation. If your AI system processes facial recognition, voiceprints, or other biometric data, you need to comply.


Key Takeaways

  1. Private right of action — individuals can sue directly without proving actual harm
  2. Penalties of $1,000 per negligent violation and $5,000 per intentional violation
  3. Each biometric scan without consent is a separate violation (Cothron v. White Castle, 2023)
  4. Written consent required BEFORE collecting any biometric data — implied consent is not sufficient
  5. Cannot sell, lease, trade, or profit from biometric data under any circumstances
  6. Has produced the largest biometric privacy payouts in US history: Meta ($650M settlement), BNSF Railway ($228M jury verdict), Google ($100M settlement)

What Is Illinois BIPA?

When the Illinois legislature passed the Biometric Information Privacy Act on October 3, 2008, artificial intelligence had not yet entered mainstream enterprise operations. Nearly two decades later, BIPA stands as the strongest biometric data privacy law in the United States, and its relevance has only sharpened as AI systems increasingly depend on facial recognition, voice analysis, emotion detection, and identity verification to function.

The statute's defining feature is its private right of action. Any individual whose biometric data is mishandled can file suit directly, without needing to demonstrate actual harm. That single provision has driven billions of dollars in settlements and elevated BIPA compliance from a legal footnote to a boardroom priority for every organization that touches biometric data.

Why BIPA Matters for AI Companies

The proliferation of AI has dramatically expanded the surface area of biometric data collection. Facial recognition now powers security systems, access control, customer identification, and age verification. Voice recognition underpins authentication protocols, customer service platforms, and meeting transcription tools. Fingerprint and iris scanning remain the backbone of employee time-tracking and building access. A newer generation of AI systems analyzes facial expressions and voice patterns for emotion detection, while gait recognition and other behavioral biometrics are gaining traction across security and retail applications.

The compliance trigger is straightforward: if your AI system collects, captures, processes, stores, or shares any of these data types from Illinois residents, BIPA applies. The law does not distinguish between a startup running a pilot program and a Fortune 500 enterprise operating at scale.

What Biometric Data Is Covered

BIPA defines two categories of protected data. The first is biometric identifiers, which include retina or iris scans, fingerprints, voiceprints, hand geometry scans, and face geometry (the specific measurements AI uses for facial recognition). The second is biometric information, a broader category encompassing any information derived from a biometric identifier that is used to identify an individual, regardless of how it is captured or converted.

What Is NOT Covered

The statute explicitly excludes writing samples, written signatures, photographs (though facial geometry extracted from photographs is covered), demographic data, physical descriptions, tattoo descriptions, and information captured in a medical or healthcare setting. Organizations should note that the photograph exclusion is narrow. The moment an AI system extracts geometric measurements from a photograph, those measurements fall squarely within BIPA's scope.

Core Requirements

1. Written Policy

Every private entity possessing biometric data must develop and publicly publish a written policy that establishes a retention schedule and guidelines for permanently destroying biometric data. Destruction must occur either when the initial purpose for collection has been fulfilled or within three years of the individual's last interaction with the entity, whichever comes first. The policy must also address guidelines for storing, transmitting, and protecting biometric data throughout its lifecycle.
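The two destruction triggers above reduce to a small date computation: the earlier of purpose fulfillment or three years after the last interaction. The sketch below is illustrative only, not legal guidance; it assumes a `purpose_fulfilled` date is tracked per record and approximates three years as 1,095 days.

```python
from datetime import date, timedelta
from typing import Optional

THREE_YEARS = timedelta(days=3 * 365)  # approximation of the statutory 3 years

def destruction_deadline(last_interaction: date,
                         purpose_fulfilled: Optional[date]) -> date:
    """Earlier of: purpose fulfilled, or 3 years after last interaction."""
    statutory_limit = last_interaction + THREE_YEARS
    if purpose_fulfilled is None:
        return statutory_limit  # purpose still active: the 3-year clock governs
    return min(purpose_fulfilled, statutory_limit)

# Purpose fulfilled before the 3-year mark, so the earlier date controls
deadline = destruction_deadline(date(2023, 1, 1), date(2025, 1, 1))
```

A retention job built on this kind of rule would flag `deadline` for permanent destruction and record the destruction in an audit trail.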

2. Informed Written Consent

Before collecting any biometric data, the entity must inform the subject in writing that biometric data is being collected and stored. The entity must further inform the subject in writing of the specific purpose and the length of time for which the data will be stored and used. Finally, the entity must receive a written release from the subject authorizing the collection and storage. This is a strict requirement. Implied consent, verbal consent, and disclosures buried in general terms of service are not sufficient.
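The required sequence (written notice, specific disclosure, signed release, then collection) can be enforced as a gate in software. This is a minimal sketch with hypothetical field names, assuming consent records are stored before any capture call is permitted; it is not a compliance guarantee on its own.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class BiometricConsent:
    subject_id: str
    purpose: str               # specific purpose disclosed in writing
    retention_days: int        # disclosed storage duration
    signed_at: Optional[datetime] = None  # written release (electronic is acceptable)

def may_collect(consent: Optional[BiometricConsent]) -> bool:
    """Collection is allowed only after a signed written release.
    Implied or verbal consent never passes this gate."""
    return consent is not None and consent.signed_at is not None
```

A capture pipeline would call `may_collect` before every enrollment and refuse to proceed when it returns `False`.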

3. No Sale or Profit

Private entities may not sell, lease, trade, or otherwise profit from a person's biometric data. This prohibition is absolute and admits no exceptions.

4. Limited Disclosure

Biometric data may not be disclosed to third parties unless the subject consents, the disclosure completes a financial transaction requested by the subject, a valid warrant or subpoena compels it, or disclosure is required by state or federal law.

5. Security Standards

Entities must store, transmit, and protect biometric data using a standard of care that is reasonable given the sensitivity of the information and at least equal to the standard applied to other confidential and sensitive data within the organization.

Penalties and Enforcement

BIPA's enforcement mechanism is what separates it from every other privacy statute in the United States.

Private Right of Action

Any person aggrieved by a BIPA violation may sue. They do not need to prove actual harm or financial loss. The violation itself is sufficient to establish standing. Negligent violations carry a penalty of $1,000 per violation, while intentional or reckless violations carry $5,000 per violation.

Claims Accrue Per Violation

In its February 2023 ruling in Cothron v. White Castle, the Illinois Supreme Court held that a separate claim accrues each time biometric data is scanned or transmitted without consent, not merely on the first occasion. The practical implications were staggering. An employee who used a fingerprint scanner to clock in every workday for three years without proper consent generated a separate violation with each scan, and a company with 500 employees in that position faced hundreds of thousands of individual violations and billions of dollars in potential liability. An August 2024 amendment (SB 2979) subsequently tempered this exposure by providing that repeated collection of the same biometric identifier from the same person by the same method gives rise to, at most, a single recovery.
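The scale of per-scan exposure is simple arithmetic. The figures below are illustrative assumptions (500 employees, 250 workdays per year, two scans per day for clock-in and clock-out, three years) applied under the per-scan accrual standard Cothron described, not a damages model.

```python
employees = 500
workdays_per_year = 250
scans_per_day = 2          # clock in + clock out
years = 3

violations = employees * workdays_per_year * scans_per_day * years
negligent_exposure = violations * 1_000    # $1,000 per negligent violation
intentional_exposure = violations * 5_000  # $5,000 per intentional violation

print(violations)            # 750,000 separate violations
print(intentional_exposure)  # $3,750,000,000 — billions from one time clock
```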

Major Settlements

The financial consequences are not hypothetical. Meta (formerly Facebook) paid $650 million in 2021 to settle claims arising from facial recognition in its photo tagging feature. Google settled for $100 million in 2022 over its Google Photos face-grouping technology. TikTok agreed to a $92 million settlement in 2022 for collecting biometric data from minors. BNSF Railway was hit with a $228 million jury verdict in October 2022 for fingerprint scanning of truck drivers at its facilities; the damages award was later vacated, and the case settled for $75 million in 2024.

How This Applies to AI Systems

Facial Recognition AI

Any AI system that captures, analyzes, or stores facial geometry falls within BIPA's reach, whether the use case involves security cameras, identity verification, age estimation, or customer analytics. Compliance requires written consent from every person whose face is scanned, a clear explanation of why the data is being collected and how long it will be retained, a publicly available biometric data policy, and a commitment never to sell or share the data without explicit consent.

Voice AI and Speech Recognition

AI systems that analyze voiceprints for voice authentication, customer service analytics, meeting transcription with speaker identification, or emotion detection face the same requirements. Each voice capture without consent constitutes a potential separate violation under the Cothron v. White Castle standard.

Employee Biometric Systems

AI-powered time clocks, building access systems, and attendance trackers that rely on fingerprints, facial recognition, or hand geometry must all comply. Notably, a significant share of BIPA litigation has involved employee-facing biometric systems, making this one of the highest-risk categories for organizations.

Computer Vision and Retail AI

Retail AI systems that use cameras to analyze customer behavior, detect shoplifting, or identify repeat customers through facial recognition must comply if any of those customers are Illinois residents. The residency of the data subjects, not the physical location of the camera, determines applicability.

How to Comply

Step 1: Biometric Data Audit

The foundation of compliance is a comprehensive audit that identifies every point in the organization where biometric data is collected, processed, or stored. This should encompass employee-facing systems such as time clocks, building access, and devices; customer-facing systems including identity verification, facial recognition, and voice authentication; internal AI tools like meeting transcription with speaker identification and emotion analysis; and all third-party tools and vendors that process biometric data on the organization's behalf.
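An audit register can start as a plain inventory keyed to the categories above. The sketch below is a hypothetical structure, with invented system names and fields, meant only to show how grouping by category makes each risk area reviewable.

```python
from collections import defaultdict

# Hypothetical inventory entries gathered during the audit
inventory = [
    {"system": "time clock",       "category": "employee", "data": "fingerprint",   "vendor": None},
    {"system": "door access",      "category": "employee", "data": "face geometry", "vendor": "third-party"},
    {"system": "voice auth (IVR)", "category": "customer", "data": "voiceprint",    "vendor": "third-party"},
    {"system": "meeting notes",    "category": "internal", "data": "voiceprint",    "vendor": "third-party"},
]

def by_category(items):
    """Group systems so each category (employee-facing, customer-facing,
    internal, vendor-operated) can be reviewed for consent, retention,
    and contractual obligations."""
    grouped = defaultdict(list)
    for item in items:
        grouped[item["category"]].append(item["system"])
    return dict(grouped)
```

Each grouped entry then becomes a line item in the written policy and the vendor review in the later steps.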

Step 2: Written Biometric Data Policy

Organizations must create and publish a policy that specifies what biometric data is collected and for what purpose, how long it is retained (with destruction required within three years of the individual's last interaction or upon fulfillment of the collection purpose), how the data is stored and protected, who has access, and how individuals can request deletion.

Step 3: Written Consent Process

A clear, standalone written consent process must be implemented. The consent mechanism should be separate from general terms of service, clearly state what data is collected, explain why and for how long, require affirmative written consent (digital signatures are acceptable), and provide an opt-out option where feasible.

Step 4: Vendor Compliance

Organizations using third-party AI tools that process biometric data must ensure vendor agreements include BIPA compliance obligations, verify that vendors do not sell or share biometric data, confirm that vendors meet reasonable security standards, and include indemnification clauses covering BIPA violations.

Step 5: Security Measures

Security protections for biometric data must meet or exceed the standard applied to other confidential information within the organization. At a minimum, this means encryption at rest and in transit, access controls with logging, regular security audits, and incident response plans tailored specifically to biometric data breaches.
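The "at least equal standard of care" requirement implies, at minimum, access control with an audit trail around any biometric store. The class below is a minimal standard-library sketch of that pattern; all names are illustrative, and encryption at rest and in transit (e.g., AES-GCM, TLS) would wrap this layer in a real deployment.

```python
from typing import Dict, List, Set, Tuple

class BiometricVault:
    """Access-controlled store with an append-only audit log."""

    def __init__(self, authorized: Set[str]):
        self._templates: Dict[str, bytes] = {}
        self._authorized = authorized
        self.audit_log: List[Tuple[str, str, str, str]] = []

    def _check(self, actor: str, action: str, subject_id: str) -> None:
        # Every access attempt, allowed or not, is logged before acting
        status = "OK" if actor in self._authorized else "DENIED"
        self.audit_log.append((actor, action, subject_id, status))
        if status == "DENIED":
            raise PermissionError(f"{actor} may not {action} {subject_id}")

    def put(self, actor: str, subject_id: str, template: bytes) -> None:
        self._check(actor, "put", subject_id)
        self._templates[subject_id] = template

    def get(self, actor: str, subject_id: str) -> bytes:
        self._check(actor, "get", subject_id)
        return self._templates[subject_id]

    def destroy(self, actor: str, subject_id: str) -> None:
        # Supports the retention policy's permanent-destruction step
        self._check(actor, "destroy", subject_id)
        self._templates.pop(subject_id, None)
```

The audit log is what an incident response plan tailored to biometric data would draw on after a suspected breach.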

States with Similar Laws

While BIPA remains the most powerful biometric privacy statute in the country, other states have enacted their own protections. The Texas Capture or Use of Biometric Identifier Act (CUBI) requires consent but provides no private right of action, leaving enforcement solely to the state Attorney General. Washington State includes biometric identifiers in its privacy law but similarly offers no private right of action. The New York SHIELD Act incorporates biometric data into security breach notification requirements. California's CCPA and CPRA classify biometric data as sensitive personal information, triggering heightened protections under those frameworks.

Organizations building AI systems that process biometric data should be aware of overlapping regulatory requirements. NYC Local Law 144 may apply alongside BIPA if an AI hiring tool uses video analysis that captures biometric data. The EU AI Act classifies biometric identification systems as either high-risk or prohibited depending on the use case. The Colorado AI Act creates additional obligations when biometric AI is used for consequential decisions. GDPR Article 9 extends special category data protections to biometric data processed within the European Union. In practice, organizations operating across jurisdictions will need to design compliance programs that satisfy multiple overlapping regimes simultaneously.

Common Questions

Does BIPA apply to companies located outside Illinois?

Yes. BIPA applies to any private entity that collects, captures, or processes biometric data from Illinois residents, regardless of where the company is located. If your AI system handles biometric data from people in Illinois, you must comply.

Can individuals sue directly under BIPA?

Yes. BIPA is one of the only US privacy laws with a private right of action. Any individual whose biometric data is mishandled can file a lawsuit. They do not need to prove actual harm — the violation itself is sufficient. This has led to hundreds of class-action lawsuits and settlements totaling billions of dollars.

Are photographs covered by BIPA?

A photograph by itself is explicitly excluded from BIPA's definition of biometric data. However, if facial geometry is extracted from a photograph — such as through facial recognition AI — that extracted geometry IS biometric data covered by BIPA.

How long can biometric data be retained?

Biometric data must be permanently destroyed when the initial purpose for collection has been fulfilled, or within 3 years of the individual's last interaction with the entity — whichever comes first. Your written policy must specify your retention schedule.

Does electronic consent satisfy the written consent requirement?

Yes, digital consent mechanisms satisfy the written consent requirement, including electronic signatures and click-through consent forms. However, the consent must be specific to biometric data collection (not buried in general terms of service), clearly explain what data is collected and why, and be affirmatively granted by the individual.

Does BIPA cover employee time-tracking systems?

Yes. Employee fingerprint and facial recognition time-tracking systems are one of the most common BIPA violation scenarios. Employers must provide written notice, obtain written consent, and publish a biometric data policy before requiring employees to use these systems.

References

  1. Biometric Information Privacy Act (740 ILCS 14). Illinois General Assembly (2008).
  2. Biometric Information Privacy Act (BIPA) Campaign. ACLU of Illinois (2024).
  3. Illinois Revises Biometrics Law To Reduce Prospect of Ruinous Damage Awards. Davis Wright Tremaine (2024).
  4. Year in Review: 2024 BIPA Litigation Takeaways. WilmerHale (2025).
  5. How Will the Recent Amendments to Illinois's BIPA Affect the Use of Biometric Data? American Bar Association (2024).
  6. BIPA Update: Illinois Limits Liability and Clarifies Electronic Consent. Greenberg Traurig (2024).
  7. Illinois Reins in Astronomical Damages Under Biometric Privacy Law. Foley & Lardner (2024).
Michael Lansdowne Hauge

Managing Partner · HRDF-Certified Trainer (Malaysia), Delivered Training for Big Four, MBB, and Fortune 500 Clients, 100+ Angel Investments (Seed–Series C), Dartmouth College, Economics & Asian Studies

Advises leadership teams across Southeast Asia on AI strategy, readiness, and implementation. HRDF-certified trainer with engagements for a Big Four accounting firm, a leading global management consulting firm, and the world's largest ERP software company.

