Lawsuit Tests Whether AI Hiring Tools Must Follow FCRA Rules

When it comes to AI, there is a notable lack of legal precedent for governing and holding AI systems accountable, and recruiting is among the sectors under the closest scrutiny. Modern AI recruiting platforms have become so efficient that observers are beginning to ask whether certain functions and outputs of these systems cross a regulatory line. One such question has now made its way to the courts, drawing greater attention to the issue.

The case, Kistler et al. v. Eightfold AI Inc., is being closely watched by many because it outlines an important question: when does an AI recruiting tool transition from analyzing and reporting standard hiring data to curating a consumer report under the Fair Credit Reporting Act (FCRA)?

A consumer report, as defined by the FCRA, is any written, oral, or other communication by a consumer reporting agency (CRA) bearing on a consumer's creditworthiness, credit standing, character, general reputation, or personal characteristics that is used, or expected to be used, in whole or in part to determine a consumer's eligibility for credit or employment.

What Is the Lawsuit?

Two California job seekers, Erin Kistler and Sruti Bhaumik, filed a lawsuit in January 2026 in the Northern District of California against Eightfold AI, a widely used AI hiring platform.

The plaintiffs allege that Eightfold's system gathers information from social media platforms (such as LinkedIn) and combines it with other data from public and purchased sources to generate a numerical score and a detailed individual profile that employers can use to rank and screen applicants. All of this can happen before a human recruiter lays eyes on a single resume. Because these curated profiles provide structured analysis of an individual's character, reputation, and personal characteristics, and are used in employment decisions, the plaintiffs argue that the AI's outputs meet the FCRA's definition of a consumer report.

Furthermore, the plaintiffs argue that despite producing what should be considered consumer reports, Eightfold does not adhere to FCRA requirements. Under the FCRA, consumer reports used for employment purposes require the individual's consent, and individuals must be given the opportunity to review, dispute, and correct any erroneous information before an adverse action is taken.

According to Eightfold, the platform operates solely on information voluntarily shared by candidates and data provided by its customers and does not actively scrape social media.

Why This Case Matters

This lawsuit is among the first to claim that AI platforms are acting as CRAs. While we may traditionally think of CRAs as brick-and-mortar background screening companies or one of the credit bureau titans, the landscape may be changing. FCRA regulation may have to begin accounting for technological evolution.

This case is important for employers and CRAs alike because it may set a precedent for how AI hiring tools are regulated.

A ruling in favor of the plaintiffs could mean: 

  • AI vendors must add full FCRA disclosures, authorizations, and dispute rights to their platforms.
  • CRAs may need to review contracts with AI partners to ensure proper certifications and compliance processes are in place.
  • Employers could risk FCRA non-compliance if they fail to apply adverse action protocols to decisions driven by AI outputs deemed to constitute a consumer report.

Even if the case settles before a definitive ruling, it is already affecting how organizations view AI technologies used in hiring. It is beginning to highlight potential issues with transparency and compliance that could have major implications down the road.

Current Status

The class-action lawsuit is still in its early stages. Eightfold has received an extension to respond to the complaint, and no rulings on dismissal or class certification have yet been issued. Legal experts expect the case to move forward slowly, but it is already generating discussion in industry circles about the future of AI in hiring.
