Disparate Impact
A facially neutral policy, practice, or algorithm that disproportionately harms a group based on a protected characteristic — even without discriminatory intent. In AI, disparate impact commonly occurs when models trained on historically biased data reproduce or amplify those patterns in their outputs.
Why It Matters
Disparate impact is how most AI discrimination actually manifests. The algorithm does not need to 'intend' to discriminate: if its outcomes disproportionately harm a protected group, legal liability can attach regardless of intent, and the burden shifts to the deployer to show the practice is job-related and consistent with business necessity.
Example
An AI resume screening tool trained on a company's historical hiring data — which skewed heavily male in engineering roles — learns to downrank resumes that mention 'women's college' or 'women in tech,' creating disparate impact against female applicants without any explicit gender filter.
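A common first screen for disparate impact is the selection-rate ratio, often checked against the "four-fifths rule" used in U.S. employment guidance: the protected group's selection rate should be at least 80% of the most-favored group's. Below is a minimal sketch of that check; the function names and all applicant counts are hypothetical, and the four-fifths threshold is a rule of thumb, not a definitive legal test.

```python
# Minimal sketch: screening for disparate impact with the selection-rate
# ratio ("four-fifths rule"). All numbers below are hypothetical.

def selection_rate(selected, total):
    """Fraction of applicants from a group who pass the screen."""
    return selected / total

def disparate_impact_ratio(protected_rate, reference_rate):
    """Protected group's selection rate divided by the reference group's.
    Values below 0.8 are a common red flag under the four-fifths rule."""
    return protected_rate / reference_rate

# Hypothetical resume-screening outcomes:
male_rate = selection_rate(selected=120, total=400)    # 0.30
female_rate = selection_rate(selected=30, total=200)   # 0.15

ratio = disparate_impact_ratio(female_rate, male_rate)  # 0.5
flagged = ratio < 0.8  # True: well below the four-fifths threshold
print(f"ratio={ratio:.2f}, flagged={flagged}")
```

Note that the ratio is computed on outcomes alone, mirroring the definition above: no gender feature appears anywhere in the model or the check.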
Think of it like...
Disparate impact is like a height requirement for firefighters — it doesn't mention gender, but it disproportionately excludes women. The rule looks neutral; the effect is not.
Related Terms
Disparate Treatment
Intentional discrimination based on a protected characteristic. In AI systems, disparate treatment occurs when protected attributes like race, gender, or age are explicitly used as input features for decision-making, or when different rules are applied to different groups by design.
Bias (AI)
Systematic errors in AI system outputs that produce unfair or skewed outcomes. AI bias can originate from training data (historical, representation, measurement, sampling, or aggregation bias), from model design choices, or from the deployment context. Bias is not always obvious and can compound through the AI lifecycle.
Fairness (AI)
The principle that AI systems should produce equitable outcomes and not discriminate against individuals or groups based on protected characteristics. Multiple mathematical definitions of fairness exist — demographic parity, equalized odds, individual fairness, and others — and they frequently conflict with each other, making fairness a design choice, not a single metric.
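The conflict between fairness definitions can be made concrete with a toy example. The sketch below uses entirely hypothetical labels and predictions for two groups with different base rates; on the same predictions, demographic parity (equal selection rates) holds while equalized odds (equal true-positive and false-positive rates) fails.

```python
# Minimal sketch (hypothetical toy data): two fairness metrics can
# disagree about the same classifier when base rates differ by group.

def rates(y_true, y_pred):
    """Return (selection rate, true-positive rate, false-positive rate)."""
    sel = sum(y_pred) / len(y_pred)
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    pos = sum(y_true)
    neg = len(y_true) - pos
    return sel, tp / pos, fp / neg

# Group A: 5 of 10 truly qualified; 4 selected, all of them qualified.
a_true = [1] * 5 + [0] * 5
a_pred = [1] * 4 + [0] * 6

# Group B: 2 of 10 truly qualified; 4 selected, only 2 qualified.
b_true = [1] * 2 + [0] * 8
b_pred = [1] * 4 + [0] * 6

sel_a, tpr_a, fpr_a = rates(a_true, a_pred)
sel_b, tpr_b, fpr_b = rates(b_true, b_pred)

print(sel_a == sel_b)  # demographic parity holds: 0.4 vs 0.4
print(tpr_a == tpr_b)  # equalized odds fails: TPR 0.8 vs 1.0
print(fpr_a == fpr_b)  # ...and FPR 0.0 vs 0.25
```

Satisfying one metric here required violating the other, which is the sense in which fairness is a design choice rather than a single number to optimize.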