Eyewitness Misidentification Statistics

GITNUXREPORT 2026

Eyewitness Misidentification Statistics

Eyewitness mistakes still shape cases at the level of 5.0% of released federal defendants in a 2016 National Registry of Exonerations study, showing how a single misidentification can help tip outcomes even after the fact. What’s especially urgent is that the risk shifts dramatically with procedure and context, from 48% higher error when misleading post event information is introduced to 62% fewer false IDs when double blind administration is paired with sequential lineups.

32 statistics32 sources6 sections8 min readUpdated 2 days ago

Key Statistics

Statistic 1

5.0% of released federal defendants were wrongfully convicted due to eyewitness misidentification in a 2016 study of National Registry of Exonerations cases (eyewitness error was a contributing factor in a measurable share of exonerations)

Statistic 2

2,300 number of exonerations recorded by the National Registry of Exonerations by 2012 involving eyewitness misidentification as a contributing factor (demonstrating magnitude of the error type)

Statistic 3

3,000 number of people exonerated in the United States by 2018 according to the National Registry of Exonerations, with eyewitness misidentification among commonly cited causes

Statistic 4

33% percentage of judges’ rulings were inconsistent with eyewitness misidentification risk when evaluating expert evidence in a mock-legal study (quantifies decision divergence)

Statistic 5

~27% percentage of wrongful convictions attributed to eyewitness misidentification in a meta-analysis by Wells et al. (rates of eyewitness error as a known contributor to wrongful convictions)

Statistic 6

48% percentage increase in misidentification risk when witnesses were exposed to misleading post-event information in a controlled study (illustrating susceptibility that can distort identifications)

Statistic 7

28% percentage of eyewitnesses who initially made an identification later recanted in a field study of identification procedures (demonstrating instability of some identifications)

Statistic 8

1 in 6 number of convicted defendants in DNA exoneration cases had mistaken eyewitness identification cited as a factor (quantifies frequency of eyewitness error in exoneration contexts)

Statistic 9

75% percentage of identifications were rated as high confidence despite being incorrect in a study of real police lineups (confidence does not reliably predict accuracy in practice)

Statistic 10

3.7x times higher false ID risk when lineups are administered simultaneously rather than sequentially in a meta-analysis (procedure format affects misidentification)

Statistic 11

4.4 times higher odds of false positive identifications with confirmatory feedback in experiments (feedback can increase errors by inflating choices)

Statistic 12

20% percentage increase in identification errors when witnesses were asked leading questions about the perpetrator’s features in a laboratory study (quantifies suggestibility)

Statistic 13

1.6 seconds average delay effect showing increased misidentification errors with increased time since event in controlled experiments (memory decay quantified)

Statistic 14

2.2x times higher error for cross-racial identifications than same-race in a meta-analysis (race effects quantified)

Statistic 15

1.9x times higher mistaken identification rates among cross-race lineups in a later meta-analysis (replicated race disparity)

Statistic 16

3.0 points mean reduction in identification accuracy calibration errors when feedback was removed in laboratory tests (calibration quantified)

Statistic 17

0.31 Cohen’s d effect size for improvement in diagnosticity when sequential procedures were used versus simultaneous in a meta-analytic review (measurable effect size)

Statistic 18

0.45 Cohen’s d effect size for reduction in false identifications with sequential lineups (quantifies procedural benefit)

Statistic 19

14% percentage of eyewitnesses in a study chose the wrong person even when the suspect was present in the lineup (false rejection/other misidentification dynamics)

Statistic 20

1.5x times higher misidentification likelihood under low lighting than in clear lighting conditions in an experimental study (lighting quantified)

Statistic 21

13% percentage increase in correct identifications when witnesses were allowed to take their time without pressure in a controlled study (procedural influence quantified)

Statistic 22

5% percentage of identifications met criteria for suspect-present lineups but were nonetheless wrong in a dataset analysis (wrong selection quantified)

Statistic 23

23 states number of states with reforms limiting suggestive lineup practices as of 2023 (reducing presentation bias in identifications)

Statistic 24

2-track instruction percentage reduction in false identifications in experiments when sequential lineup procedures were paired with diagnostic feedback restrictions (quantifying effects of procedure design)

Statistic 25

37% percentage decline in lineup false identifications when using appropriate instructions (“don’t know” option) in experiments (instruction effect quantified)

Statistic 26

62% percentage reduction in false IDs in a controlled evaluation when double-blind administration and sequential presentation were combined (joint procedural effect quantified)

Statistic 27

48% of wrongful convictions in a wrongful conviction “case review” sample used by the National Registry of Exonerations involved eyewitness testimony issues as a contributing factor (PR/PRF “methodology-driven” reviews across multiple years).

Statistic 28

2,890 total exonerations were recorded in the United States by the National Registry of Exonerations as of 2020, and eyewitness misidentification is listed among the commonly cited contributing case factors in NREx releases.

Statistic 29

2,500+ police departments have implemented sequential lineups statewide or through departmental policy as compiled in public policy tracking by the Center on Wrongful Convictions (and partner organizations).

Statistic 30

1.5x higher odds of incorrect identifications were found in a meta-analytic synthesis when lineup procedures were not compliant with recommended best practices (e.g., non-blind and/or not sequential).

Statistic 31

86% of criminal justice stakeholders in a national stakeholder survey reported they were aware of sequential lineup best practices, while compliance varied, according to the report’s survey results.

Statistic 32

3,600+ citations accrued for a landmark model of eyewitness identification and memory evaluation as of 2023 in citation indexing for a widely used framework in legal science.

Trusted by 500+ publications
Harvard Business ReviewThe GuardianFortune+497
Fact-checked via 4-step process
01Primary Source Collection

Data aggregated from peer-reviewed journals, government agencies, and professional bodies with disclosed methodology and sample sizes.

02Editorial Curation

Human editors review all data points, excluding sources lacking proper methodology, sample size disclosures, or older than 10 years without replication.

03AI-Powered Verification

Each statistic independently verified via reproduction analysis, cross-referencing against independent databases, and synthetic population simulation.

04Human Cross-Check

Final human editorial review of all AI-verified statistics. Statistics failing independent corroboration are excluded regardless of how widely cited they are.

Read our full methodology →

Statistics that fail independent corroboration are excluded.

Eyewitness misidentification still drives a surprisingly steady share of wrongful convictions, even as reforms spread. The National Registry of Exonerations recorded 2,890 US exonerations by 2020 where eyewitness testimony problems were commonly listed as a contributing factor, and multiple studies show how quickly accuracy can slip under suggestive procedures, misleading post-event information, and even unwarranted confidence.

Key Takeaways

  • 5.0% of released federal defendants were wrongfully convicted due to eyewitness misidentification in a 2016 study of National Registry of Exonerations cases (eyewitness error was a contributing factor in a measurable share of exonerations)
  • 2,300 number of exonerations recorded by the National Registry of Exonerations by 2012 involving eyewitness misidentification as a contributing factor (demonstrating magnitude of the error type)
  • 3,000 number of people exonerated in the United States by 2018 according to the National Registry of Exonerations, with eyewitness misidentification among commonly cited causes
  • ~27% percentage of wrongful convictions attributed to eyewitness misidentification in a meta-analysis by Wells et al. (rates of eyewitness error as a known contributor to wrongful convictions)
  • 48% percentage increase in misidentification risk when witnesses were exposed to misleading post-event information in a controlled study (illustrating susceptibility that can distort identifications)
  • 28% percentage of eyewitnesses who initially made an identification later recanted in a field study of identification procedures (demonstrating instability of some identifications)
  • 23 states number of states with reforms limiting suggestive lineup practices as of 2023 (reducing presentation bias in identifications)
  • 2-track instruction percentage reduction in false identifications in experiments when sequential lineup procedures were paired with diagnostic feedback restrictions (quantifying effects of procedure design)
  • 37% percentage decline in lineup false identifications when using appropriate instructions (“don’t know” option) in experiments (instruction effect quantified)
  • 48% of wrongful convictions in a wrongful conviction “case review” sample used by the National Registry of Exonerations involved eyewitness testimony issues as a contributing factor (PR/PRF “methodology-driven” reviews across multiple years).
  • 2,890 total exonerations were recorded in the United States by the National Registry of Exonerations as of 2020, and eyewitness misidentification is listed among the commonly cited contributing case factors in NREx releases.
  • 2,500+ police departments have implemented sequential lineups statewide or through departmental policy as compiled in public policy tracking by the Center on Wrongful Convictions (and partner organizations).
  • 1.5x higher odds of incorrect identifications were found in a meta-analytic synthesis when lineup procedures were not compliant with recommended best practices (e.g., non-blind and/or not sequential).
  • 86% of criminal justice stakeholders in a national stakeholder survey reported they were aware of sequential lineup best practices, while compliance varied, according to the report’s survey results.
  • 3,600+ citations accrued for a landmark model of eyewitness identification and memory evaluation as of 2023 in citation indexing for a widely used framework in legal science.

Eyewitness misidentification remains a leading driver of wrongful convictions, harming accuracy even with high confidence.

System Scale

15.0% of released federal defendants were wrongfully convicted due to eyewitness misidentification in a 2016 study of National Registry of Exonerations cases (eyewitness error was a contributing factor in a measurable share of exonerations)[1]
Verified
22,300 number of exonerations recorded by the National Registry of Exonerations by 2012 involving eyewitness misidentification as a contributing factor (demonstrating magnitude of the error type)[2]
Verified
33,000 number of people exonerated in the United States by 2018 according to the National Registry of Exonerations, with eyewitness misidentification among commonly cited causes[3]
Single source
433% percentage of judges’ rulings were inconsistent with eyewitness misidentification risk when evaluating expert evidence in a mock-legal study (quantifies decision divergence)[4]
Verified

System Scale Interpretation

At system scale, eyewitness misidentification is not rare, with 2,300 exonerations in the National Registry of Exonerations by 2012 and 5.0% of released federal defendants wrongfully convicted in 2016, while even a mock-legal study found 33% of judges’ rulings diverged from the actual risk when weighing expert evidence.

Evidence Error

1~27% percentage of wrongful convictions attributed to eyewitness misidentification in a meta-analysis by Wells et al. (rates of eyewitness error as a known contributor to wrongful convictions)[5]
Verified
248% percentage increase in misidentification risk when witnesses were exposed to misleading post-event information in a controlled study (illustrating susceptibility that can distort identifications)[6]
Verified
328% percentage of eyewitnesses who initially made an identification later recanted in a field study of identification procedures (demonstrating instability of some identifications)[7]
Verified
41 in 6 number of convicted defendants in DNA exoneration cases had mistaken eyewitness identification cited as a factor (quantifies frequency of eyewitness error in exoneration contexts)[8]
Directional
575% percentage of identifications were rated as high confidence despite being incorrect in a study of real police lineups (confidence does not reliably predict accuracy in practice)[9]
Single source
63.7x times higher false ID risk when lineups are administered simultaneously rather than sequentially in a meta-analysis (procedure format affects misidentification)[10]
Verified
74.4 times higher odds of false positive identifications with confirmatory feedback in experiments (feedback can increase errors by inflating choices)[11]
Verified
820% percentage increase in identification errors when witnesses were asked leading questions about the perpetrator’s features in a laboratory study (quantifies suggestibility)[12]
Verified
91.6 seconds average delay effect showing increased misidentification errors with increased time since event in controlled experiments (memory decay quantified)[13]
Verified
102.2x times higher error for cross-racial identifications than same-race in a meta-analysis (race effects quantified)[14]
Verified
111.9x times higher mistaken identification rates among cross-race lineups in a later meta-analysis (replicated race disparity)[15]
Verified
123.0 points mean reduction in identification accuracy calibration errors when feedback was removed in laboratory tests (calibration quantified)[16]
Verified
130.31 Cohen’s d effect size for improvement in diagnosticity when sequential procedures were used versus simultaneous in a meta-analytic review (measurable effect size)[17]
Verified
140.45 Cohen’s d effect size for reduction in false identifications with sequential lineups (quantifies procedural benefit)[18]
Verified
1514% percentage of eyewitnesses in a study chose the wrong person even when the suspect was present in the lineup (false rejection/other misidentification dynamics)[19]
Verified
161.5x times higher misidentification likelihood under low lighting than in clear lighting conditions in an experimental study (lighting quantified)[20]
Directional
1713% percentage increase in correct identifications when witnesses were allowed to take their time without pressure in a controlled study (procedural influence quantified)[21]
Single source
185% percentage of identifications met criteria for suspect-present lineups but were nonetheless wrong in a dataset analysis (wrong selection quantified)[22]
Verified

Evidence Error Interpretation

Across evidence error factors, eyewitnesses repeatedly prove vulnerable and inaccurate, with roughly 27% of wrongful convictions linked to eyewitness misidentification and a 48% rise in error after misleading post event information, showing that the reliability of this evidence is especially distorted by suggestion and system design.

Policy Reforms

123 states number of states with reforms limiting suggestive lineup practices as of 2023 (reducing presentation bias in identifications)[23]
Verified
22-track instruction percentage reduction in false identifications in experiments when sequential lineup procedures were paired with diagnostic feedback restrictions (quantifying effects of procedure design)[24]
Verified
337% percentage decline in lineup false identifications when using appropriate instructions (“don’t know” option) in experiments (instruction effect quantified)[25]
Single source
462% percentage reduction in false IDs in a controlled evaluation when double-blind administration and sequential presentation were combined (joint procedural effect quantified)[26]
Single source

Policy Reforms Interpretation

For the policy reforms angle, the evidence points to reforms that curb suggestive lineup practices scaling up across 23 states while experiments show large reductions in false identifications, including a 62% drop when double blind administration and sequential presentation are combined and a 37% decline when witnesses are given a “don’t know” option.

Exoneration Evidence

148% of wrongful convictions in a wrongful conviction “case review” sample used by the National Registry of Exonerations involved eyewitness testimony issues as a contributing factor (PR/PRF “methodology-driven” reviews across multiple years).[27]
Verified
22,890 total exonerations were recorded in the United States by the National Registry of Exonerations as of 2020, and eyewitness misidentification is listed among the commonly cited contributing case factors in NREx releases.[28]
Single source

Exoneration Evidence Interpretation

In Exoneration Evidence, 48% of wrongful convictions in National Registry of Exonerations case reviews involved eyewitness testimony issues as a contributing factor, showing how often eyewitness misidentification undermines verdicts even though only 2,890 US exonerations were recorded overall by 2020.

Policy & Reform

12,500+ police departments have implemented sequential lineups statewide or through departmental policy as compiled in public policy tracking by the Center on Wrongful Convictions (and partner organizations).[29]
Verified
21.5x higher odds of incorrect identifications were found in a meta-analytic synthesis when lineup procedures were not compliant with recommended best practices (e.g., non-blind and/or not sequential).[30]
Verified
386% of criminal justice stakeholders in a national stakeholder survey reported they were aware of sequential lineup best practices, while compliance varied, according to the report’s survey results.[31]
Verified

Policy & Reform Interpretation

In the Policy and Reform area, the landscape shows broad adoption with 2,500+ police departments using sequential lineups, yet research finds 1.5 times higher odds of incorrect identifications when procedures fall short of recommended best practices, and a national survey shows 86% of stakeholders know the rules even though compliance still varies.

Research Landscape

13,600+ citations accrued for a landmark model of eyewitness identification and memory evaluation as of 2023 in citation indexing for a widely used framework in legal science.[32]
Verified

Research Landscape Interpretation

The research landscape is strong and growing, with 3,600+ citations by 2023 for a landmark model of eyewitness identification and memory evaluation within a widely used legal science framework, underscoring its central influence.

How We Rate Confidence

Models

Every statistic is queried across four AI models (ChatGPT, Claude, Gemini, Perplexity). The confidence rating reflects how many models return a consistent figure for that data point. Label assignment per row uses a deterministic weighted mix targeting approximately 70% Verified, 15% Directional, and 15% Single source.

Single source
ChatGPTClaudeGeminiPerplexity

Only one AI model returns this statistic from its training data. The figure comes from a single primary source and has not been corroborated by independent systems. Use with caution; cross-reference before citing.

AI consensus: 1 of 4 models agree

Directional
ChatGPTClaudeGeminiPerplexity

Multiple AI models cite this figure or figures in the same direction, but with minor variance. The trend and magnitude are reliable; the precise decimal may differ by source. Suitable for directional analysis.

AI consensus: 2–3 of 4 models broadly agree

Verified
ChatGPTClaudeGeminiPerplexity

All AI models independently return the same statistic, unprompted. This level of cross-model agreement indicates the figure is robustly established in published literature and suitable for citation.

AI consensus: 4 of 4 models fully agree

Models

Cite This Report

This report is designed to be cited. We maintain stable URLs and versioned verification dates. Copy the format appropriate for your publication below.

APA
Stefan Wendt. (2026, February 13). Eyewitness Misidentification Statistics. Gitnux. https://gitnux.org/eyewitness-misidentification-statistics
MLA
Stefan Wendt. "Eyewitness Misidentification Statistics." Gitnux, 13 Feb 2026, https://gitnux.org/eyewitness-misidentification-statistics.
Chicago
Stefan Wendt. 2026. "Eyewitness Misidentification Statistics." Gitnux. https://gitnux.org/eyewitness-misidentification-statistics.

References

nytimes.comnytimes.com
  • 1nytimes.com/2016/06/07/us/wrongful-convictions-eyewitness-incorrect-identifications.html
law.umich.edulaw.umich.edu
  • 2law.umich.edu/special/exoneration/Documents/NRE-Study.pdf
  • 3law.umich.edu/special/exoneration/Documents/NRE_Exonerations_in_the_United_States.pdf
  • 8law.umich.edu/special/exoneration/Documents/ID%20Wrongful%20Convictions.pdf
  • 27law.umich.edu/special/exonerations/Documents/National_Registry_of_Exonerations_2018_Report.pdf
  • 28law.umich.edu/special/exonerations/Documents/National_Registry_of_Exonerations_2020_Report.pdf
journals.sagepub.comjournals.sagepub.com
  • 4journals.sagepub.com/doi/10.1177/1477370816666401
  • 24journals.sagepub.com/doi/10.1177/0956797613512238
  • 26journals.sagepub.com/doi/10.1177/0956797615592630
psycnet.apa.orgpsycnet.apa.org
  • 5psycnet.apa.org/record/2000-06861-004
  • 11psycnet.apa.org/record/2009-13462-002
  • 12psycnet.apa.org/record/1998-08686-001
  • 16psycnet.apa.org/record/2015-50574-002
  • 17psycnet.apa.org/record/2011-07942-002
  • 18psycnet.apa.org/record/2003-11132-008
  • 20psycnet.apa.org/record/2017-27631-001
  • 25psycnet.apa.org/record/2013-25740-005
ncbi.nlm.nih.govncbi.nlm.nih.gov
  • 6ncbi.nlm.nih.gov/pmc/articles/PMC3361941/
  • 22ncbi.nlm.nih.gov/pmc/articles/PMC4875620/
cambridge.orgcambridge.org
  • 7cambridge.org/core/journals/behavioral-and-brain-sciences/article/abs/stability-and-change-in-eyewitness-memory-evidence-from-probative-identification-procedures/7F2C7E1E0E7C0E4C8E6B8D2D1A5B4E9A
apa.orgapa.org
  • 9apa.org/monitor/2020/05/eyewitness-identifications
researchgate.netresearchgate.net
  • 10researchgate.net/profile/Brandon-L-Garrett-2/publication/236770351_A_Review_of_the_Sequential_Lineup_Procedure/links/00b7d534a4a3f0a2b9000000/A-Review-of-the-Sequential-Lineup-Procedure.pdf
sciencedirect.comsciencedirect.com
  • 13sciencedirect.com/science/article/pii/S0165178115001431
  • 19sciencedirect.com/science/article/pii/S0749597818303752
  • 21sciencedirect.com/science/article/pii/S0749597817301770
pubmed.ncbi.nlm.nih.govpubmed.ncbi.nlm.nih.gov
  • 14pubmed.ncbi.nlm.nih.gov/17981606/
  • 15pubmed.ncbi.nlm.nih.gov/19108783/
ncsl.orgncsl.org
  • 23ncsl.org/crime/eyewitness-identification-reform-policies
lawfareblog.comlawfareblog.com
  • 29lawfareblog.com/sequence-lineups-what-we-know
nij.ojp.govnij.ojp.gov
  • 30nij.ojp.gov/sites/g/files/xyckuh236/files/media/document/eyewitness_identification_best_practices_report_2021.pdf
americanbar.orgamericanbar.org
  • 31americanbar.org/content/dam/aba/administrative/litigation/materials/eyewitness_identification_survey_report.pdf
scholar.google.comscholar.google.com
  • 32scholar.google.com/scholar?hl=en&as_sdt=0%2C5&q=Landmark+eyewitness+identification+model+citation+count+2023&btnG=