Unvalidated Forensic Science

Toolmark Analysis:
Toolmarks are the impressions that a firearm’s interior structure leaves on the brass or lead of bullets fired from it. Firearms examiners often testify that the striations created by an individual gun are unique and that, by analyzing the marks on bullets found at a crime scene, one can trace those bullets to the gun(s) that fired them.[1] However, a lack of statistical data hinders examiners’ ability to reliably trace ammunition back to a particular firearm. The scientific community has not established how much variability exists among different firearms’ toolmarks, so there is no way to determine how many points of similarity are necessary to match a bullet to a gun.[2] Additionally, toolmark databases compiled since the 1990s have revealed that many firearms of the same make and model create virtually indistinguishable marks, indicating that bullets cannot necessarily be traced back to a specific gun.[3] These findings make toolmark analysis an inherently unreliable form of forensic science.


Compositional Analysis of Bullet Lead (CABL):
The scientific community has also recently called into question testimony relating to compositional analysis of bullet lead, or CABL. CABL is a process by which forensic scientists compare the elemental composition of bullet lead to determine whether different bullets have a similar makeup.[4] FBI agents have testified in CABL cases that two or more bullets originated from the same box of ammunition, or from another box of ammunition produced at the same place on or about the same date.[5] The agents then used this testimony to link bullets found at the crime scene to bullets in a defendant’s possession.[6]
After a twelve-month study of CABL, the National Research Council published its findings in a 2004 report titled Forensic Analysis: Weighing Bullet Lead Evidence. The Council concluded that: (1) “[t]he opinions in some cases indicate that prosecutors and courts have overstated the probative impact of [CABL bullet] matching evidence,” (2) “references to ‘boxes’ of ammunition in any form are seriously misleading,” and (3) “testimony that the crime bullet came from the defendant’s box or from a box manufactured at the same time, is also objectionable because it may be understood as implying a substantial probability that the bullet came from the defendant’s box.”[7] Based on the Council’s findings, the FBI discontinued its practice of providing CABL testimony in 2005.[8] Nonetheless, many of those convicted based on CABL evidence remain imprisoned.


Arson Science:
Over the past decade, fire investigation techniques that had been used for more than forty years have been cast into serious doubt, opening a new avenue for challenging wrongful convictions.
A primary goal of arson investigation is to determine whether an accelerant was used to start the fire. Arson investigators then testify that the use of an accelerant shows intentional human action, as opposed to accidental, electrical, or natural causation. Despite initial heated debate over its conclusions, the National Fire Protection Association’s arson and fire investigation guide, NFPA 921, is now treated as the authoritative source on proper accelerant detection procedures.[9] NFPA 921 outlines investigation best practices and dispels many of the myths that have formed the basis of arson investigation for decades.[10] Specifically, NFPA 921 provides that, without additional testing and information, the following “indicators” are not reliable signs that an accelerant was present at the scene of a fire:

Irregular burn patterns (also referred to as pour patterns),[11]
Certain char patterns,[12]
Crazed glass,[14]
Alligatoring of wood,[15] and
Clean burn areas.[16]

NFPA 921 also sheds light on the proper use of accelerant-sniffing dogs.[17] Specifically, it provides that a dog’s alert should be used in conjunction with, and not in place of, other fire investigation methods.[18] While alerts by drug- or explosive-detecting dogs are often relied upon to show the presence of contraband, accelerant-sniffing dogs frequently alert to common household items. In a recent NEIP case, defense counsel’s failure to object to testimony from the handler of an accelerant-sniffing dog formed the basis for overturning an arson conviction.[19] In United States v. Hebshie, United States District Court Judge Nancy Gertner vacated the conviction, recognizing that “the very danger Daubert…sought to avoid occurred: questionable theorizing about arson…[was] presented as ‘science’ to the jury, and as a result Mr. Hebshie was convicted.”[20]

Read About The Evolution of Fire Investigation and Its Impact on Arson Cases

Read the NFPA 921 Report

Read Gertner’s Opinion in the Hebshie case


Fingerprint Analysis:
Although fingerprint evidence has long been considered a reliable way of identifying criminals, the infallibility of this forensic method has recently been called into question. Crime scene investigators use partial latent fingerprints found at a crime scene to identify potential suspects. Yet recent studies have shown that fingerprint analysis is not as accurate as once thought.[21] Although the scientific community does not doubt that each person’s fingerprints are unique, questions have been raised as to whether a partial latent print can reliably be matched to the full prints made when a person’s fingers are pressed flat to produce an inked impression.[22] The impression left by a finger differs each time it comes into contact with a surface, depending on the angle of the finger, the amount of pressure the finger exerts on the surface, and how much of the finger is captured in the impression.[23] Thus, even though fingerprints are unique, distortions in prints taken from a crime scene mean that prints from two different people may be confused, and impressions made by the same finger may not look similar enough for an examiner to identify them as coming from the same source.[24]

This problem is exacerbated by the lack of standards for fingerprint examiners and the subjectivity involved in fingerprint analysis.  Many agencies that conduct fingerprint analysis do not require that latent print examiners be certified.  This is particularly problematic since the results of fingerprint analysis are largely a product of human interpretation.  There are no particular measurements or standard test protocols for fingerprint analysis.[25] This means that a fingerprint examination is not necessarily consistent from examiner to examiner, and studies show that even an analysis made by the same examiner may not be the same when repeated at a later time in a different context.[26] Although the Scientific Working Group on Friction Ridge Analysis, Study and Technology has developed a technique for analyzing crime scene fingerprints, the proposed method does not protect against examiner bias, does not ensure that results will be the same when the analysis is repeated, and does not guarantee that two examiners who follow the method will obtain the same results.[27]  A study analyzing the scientific validity of this method produced the following conclusion: “We have reviewed available scientific evidence of the validity of the [fingerprint analysis] method and found none.”[28]
In recent years, the legal community has begun to recognize the shortcomings of fingerprint testimony.  Jennifer Mnookin, Professor of Law at the University of California, Los Angeles School of Law, studied fingerprint analysis and came to the following conclusion: “Given the general lack of validity testing for fingerprinting; the relative dearth of difficult proficiency tests; the lack of a statistically valid model of fingerprinting; and the lack of validated standards for declaring a match, . . . claims of absolute, certain confidence in identification are unjustified. . . . Therefore, in order to pass scrutiny under Daubert, fingerprint identification experts should exhibit a greater degree of epistemological humility.  Claims of ‘absolute’ and ‘positive’ identification should be replaced by more modest claims about the meaning and significance of a ‘match.’”[29] 

Additionally, in an October 2007 ruling, a Maryland court held that a fingerprint expert could not testify that a crime scene print matched the defendant’s prints. In that case, Baltimore County Circuit Judge Susan M. Souder noted that traditional fingerprint analysis is “a subjective, untested, unverifiable identification procedure that purports to be infallible.”[30]


Case Study: Terry Patterson
In 2005, NEIP filed an amicus brief in Commonwealth v. Patterson, challenging the fingerprint testimony that was used to convict the defendant. Patterson was convicted in 1995 of the murder of Boston Police Detective John Mulligan. Patterson’s conviction was vacated in 2000 based on ineffective assistance of counsel, and before retrial, Patterson moved to exclude the fingerprint analysis testimony on which his conviction was based. At trial, a member of the Boston Police Department had testified that four fingerprints found on the window of the truck in which the detective was shot matched Patterson’s fingerprints. The witness explained that although the fingerprints could not be linked to Patterson when considered individually, when considered as a whole they were a match to fingerprints from one of Patterson’s hands.

The Massachusetts Supreme Judicial Court held that analysis related to fingerprints that have been grouped together for examination, known as “simultaneous impressions,” does not meet the reliability standards set forth in Daubert and Lanigan. The court stated that the theory and methodology of latent fingerprint identification cannot be “applied reliably to simultaneous impressions not capable of being individually matched to any of the fingers that supposedly made them.”

Read NEIP’s amicus brief for Patterson


[1] National Research Council, Strengthening Forensic Science in the United States: A Path Forward 5-18 (The National Academies Press 2009), available at http://www.nap.edu/catalog.php?record_id=12589. 

[2] Id. at 5-21.

[3] Adina Schwartz, A Systemic Challenge to the Reliability and Admissibility of Firearms and Toolmark Identification, 6 Colum. Sci. & Tech. L. Rev. 2 (2005) (citing Joseph J. Masson, Confidence Level Variations in Firearms Identification through Computerized Technology, 29(1) Ass’n Firearms & Tool Mark Examiners J. 42 (1997)).

[4] National Research Council, Forensic Analysis: Weighing Bullet Lead Evidence 1 (The National Academies Press 2004), available at http://www.nap.edu/catalog.php?record_id=10924.

[5] See, e.g., Com. v. Daye, 587 N.E.2d 194 (Mass. 1992); National Research Council, Forensic Analysis: Weighing Bullet Lead Evidence 113 (The National Academies Press 2004), available at http://www.nap.edu/catalog.php?record_id=10924.

[6] See, e.g., Com. v. Daye, 587 N.E.2d 194 (Mass. 1992).

[7] National Research Council, Forensic Analysis: Weighing Bullet Lead Evidence 94, 113 (The National Academies Press 2004), available at http://www.nap.edu/catalog.php?record_id=10924.

[8] Press Release, Federal Bureau of Investigation, FBI Laboratory Announces Discontinuation of Bullet Lead Examinations (Sept. 1, 2005).

[9] See Daniel L. Churchward, NFPA 921: Past, Present and Future, International Symposium on Fire Investigation (Kodiak Enterprises, Inc., USA 2006).

[10] See National Fire Protection Association, NFPA 921 Guide for Fire and Explosion Investigations (2008 Ed.).

[11] Id. at §

[12] Id. at §

[13] Id. at §

[14] Id. at §

[15] Id. at §

[16] Id. at §

[17] Id. at §,

[18] Id. at §

[19] United States v. Hebshie, No. 02CR10185-NG, 2010 WL 4722040 (D. Mass. Nov. 15, 2010).

[20] Id. at *4.

[21] National Research Council, Strengthening Forensic Science in the United States: A Path Forward 1-6 to 1-7 (The National Academies Press 2009), available at http://www.nap.edu/catalog.php?record_id=12589.

[22] Id. at 1-7.

[23] Id. at 5-10.

[24] Id. at 5-13.

[25] Id. at 5-9.

[26] National Research Council, Strengthening Forensic Science in the United States: A Path Forward 4-10 (The National Academies Press 2009), available at http://www.nap.edu/catalog.php?record_id=12589.

[27] Id. at 5-12.

[28] Id. (citing J.L. Mnookin. 2008. The validity of latent fingerprint identification: Confessions of a fingerprinting moderate. Law, Probability and Risk 7:19).

[29] Id. at 5-11 to 5-12 (citing J.L. Mnookin. 2008. The validity of latent fingerprint identification: Confessions of a fingerprinting moderate. Law, Probability and Risk 7:127).

[30] Id. at 1-7 (citing State of Maryland v. Bryan Rose. In the Circuit Court for Baltimore County. Case No. K06-545).