Forty-six years ago the National Institute of Law Enforcement and Criminal Justice awarded a grant to the RAND Corporation that would have a profound impact on the future of the United States criminal justice system. The purpose was to conduct a nationwide study of criminal investigation practices involving crimes against persons. It appears to have been the first major study of its kind in the U.S. to use empirical evidence to measure the effectiveness of police detective work. The study, which was widely circulated at the time, estimated that only 3% of all convictions were the result of traditional cold case investigative practices. It found that most convictions came either from the initial police response or from the public later contacting law enforcement with new information (full report, summary).
To improve cold case conviction rates, the final report made recommendations that included placing more responsibility on the initial responding officer, standardized data collection, case screening to manage investigative backlogs, improved training, and increased reliance on physical evidence to identify suspects. This critique appears to have been controversial at the time (described here by Greenwood), but it was no doubt a catalyst for reform over subsequent decades. These reforms, combined with advances in information technology, played a major role in taming the high crime rates of the 1970s and 1980s (Chapter 6, NIJ Fingerprint Sourcebook).
This movement also presumably triggered a surge in forensic science and biometric technology development. The proliferation of Automated Fingerprint Identification Systems (AFIS) across the U.S. in the 1980s, the creation of the FBI’s IAFIS in the 1990s, and its more recent upgrade to Next Generation Identification (NGI) all illustrate how biometric technology advanced to support this direction. These advances, combined with the migration toward strict adherence to ACE-V methods by examiners, have shaped the way fingerprint information is consumed and analyzed by law enforcement. Advancements in DNA profiling have had just as much impact. A general DNA timeline can be found here. The FBI’s CODIS database program began in 1990, and amplification technology has since become so sensitive that it can reliably discriminate sources from a mere touch. However, that level of sensitivity also complicates the interpretation of DNA mixtures.
The cycle has now turned to forensic science administrators, who face pressure to improve the reliability of conclusions and to standardize methods. Much like the pressure law enforcement administrators felt after the 1973 RAND study, this pressure stems from a report the National Academies published in 2009 that was critical of many forensic science methods employed today. A 10-year review of the report by the Innocence Project can be found here. In the wake of the NAS report, the National Commission on Forensic Science was formed, spawning two standardization bodies: the Organization of Scientific Area Committees (which replaced the Scientific Working Groups previously organized by the Department of Justice) and the Academy Standards Board of the AAFS. Those bodies are actively working to publish standards.
What does this all mean? I predict that, through standardization, forensic science, and latent print examination in particular, will become stronger and more reliable. I believe the challenge ahead will be for the industry to automate solutions to many of the burdens this standardization places on forensic technicians.
Going forward, this blog will explore ideas for improving forensic science through automation and for solving day-to-day technology issues. If you have an idea for a post or would like to contribute your own, please contact me at mike@appliedforensicservices.com.
–Mike French