
Forensic Science

Published online by Cambridge University Press: 01 January 2021

Extract

The United States Supreme Court has long recognized the value of scientific evidence - especially when compared to other types of evidence such as eyewitness identifications, confessions, and informant testimony. For example, in Escobedo v. Illinois, the Court observed: “We have learned the lesson of history, ancient and modern, that a system of criminal law enforcement which comes to depend on the ‘confession’ will, in the long run, be less reliable and more subject to abuses than a system which depends on extrinsic evidence independently secured through skillful investigation.” Similarly, in Davis v. Mississippi, the Court commented:

Detention for fingerprinting may constitute a much less serious intrusion upon personal security than other types of police searches and detentions. Fingerprinting involves none of the probing into an individual's private life and thoughts that marks an interrogation or search. Nor can fingerprint detention be employed repeatedly to harass any individual, since the police need only one set of each person's prints. Furthermore, fingerprinting is an inherently more reliable and effective crime-solving tool than eyewitness identifications or confessions and is not subject to such abuses as the improper line-up and the “third degree.”

Type: Article
Copyright: © American Society of Law, Medicine and Ethics 2005

References

378 U.S. 478 (1964).
Id. at 488–89.
394 U.S. 721 (1969).
Id. at 727.
See Thompson, W. C., “Evaluating the Admissibility of New Genetic Identification Tests: Lessons from the ‘DNA War,’” Journal of Criminal Law & Criminology 84 (1993): 22–104.
545 N.Y.S.2d 985 (Sup. Ct. 1989).
Id. at 996.
Lander, E. S. and Budowle, B., “DNA Fingerprinting Dispute Laid to Rest,” Nature 371 (1994): 735–738, at 735.
42 U.S.C. § 14131(1)(a), (c) (2000). The DNA Advisory Board has expired. Currently, the FBI DNA Quality Assurance Standards govern DNA laboratories that receive federal funding. These standards require periodic external audits to ensure compliance with the required quality assurance standards.
National Research Council, DNA Technology in Forensic Science (Washington: National Academy Press, 1992): at 55.
See Saks, M. J. and Koehler, J. J., “What DNA ‘Fingerprinting’ Can Teach the Law about the Rest of Forensic Science,” Cardozo Law Review 13 (1991): 361–72.
509 U.S. 579 (1993).
522 U.S. 136 (1997).
526 U.S. 137 (1999).
Daubert, 509 U.S. at 590 (“[I]n order to qualify as ‘scientific knowledge,’ an inference or assertion must be derived by the scientific method. Proposed testimony must be supported by appropriate validation – i.e., ‘good grounds,’ based on what is known. In short, the requirement that an expert's testimony pertain to ‘scientific knowledge’ establishes a standard of evidentiary reliability”).
Id. at 592–93.
528 U.S. 440 (2000).
Id. at 455.
United States v. Hines, 55 F. Supp. 2d 62, 67 (D. Mass. 1999).
See, e.g., United States v. Prime, 363 F.3d 1028, 1033 (9th Cir. 2004) (admitting evidence); United States v. Crisp, 324 F.3d 261, 263 (4th Cir. 2003); United States v. Jolivet, 224 F.3d 902, 905–06 (8th Cir. 2000); United States v. Lewis, 220 F. Supp. 2d 548, 554 (S.D. W. Va. 2002); United States v. Saelee, 162 F. Supp. 2d 1097, 1103 (D. Alaska 2001) (“There is little known about the error rates of forensic document examiners. The little testing that has been done raises serious questions about the reliability of methods currently in use. As to some tasks, there is a high rate of error and forensic document examiners may not be any better at analyzing handwriting than laypersons.”).
See Williamson v. Reynolds, 904 F. Supp. 1529, 1558 (E.D. Okla. 1995) (“This court has been unsuccessful in its attempts to locate any indication that expert hair comparison testimony meets any of the requirements of Daubert”), rev’d on this issue, Williamson v. Ward, 110 F.3d 1508, 1522–23 (10th Cir. 1997) (due process, not Daubert, standard applies in habeas proceedings).
See United States v. Mitchell, 365 F.3d 215, 246 (3d Cir. 2004) (admitting evidence); United States v. Crisp, 324 F.3d 261, 263 (4th Cir. 2003); United States v. Sullivan, 246 F. Supp. 2d 700, 703–04 (E.D. Ky. 2003); United States v. Llera Plaza, 188 F. Supp. 2d 549, 571 (E.D. Pa. 2002) (excluding and then admitting fingerprint evidence).
See United States v. Foster, 300 F. Supp. 2d 375, 376 (D. Md. 2004) (“In this case, the testimony of Supervisory Special Agent Paul Tangren, a Firearms Tool Marks Examiner with the Federal Bureau of Investigation, established to the court's satisfaction the general reliability of the science of ballistics, including comparisons of spent cartridge casings even where there is no ‘known’ weapon recovered”); United States v. Santiago, 199 F. Supp. 2d 101, 110–12 (S.D.N.Y. 2002) (holding “ballistics” evidence satisfies Daubert standard).
See Pretty, I. A. and Sweet, D., “The Scientific Basis for Human Bitemark Analyses – A Critical Review,” Science & Justice 41 (2001): 85–92, at 86 (“Despite the continued acceptance of bitemark evidence in European, Oceanic and North American Courts, the fundamental scientific basis for bitemark analysis has never been established”).
See United States v. Horn, 185 F. Supp. 2d 530, 549 (D. Md. 2002).
In Frye v. United States, 293 F. 1013 (D.C. Cir. 1923), the D.C. Circuit considered the admissibility of polygraph evidence as a case of first impression. In rejecting the evidence, the court wrote: “Just when a scientific principle or discovery crosses the line between the experimental and demonstrable stages is difficult to define. Somewhere in this twilight zone the evidential force of the principle must be recognized, and while the courts will go a long way in admitting expert testimony deduced from a well recognized scientific principle or discovery, the thing from which the deduction is made must be sufficiently established to have gained general acceptance in the particular field in which it belongs.” Id. at 1014. See generally Giannelli, P. C., “The Admissibility of Novel Scientific Evidence: Frye v. United States, A Half-Century Later,” Columbia Law Review 80 (1980): 1197–1250.
See, e.g., People v. Shreck, 22 P.3d 68, 70 (Colo. 2001); Schafersman v. Agland Coop., 631 N.W.2d 862, 867 (Neb. 2001). See Giannelli, P. C. and Imwinkelried, E. J., Scientific Evidence (Charlottesville, VA: Lexis Law Publishing, 3d ed., 1999): § 1–13.
See, e.g., People v. Leahy, 882 P.2d 321, 323 (Cal. 1994); People v. Miller, 670 N.E.2d 721, 731 (Ill. 1996); Burral v. State, 724 A.2d 65, 80 (Md. 1999); Goeb v. Tharaldson, 615 N.W.2d 800, 814 (Minn. 2000). See Giannelli and Imwinkelried, supra note 27, § 1–15.
See Ramirez v. State, 810 So. 2d 836, 843 (Fla. 2001); State v. Copeland, 922 P.2d 1304, 1314 (Wash. 1996) (en banc).
See Ramirez v. State, 810 So. 2d 836, 844 (Fla. 2001).
Scheck, B. et al., Actual Innocence: Five Days to Execution and Other Dispatches from the Wrongly Convicted (New York: Doubleday, 2000): at 246.
In re Investigation of the W. Va. State Police Crime Lab., Serology Div., 438 S.E.2d 501, 503 (W. Va. 1993) (quoting report).
According to Zain's replacement, “several prosecutors expressed dissatisfaction with the reports they were receiving from serology and specifically requested that the evidence be analyzed by Zain.” Id. at 513 n.16 (deposition of Ted Smith). “[Serologist] Myers also testified that after he had been unable to find blood on a murder suspect's jacket, it was sent to Texas, where Zain found a bloodstain which tested consistent with the blood of the victim.” Id. at 512. “[Serologist] Bowles also testified that at least twice after Zain left the lab, evidence on which Bowles had been unable to obtain genetic markers was subsequently sent to Texas for testing by Zain, who again was able to identify genetic markers.” Id.
See Giannelli, P. C., “The Abuse of Scientific Evidence in Criminal Cases: The Need for Independent Crime Laboratories,” Virginia Journal of Social Policy & Law 4 (1997): 439–78.
Ramirez v. State, 810 So. 2d 836, 853 (Fla. 2001) (citations omitted).
Office of Inspector General, U.S. Department of Justice, The FBI Laboratory: An Investigation into Laboratory Practices and Alleged Misconduct in Explosives-Related and Other Cases (April, 1997).
42 U.S.C. § 263a (2000).
See Lander, E. S., “DNA Fingerprinting On Trial,” Nature 339 (1989): 501–05, at 505 (“At present, forensic science is virtually unregulated – with the paradoxical result that clinical laboratories must meet higher standards to be allowed to diagnose strep throat than forensic labs must meet to put a defendant on death row.”); Jonakait, R. N., “Forensic Science: The Need for Regulation,” Harvard Journal of Law & Technology 4 (1991): 109–91, at 191 (“Current regulation of clinical labs indicates that a regulatory system can improve crime laboratories”).
See N.Y. Exec. Law § 995b (McKinney 2003) (requiring accreditation by state Forensic Science Commission); Okla. Stat. 74 § 150.37 (2004) (requiring accreditation by the American Society of Crime Laboratory Directors/Laboratory Accreditation Board or the American Board of Forensic Toxicology); Tex. Crim. Proc. Code § 38.35 (2004) (requiring accreditation by the Department of Public Safety).
See Smith v. State, 702 N.E.2d 668, 673 (Ind. 1998) (DNA) (“[T]he lab was accredited by the American Society of Crime Lab Directors in 1990. Furthermore, the lab runs its tests under controlled conditions, follows specific protocols, and conducts quality testing on the kits and the analysts. Any concerns in this respect go to the weight of the evidence, not its admissibility”).
Jones, G. R., “President's Editorial – The Changing Practice of Forensic Science,” Journal of Forensic Science 47 (2002): 437–38, at 438.
Peterson, J. L. et al., Crime Laboratory Proficiency Testing Research Program (1978).
See Peterson, J. L. and Markham, P. N., “Crime Laboratory Proficiency Testing Results, 1978–1991, Part I: Identification and Classification of Physical Evidence,” Journal of Forensic Science 40 (1995): 994–1008; Peterson, J. L. and Markham, P. N., “Crime Laboratory Proficiency Testing Results, 1978–1991, Part II: Resolving Questions of Common Origin,” Journal of Forensic Science 40 (1995): 1009–29.
National Research Council, The Evaluation of Forensic DNA Evidence (Washington: National Academy Press, 1996): at 88 (“Laboratories should participate regularly in proficiency tests, and the results should be available for court proceedings”).
42 U.S.C. § 14132(b)(2) (2000) (external proficiency testing for CODIS participation); id. § 14133(a)(1)(A) (FBI). A recent study of blind DNA proficiency testing raised some questions about the cost and feasibility of this type of testing, as well as its effectiveness when compared to other methods of quality assurance such as more stringent external case audits. Peterson, J. L. et al., “The Feasibility of External Blind DNA Proficiency Testing: Part I: Background and Findings,” Journal of Forensic Science 48 (2003): 21–31; Part II: “Experience with Actual Blind Tests,” id. at 32–40.
United States v. Llera Plaza, 188 F. Supp. 2d 549, 565 (E.D. Pa. 2002). See also United States v. Crisp, 324 F.3d 261, 274 (4th Cir. 2003) (Michael, J., dissenting) (“Proficiency testing is typically based on a study of prints that are far superior to those usually retrieved from a crime scene”).
United States v. Lewis, 220 F. Supp. 2d 548, 554 (S.D. W. Va. 2002). See also United States v. Crisp, 324 F.3d 261, 279 (4th Cir. 2003) (Michael, J., dissenting) (“Moreover, although the government's expert here testified to his success on proficiency tests, the government provides no reason for us to believe that these tests are realistic assessments of an examiner's ability to perform the tasks required in his field. See J. A. 342 [testimony of the government's handwriting expert that he has always achieved a perfect score on proficiency tests]”).
Daubert, 509 U.S. at 594.
National Research Council, supra note 10 (“Each DNA typing procedure must be completely described in a detailed, written laboratory protocol”).
DNA Advisory Board Standards 9, 10 & 12 (1998).
See Stauffer, E. and Lentini, J. J., “ASTM Standards for Fire Debris Analysis: A Review,” Forensic Science International 132 (2003): 63–67.
President's Commission on Law Enforcement and Administration of Justice, The Challenge of Crime in a Free Society (1967): at 255.
National Advisory Commission on Criminal Justice Standards and Goals, Report on Police (1974): at 304.
Guillen, T. and Nalder, E., “Overwhelming Evidence: Crime Labs in Crisis,” Seattle Times, June 19, 1994, at A1, A14.
Beaupre, B., “Crime Labs Staggering Under Burden of Proof,” USA Today, August 20, 1996, at 1.
42 U.S.C. §§ 3797j–3797o (2000).
Daubert, 509 U.S. at 590.
DAB Standard 2(ff) (“Validation is a process by which a procedure is evaluated to determine its efficacy and reliability for forensic casework analysis and includes: (1) Developmental validation is the acquisition of test data and determination of conditions and limitations of a new or novel DNA methodology for use on forensic samples; (2) Internal validation is an accumulation of test data within the laboratory to demonstrate that established methods and procedures perform as expected in the laboratory”). SWGDAM promulgated revised validation guidelines in 2003. FBI, Forensic Science Communications 6 (July, 2004).
National Research Council, On the Theory and Practice of Voice Identification (Washington: National Academy Press, 1979).
National Research Council, supra note 44; National Research Council, supra note 10.
See National Research Council, The Polygraph and Lie Detection (Washington: National Academy Press, 2002).
National Research Council, Forensic Analysis: Weighing Bullet Lead Evidence (Washington: National Academy Press, 2004).
Commentary, ABA Standards for Criminal Justice, Providing Defense Services 5–1.4 (3d ed. 1992): at 22.
Reilly v. Berry, 166 N.E. 165, 167 (N.Y. 1929).
United States v. Johnson, 238 F.2d 565, 572 (2d Cir. 1956) (dissent), vacated, 352 U.S. 565 (1957).
ABA Standards for Criminal Justice, supra note 65, Standard 5–1.4.
ABA Standards for Criminal Justice, Prosecution Function and Defense Function 3–2.4(b) (3d ed. 1993).
18 U.S.C. § 3006A(e) (2000).
Weinstein, J. B., “Science, and the Challenge of Expert Testimony in the Courtroom,” Oregon Law Review 77 (1998): 1005–17, at 1008.
See 725 Ill. Comp. Stat. 5/113-3(d). See also Minn. Stat. § 611.21(b) ($1,000 maximum).
470 U.S. 68 (1985).
Id. at 76.
See, e.g., Ex parte Grayson, 479 So. 2d 76, 82 (Ala. 1985) (“[T]here is nothing contained in the Ake decision to suggest that the United States Supreme Court was addressing anything other than psychiatrists and the insanity defense”); Isom v. State, 488 So. 2d 12, 13 (Ala. Crim. App. 1986) (“Ake does not reach noncapital cases”).
See Giannelli, P. C., “Ake v. Oklahoma: The Right to Expert Assistance in a Post-Daubert, Post-DNA World,” Cornell Law Review 89 (2004): 1305–1419.
See Moore v. Kemp, 809 F.2d 702, 712 (11th Cir. 1987) (en banc) (“[A] defendant must show the trial court that there exists a reasonable probability both [1] that an expert would be of assistance to the defense and [2] that denial of expert assistance would result in a fundamentally unfair trial.”) (emphasis added).
State v. Moore, 364 S.E.2d 648, 657 (N.C. 1988).
See Weinstein, supra note 72, at 1008 (“Courts, as gatekeepers, must be aware of how difficult it can be for some parties – particularly indigent criminal defendants – to obtain an expert to testify. The fact that one side may lack adequate resources with which to fully develop its case is a constant problem”).
Coyle, M. et al., “Trial and Error in the Nation's Death Belt: Fatal Defense,” National Law Journal, June 11, 1990, at 30.
Id. at 40.
Id. at 38.
Hanson, R., Indigent Defenders: Get the Job Done and Done Well (1992): at 100 (study by the National Center for State Courts).
See “A Study of Representation in Capital Cases in Texas,” Texas Bar Journal 56 (1993): 333, at 408.
See Fed. R. Crim. P. 16 (1975) advisory committee's note (“[I]t is difficult to test expert testimony at trial without advance notice and preparation”), reprinted at 62 F.R.D. 312 (1974); Giannelli, P. C., “Criminal Discovery, Scientific Evidence, and DNA,” Vanderbilt Law Review 44 (1991): 791–825.
National Research Council, supra note 10, at 146. The 1996 DNA report contains the following statement on discovery: “Certainly, there are no strictly scientific justifications for withholding information in the discovery process, and in Chapter 3 we discussed the importance of full, written documentation of all aspects of DNA laboratory operations. Such documentation would facilitate technical review of laboratory work, both within the laboratory and by outside experts. … Our recommendations that all aspects of DNA testing be fully documented is most valuable when this documentation is discoverable in advance of trial.” National Research Council, supra note 44, at 167–68.
State v. Tankersley, 956 P.2d 486, 495 (Ariz. 1998).
Fed. R. Civ. P. 26(a)(2).
Fed. R. Civ. P. 26(a)(2)(B).
Fed. R. Civ. P. 26(b)(4)(A).
Wright, C. A. and Miller, A. R., Federal Practice and Procedure 2 (St. Paul, MN: West Publishing Co., 2d ed., 1982): § 252, at 36–37.
Commentary, ABA Standards for Criminal Justice Relating to Discovery and Procedure Before Trial 67 (Approved Draft 1970).
LaFave, W. R. et al., Criminal Procedure (St. Paul, MN: West Publishing Co., 2d ed., 1999): § 20.3(f), at 861 (“Once the report is prepared, the scientific expert's position is not readily influenced, and therefore disclosure presents little danger of prompting perjury or intimidation”).
See Williams v. Florida, 399 U.S. 78 (1970).
Wardius v. Oregon, 412 U.S. 470, 474 (1973). See also Ross v. Moffitt, 417 U.S. 600, 609 (1974) (“‘Due process’ emphasizes fairness between the State and the individual dealing with the State …”); Lassiter v. Department of Social Servs., 452 U.S. 18, 28 (1981) (The “adversary system presupposes accurate and just results are most likely to be obtained through the equal contest of opposed interests …”).
ABA Standards for Criminal Justice, Discovery Standard 11–2.1(a)(iv) (3d ed. 1996).
See Unif. R. Crim. P. 421(a) (Approved Draft 1974) (“expert reports”).
Peterson, J. L., “Symposium: Ethical Conflicts in the Forensic Sciences,” Journal of Forensic Science 34 (1989): 717–93.
Lucas, D. M., “The Ethical Responsibilities of the Forensic Scientist: Exploring the Limits,” Journal of Forensic Science 34 (1989): 719–29, at 724. Lucas was the Director of The Centre of Forensic Sciences, Ministry of the Solicitor General, Toronto, Ontario.
Harrison, A. (Professor, Mount Holyoke College), “Symposium on Science and the Rules of Legal Procedure,” 101 F.R.D. 599, 632 (1984).
667 F. Supp. 1456 (S.D. Fla. 1986), aff’d, 828 F.2d 670 (11th Cir. 1987).
Id. at 1458.
Id. at 1459 (“Next, as Mr. Riley candidly admitted in his deposition, he was ‘pushed’ further in his analysis at Troedel's trial than at Hawkins’ trial. Furthermore, at the March 26th evidentiary hearing held before this Court, one of the prosecutors testified that, at Troedel's trial, after Mr. Riley had rendered his opinion which was contained in his written report, the prosecutor pushed to ‘see if more could have been gotten out of this witness.’ When questioned why, in the Hawkins trial, he did not use Mr. Riley's opinion that Troedel had fired the weapon, the prosecutor responded he did not know why”).
Id. at 1459–60.
Commentary, ABA Standards for Criminal Justice, Prosecution Function and Defense Function (3d ed. 1993): at 59.
A comparable standard applies to defense counsel. ABA Standard 4–4.4(a).
ABA Standard 11–2.1(a)(iv), supra note 97.
Fed. R. Crim. P. 16, advisory committee's note, reprinted at 147 F.R.D. at 387.
Williams v. State, 312 S.E.2d 40 (Ga. 1983).
Deadman, H. A., “Fiber Evidence and the Wayne Williams Trial (Part I),” FBI Law Enforcement Bulletin 53 (March 1984): 12–20.
Williams, 312 S.E.2d at 50.
Id. at 100 (Smith, J., dissenting).
Peterson, J. L. et al., “The Use and Effects of Forensic Science in the Adjudication of Felony Cases,” Journal of Forensic Science 32 (1987): 1730–53, at 1748.
Redmayne, M., Expert Evidence and Criminal Justice (Oxford: Oxford University Press, 2001): at 139.