The Daubert v. Merrell Dow Pharmaceuticals decision ('Daubert') of the United States Supreme Court, and two related decisions (Joiner and Kumho), established new standards for weighing scientific evidence and methodology in the courtroom, and a significant new 'gatekeeping' role for judges in applying those standards. How well the courts apply these standards necessarily depends on the legal system's understanding of and approach to science. Ironically, in attempting to raise the quality of science in the courtroom, the Court enlarged the role of non-scientists (i.e., judges) in distinguishing between 'good' and 'bad' science. As suggested in the Overview section, this melding of two different disciplines can lead to confusion over terminology and concepts.
Daubert, in superseding the Frye standard of 'general acceptance' by the scientific community, set out additional guidelines for courts to use in assessing scientific reliability: whether the technique or theory can be tested; whether its error rate is known; whether the methodology has been peer reviewed; and (retained from Frye) whether it is generally accepted.
Daubert focused on methodology. Joiner extended the inquiry to the expert's conclusions, requiring that the methodology be appropriate to the conclusions drawn from it. The Court was concerned that many sound methodological approaches were being applied beyond what they were designed for.
Kumho extended these guidelines to non-scientific disciplines of expertise, but in so doing emphasized the intellectual rigor of one's approach, accepted the notion that reliability does not reside only in science, and held that the four Daubert guidelines must be applied flexibly.
This section examines how courts understand and apply these scientific standards; compares this to how the scientific community typically understands these concepts and standards; and attempts to untangle a certain melding of scientific concepts and principles into the legal perspective. Suggestions and approaches are offered to clarify these matters and to present them in the courtroom in a way that is consistent with scientific thinking and helps to educate the court.
Study of the results of Daubert hearings: how many experts survive, or don't, and why. 'What You Don't Know About Daubert Can Hurt You,' 24 Vt. B.J. & L. Dig. 51, 53 (1998).
Scientific error rate: what does the Daubert requirement actually mean? Explaining error rate and degree of uncertainty, and how and when they can be quantified, or not. [See also Michaels D & Monforton C, Am J Pub Health 95:S39, 2005.]
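As an illustration of when uncertainty can be quantified: in epidemiology, the uncertainty around a relative risk estimate is routinely expressed as a confidence interval rather than a single 'error rate.' The sketch below is illustrative only (the function name and the example counts are hypothetical); it computes a 95% confidence interval for a relative risk from a 2x2 table using the standard log-transform (Katz) method.

```python
import math

def relative_risk_ci(a, b, c, d, z=1.96):
    """95% confidence interval for a relative risk from a 2x2 table.

    a: exposed cases,   b: exposed non-cases,
    c: unexposed cases, d: unexposed non-cases.
    """
    # Point estimate: risk among exposed divided by risk among unexposed
    rr = (a / (a + b)) / (c / (c + d))
    # Standard error of log(RR), Katz log-transform method
    se = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical study: 30/100 exposed cases vs. 10/100 unexposed cases
rr, lo, hi = relative_risk_ci(30, 70, 10, 90)
print(f"RR = {rr:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

A wide interval signals substantial uncertainty even when the point estimate looks large; the distinction between a point estimate and the uncertainty around it is often what 'error rate' disputes in Daubert hearings are actually about.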
Ten years of Daubert: effects on the admissibility of scientific evidence in the courtroom. Results of a survey of changes in what judges allow or don't, and why.
Scientific standards of (toxic) causation in the courtroom: effects of a decade of Daubert. Do the courts' considerations of these factors make scientific sense? What case analysis shows.
Reference Manual on Scientific Evidence, Federal Judicial Center, 2nd edition.
Weisgram v. Marley Co., 528 U.S. 440; 120 S. Ct. 1011, 1021; 145 L. Ed. 2d 958 (2000)
General Electric Co. v. Joiner, 522 U.S. 136; 118 S. Ct. 512; 139 L. Ed. 2d 508 (1997) (expert's methodology versus conclusions)
After Daubert: Havner v. Merrell Dow. Statistical levels of relative risk that meet a 'more likely than not' legal burden of general causation--scientific strengths and weaknesses of this standard.
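The quantitative reasoning behind a relative-risk threshold of 2.0 can be made explicit (this is a standard epidemiological derivation offered for clarity, not language from the opinion):

```latex
% Probability that exposure caused the disease in an exposed case
% (attributable fraction among the exposed):
PC \approx \frac{RR - 1}{RR}
% This exceeds 1/2 ('more likely than not') exactly when:
\frac{RR - 1}{RR} > \frac{1}{2} \iff RR > 2
```

Under this simplified model, RR = 3 implies roughly a two-thirds probability of causation; the model assumes, among other things, that the relative risk estimate is unbiased and actually applies to the individual plaintiff, which is where much of the scientific criticism of the threshold lies.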
Causation in epidemiology, and how it applies to the Daubert standard. Rothman KJ & Greenland S, 'Causation and Causal Inference in Epidemiology,' Am J Pub Health 95:S144 (2005).
More probable than not?
Legal vs. scientific standards of proof. Does the 'more probable than not' legal standard have a scientific counterpart appropriate for the courtroom? Or is the only scientific standard the ninety-five percent confidence level? Recent rulings rejecting an 'intellectual rigor' test under which courtroom testimony must meet the same standard as work in the laboratory.
A look at the data requirements of the International Agency for Research on Cancer (IARC), a standard-setting agency of the World Health Organization (WHO) relied on by governments around the world, and of the U.S. National Toxicology Program (NTP), in rating carcinogens as 'known human carcinogens,' 'probable human carcinogens,' or 'reasonably anticipated to be carcinogens'--as a reflection of scientific consensus on acceptable causation standards, well below the 95 percent standard. Role of animal data in proof of causation in humans.
Rule 26 report
Data and conclusion requirements; what is not actually required, and the importance of knowing the difference.
Novel injury-exposure links that cannot rely on existing studies in the literature but require a new study conducted for the litigation.
The level of verification required to validate a study scientifically for a Rule 26 report when it has not been submitted for publication.
Testimony restricted to a Rule 26 submission: what exactly does that mean, and to what extent can tables of data, findings, and other visual aids presented to a jury go beyond the content of the Rule 26 report? Ways to expand on Rule 26 content while staying within these restrictions.
Daubert and recent related decisions
Key Papers and Analysis
What they said, further clarification, and remaining questions.
Daubert expert qualification standard(s) in Kumho?
Applied differently to scientists versus engineers? Critical distinctions in the acceptance of expertise, and how this information can be used in presenting an expert.
Kumho Tire Co. v. Carmichael, 526 U.S. 137, 152; 119 S. Ct. 1167; 143 L. Ed. 2d 238 (1999) ('intellectual rigor' test: is the expert's opinion based on the same professional standard used in the expert's other professional activities?).
Judging the Judges
How well and how consistently do judges carry out their 'gatekeeping' role? How well do they understand science? And is this court activity assessed in any way?