It’s one of those nightmare situations you hope you never find yourself in: sitting on death row for a crime you didn’t commit. In 1977 that nightmare became a reality for Randall Dale Adams, thanks to the testimony of Texas forensic psychiatrist James Grigson.
Despite doubt concerning Adams’ guilt in the murder of a Dallas police officer, Grigson told the jury that Adams would be an ongoing menace if kept alive, and Adams was subsequently sentenced to death.
Publicity surrounding the 1988 documentary “The Thin Blue Line” helped prompt a review of Adams’ case, resulting in the overturning of his conviction and his release from prison in 1989. Six years later, Grigson was found guilty of unethical conduct and was expelled by the American Psychiatric Association and the Texas Society of Psychiatric Physicians.
Grigson was also known by another name: Dr. Death. Throughout his career, he developed a reputation for serving as an expert witness, but only for the side of the prosecution. When all was said and done, he had testified in 167 capital trials, nearly all of which resulted in death sentences.
Such implicit bias, for whatever reason — be it monetary gain or even anger at the defendant and desire for retribution — is “very common,” according to Arizona State University assistant professor Tess Neal.
Herself a licensed psychologist, Neal recently received the Saleem Shah Award for Early Career Excellence in Psychology and Law for her interdisciplinary research blending psychology, ethics and law to understand how people reach decisions in the legal system. Specifically, she has analyzed how the biases of forensic experts inform their testimony and, therefore, the decisions made by judges, lawyers and other members of the courts.
“As a professional psychologist, you’re ethically obligated to not be influenced by your own [biases],” said Neal. “You’re supposed to know that they’re there, and then account for them. We know that that’s really hard to do, and in fact, probably impossible. But the clinicians are trained to think they can do it, and that they have to be objective, so they’re really invested in it.”
Unfortunately, bias isn’t always as obvious as in the case of Grigson — even to the experts themselves.
Because the American legal system is designed to be adversarial — that is, there are two sides to every case, and it is the job of each side to present its best argument based on the evidence — attorneys are ethically obligated to advocate for whichever side they happen to be on.
“This is where science and law clash,” said Neal, citing a recent study involving forensic experts. In the study, participants were asked to evaluate evidence in a hypothetical case. Before doing so, they were informed of which side — prosecution or defense — they were hired by. The result was that their evaluation of the evidence changed based on which side they were told they were working for.
What that shows, Neal explained, is that even trained forensic experts “can get subtly absorbed into that adversarial way of thinking.” And that biases the way they perceive information, interpret data and reach opinions.
“This is a touchy subject, and I don’t want to alienate myself from my field … but they’re human beings. … It’s impossible not to be impacted by the limitations of human cognition,” she said.
However, she also believes it’s possible to reduce the likelihood of bias, which is her ultimate goal:
“I hope with this body of research, and with my career, that I can help clinicians — this group of which I am a member — to understand the limitations of what we bring and not be overconfident in how objective we are, and not be overconfident in the opinions that we provide to the court, and try to stick with the science and not go beyond science.”
Neal made a big step toward that goal with the publication of the paper “Forensic Psychologists' Perceptions of Bias and Potential Correction Strategies in Forensic Mental Health Evaluations” in the journal Psychology, Public Policy, and Law.
In the paper, she and co-author Stanley L. Brodsky break down 25 methods forensic experts use to control their bias into four categories: things people say they do that actually work and that science says probably do; things people say they do that science hasn’t yet tested; things people say they do that science says don’t work and may actually make bias worse; and other strategies not recognized by forensic clinicians.
Identifying and evaluating current methods is just part of the process. “We still have some work to do,” Neal conceded — namely, testing and proposing newer, better methods. One method that she finds promising is known as “blinding procedures,” where the forensic expert hired to evaluate evidence isn’t told which side of the case their findings will be used for. The result is a more independent, neutral opinion, and many forensic labs are adopting this practice.
Also working in Neal’s favor is the fact that there is currently a lot of federal support for reducing the margin of error in such forensic methods as fingerprint and blood-spatter analysis. She hopes to take advantage of that by advocating for the same kind of support for reducing the likelihood of bias in forensic psychology. She’ll soon have the chance to do so on a higher level when she speaks on the topic at the American Psychology-Law Society annual conference March 10-12 in Atlanta.
Until then, Neal is enjoying teaching a course on forensic psychology at ASU. The class recently watched footage from the trial of infamous serial killer Jeffrey Dahmer, which included the testimony of no fewer than seven mental-health experts, all with varying opinions as to Dahmer’s mental state.
“I think it’s an interesting class,” said Neal. “But I’m biased.”