Justice Department officials have known for years that flawed forensic work might have led to the convictions of potentially innocent people, but prosecutors failed to notify defendants or their attorneys even in many cases they knew were troubled.

At the federal level, the decision whether to notify defendants in old cases of possible forensic errors resides with prosecutors. Perhaps predictably, among US Attorneys, "The Post found that while many prosecutors made swift and full disclosures, many others did so incompletely, years late or not at all." We see the same types of disparities in Texas state courts. In some instances, as in El Paso, the DA promptly notifies the defense bar when problems are discovered. In others - the Harris County BAT van fiasco comes to mind - prosecutors can be intensely reluctant to open such a can of worms.
Officials started reviewing the cases in the 1990s after reports that sloppy work by examiners at the FBI lab was producing unreliable forensic evidence in court trials. Instead of releasing those findings, they made them available only to the prosecutors in the affected cases, according to documents and interviews with dozens of officials.
In addition, the Justice Department reviewed only a limited number of cases and focused on the work of one scientist at the FBI lab, despite warnings that problems were far more widespread and could affect potentially thousands of cases in federal, state and local courts.
As a result, hundreds of defendants nationwide remain in prison or on parole for crimes that might merit exoneration, a retrial or a retesting of evidence using DNA because FBI hair and fiber experts may have misidentified them as suspects.
In one Texas case, Benjamin Herbert Boyle was executed in 1997, more than a year after the Justice Department began its review. Boyle would not have been eligible for the death penalty without the FBI’s flawed work, according to a prosecutor’s memo.
The Post offers detailed descriptions demonstrating the long, slow trajectory of truth when it comes to correcting inaccurate forensics, starting with the first World Trade Center bombing in 1993. According to a related Post article ("DOJ review of flawed forensic processes lacked transparency," April 17):
The bombshell came at the most inopportune time.

The main flawed forensic method prompting the Post series involved microscopic hair analysis, a forensic method this blog has criticized in the past as little more than junk science. The stories also detail how those flaws have been known, and ignored, in law-enforcement circles for quite some time:
An FBI special agent was testifying in the government’s high-profile terrorism trial against Omar Abdel Rahman, the “blind sheik” suspected of plotting the first attack on the World Trade Center.
Frederic Whitehurst, a chemist and lawyer who worked in the FBI’s crime lab, testified that he was told by his superiors to ignore findings that did not support the prosecution’s theory of the bombing.
“There was a great deal of pressure put upon me to bias my interpretation,” Whitehurst said in U.S. District Court in New York in 1995.
Even before the Internet, Whitehurst’s extraordinary claim went viral. It turned out he had written or passed along scores of memos over the years warning of a lack of impartiality and scientific standards at the famed lab that did the forensic work after the World Trade Center attack and in other cases.
With the FBI under fire for its handling of the 1993 trade center attack, the Oklahoma City bombing and the O.J. Simpson murder case, officials had to act.
After the Justice Department’s inspector general began a review of Whitehurst’s claims, Attorney General Janet Reno and FBI Director Louis J. Freeh decided to launch a task force to dig through thousands of cases involving discredited agents, to ensure that “no defendant’s right to a fair trial was jeopardized,” as one FBI official promised at a congressional hearing.
The task force took nine years to complete its work and never publicly released its findings. Not the results of its case reviews of suspect lab work. Not the names of the defendants who were convicted as a result. And not the nature or scope of the forensic problems it found.
Those decisions more than a decade ago remain relevant today for hundreds of people still in the U.S. court system, because officials never notified many defendants of the forensic flaws in their cases and never expanded their review to catch similar mistakes.
In 1974, researchers acknowledged that visual comparisons are so subjective that different analysts can reach different conclusions about the same hair. The FBI acknowledged in 1984 that such analysis cannot positively determine that a hair found at a crime scene belongs to one particular person.

So as it turned out, when problematic forensic testimony was found to have been used to secure convictions, most federal prosecutors sat on the news rather than turn over the new evidence to defense counsel in affected cases. A handful of actual-innocence cases have already arisen from the cases that were identified.
In 1996, the Justice Department studied the nation’s first 28 DNA exonerations and found that 20 percent of the cases involved hair comparison. That same year, the FBI lab stopped declaring matches based on visual comparisons alone and began requiring DNA testing as well.
Yet examples of FBI experts violating scientific standards and making exaggerated or erroneous claims emerged in 1997 at the heart of the FBI lab’s worst modern scandal, when [Inspector General Michael] Bromwich’s investigation found systematic problems involving 13 agents. The lab’s lack of written protocols and examiners’ weak scientific qualifications allowed bias to influence some of the nation’s highest-profile criminal investigations, the inspector general said.
From 1996 through 2004, a Justice Department task force set out to review about 6,000 cases handled by the 13 discredited agents for any potential exculpatory information that should be disclosed to defendants. The task force identified more than 250 convictions in which the agents’ work was determined to be either critical to the conviction or so problematic — for example, because a prosecutor refused to cooperate or records had been lost — that it completed a fresh scientific assessment of the agent’s work. The task force was directed to notify prosecutors of the results.
Here in Texas, the Post reported, flawed evidence in one case came to light only after the defendant was executed. Wrote Hsu:
In Texas, the review of Benjamin Herbert Boyle’s case got underway only after the defendant was executed, 16 months after the task force was formed, despite pledges to prioritize death penalty cases.

There was other inculpatory evidence in Boyle's case - most prominently a fingerprint on the duct tape used to bind the victim - but the example further demonstrates, if more evidence were needed, the common use of uncertain forensic methods even (perhaps especially) in high-stakes cases.
Boyle was executed six days after the Bromwich investigation publicly criticized [Michael] Malone, the FBI agent who worked on his case, but the FBI had acknowledged two months earlier that it was investigating complaints about him.
The task force asked the Justice Department’s capital-case review unit to look over its work, but the fact that it failed to prevent the execution was never publicized.
This question of how authorities should respond when flawed forensics are discovered will continue to come up, and not just at the federal level. Here in Texas, when flawed arson science was uncovered in the Todd Willingham and Ernest Willis cases, the Forensic Science Commission partnered with my employers at the Innocence Project of Texas and the state fire marshal to reexamine old arson cases for possible false convictions. And when the El Paso crime lab was found to be employing an incompetent analyst, DA Jaime Esparza was praised for notifying local defense counsel. OTOH, when Texas appellate courts decided dog-sniff lineups weren't good enough evidence to support a conviction, nobody in officialdom ever tried to identify the 2,000 cases in which Deputy Keith Pikett claimed to have performed the technique, much less the instances where his testimony may have been central to securing a conviction. And the Texas Court of Criminal Appeals has generally refused to grant habeas writs based on debunked scientific testimony, even when that testimony was the linchpin of the state's case. So as a practical matter, Texas' response to flawed forensics, while superior in some cases to the feds', has overall been rather hit or miss.
Here are links to the stories in the Post's forensics package:
- Convicted defendants left uninformed of forensics flaws by Justice Dept.
- DOJ review of flawed forensics processes lacked transparency
- Forensic techniques are subject to human bias, lack standards, panel found
- D.C. man served 28 years. Then the evidence that sent him to prison fell apart.
- How accurate is forensic analysis?