The Economist argues that even the most powerful forensic tools, including DNA, can be tainted by "cognitive bias" when scientists are given too much "contextual information" about a case. It cites a study in which DNA analysts unfamiliar with case details were less likely to find a match than the original examiners who knew them. The magazine grants that:
one example does not prove the existence of a systematic problem. But it does point to a sloppy approach to science. According to Norah Rudin, a forensic-DNA consultant in Mountain View, California, forensic scientists are beginning to accept that cognitive bias exists, but there is still a lot of resistance to the idea, because examiners take the criticism personally and feel they are being accused of doing bad science. According to Dr Rudin, the attitude that cognitive bias can somehow be willed away by education, training or good intentions is still pervasive.
Medical researchers, by contrast, take great care to make drug trials “blind”, so that neither the patient nor the administering doctor knows who is receiving the drug being tested, and who is getting a control drug or placebo. When someone’s freedom—and, in an American context, possibly his life, as well—is at stake, it surely behooves forensic-science laboratories to take precautions that are equally strong.
Blind administration turned out to be a key reform for eyewitness identification, and your correspondent has long believed the same approach is justified in other forensic disciplines. Why does a DNA analyst need to know case details before deciding whether two samples match? Not only is that information irrelevant to the analysis; it may actually reduce its accuracy.