Artful detective Sherlock Holmes once proclaimed, “It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts.”
While Holmes and real detectives of generations past were forced to rely on logic and keen observation to crack a case, modern-day sleuthing is transforming into a real science.
Over the last few decades, innovations in forensic science have raised the standard for courtroom evidence. Eyewitness testimony, personality profiling and other qualitative approaches of the past no longer hold up at trial. DNA evidence in particular has cemented its place as final, often damning proof.
However, the faith placed in forensic science may be a little premature. In the last few years, forensic scientists and lawyers alike have discovered that our seemingly innocuous forensic technology may in fact be guilty of misleading us.
Last year, the murder of millionaire Raveesh Kumra in his northern California home seemed like an easily solved case — DNA found under Kumra’s fingernails matched that of a stranger, Lukis Anderson. The only problem was that Anderson was in the hospital during the night of the murder.
How did DNA from a man with a solid alibi wind up on a dead body many miles away? Prosecutors eventually came to realize that the same group of paramedics responded to Anderson’s and Kumra’s emergencies on the night of the murder, thereby contaminating the Kumra crime scene with Anderson’s DNA from earlier that night.
Anderson was released from jail, but his five-month-long ordeal raises an uneasy question: how many other individuals have been wrongly convicted because of our blind allegiance to forensic evidence?
Suzanne Bell, a forensic chemist at West Virginia University, does not think that forensic science is inherently flawed. She echoes a consensus among the general scientific community that “the fundamental issues…can be solved by fixing the science.”
To address the rising problem of ambiguity in forensic investigation, the U.S. Department of Justice and the National Institute of Standards and Technology assembled the very first U.S. commission on forensic science earlier this year.
The group’s chief aim is to improve the reliability of analytical techniques and reduce variability in test results.
Thankfully, these goals are more attainable than ever. More precise technologies are rapidly emerging as news coverage of wrongful imprisonments shines a spotlight on the weaknesses of forensic science.
Most notably, scientific fields such as nanotechnology and artificial intelligence are lending their innovations to forensic applications.
Nanotechnology is uniquely useful in “improving the sensitivity of already existing forensic techniques,” according to Dr. Bruce McCord of Florida International University.
A 2011 study found that even professional examiners missed correct matches between a fingerprint sample and a database 7.5 percent of the time, and declared incorrect matches between pairs 0.1 percent of the time. These error rates are shockingly high considering the thousands of fingerprint samples a forensics lab encounters in a given year.
Nanotechnology improves fingerprinting accuracy by replacing the typical materials, such as carbon and aluminum flakes, with tiny nanoparticles. This substitution magnifies the sensitivity of development, coaxing out old or faint prints that are traditionally impossible to capture.
Even better, fingerprints now tell a detective more than just your identity. The nanoparticles used for development can also include antibodies that change color when they bind to products of cocaine or tobacco consumption in fingerprint sweat. Besides a clear drug testing application, this information reveals a lot about a crime suspect’s lifestyle.
Even as DNA sequencing and fingerprint analysis become more accurate, they cannot identify individuals whose DNA is not already registered in a database. That is where molecular photofitting comes in, showing that criminals can still run, but they may no longer be able to hide.
Molecular photofitting is a new technique that translates a person’s sequenced genome into an estimated image of their face. While not yet perfected, it does have a promising future as an objective aid to police sketch artists.
Outside of the laboratory, artificial intelligence programs are learning to assist lawyers and judges in figuring out who is lying under oath. European computational linguists Massimo Poesio and Tommaso Fornaciari used a linguistic technique called stylometry to catalog the occurrence of certain words and phrases in a speech sample. Their program learned to pick out guilt-indicating terms through statistical learning, after being trained on hours of statements given by defendants who were later proven to be lying. Currently, the system is about 75 percent accurate at identifying a deceptive defendant: much better than chance and, with some more refinement, worth a spot in the courtroom.
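For readers curious how such a word-counting approach works in practice, here is a minimal sketch of a stylometry-style classifier built under simple assumptions; the toy statements, labels and library choices below are illustrative inventions, not Poesio and Fornaciari's actual corpus, features or code.

```python
# A minimal sketch of stylometry-based deception scoring (illustrative only).
# Assumes a small hand-labeled set of statements; a real system would need
# thousands of transcribed court statements with verified outcomes.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data: label 1 marks statements later shown to be deceptive,
# label 0 marks truthful ones. These examples are invented for illustration.
statements = [
    "I was at home all evening and never left the house",
    "To be honest, I absolutely never saw that man before",
    "I drove straight to work and arrived around nine",
    "I swear on my life I had nothing to do with it",
]
labels = [0, 1, 0, 1]

# Count single words and two-word phrases (a simple stylometric profile),
# then fit a classifier that learns which terms correlate with deception.
model = make_pipeline(
    CountVectorizer(ngram_range=(1, 2), lowercase=True),
    LogisticRegression(max_iter=1000),
)
model.fit(statements, labels)

# Score a new statement: the output is an estimated probability of deception.
new_statement = ["To be honest, I was at home and never left"]
print(model.predict_proba(new_statement)[0][1])
```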
Forensic science is working harder than ever before to live up to its name, and thus we are moving past unjust, unreliable evidence in pursuit of the whole truth. Although empowered by the new tools at hand, crime solving will always take insight, careful observation and creative thinking. It is elementary, my dear readers — according to the latest clues, mysteries are just science in disguise.