
Fingerprint forensics still has a future

Zulfikar Abbany, August 31, 2016

Fingerprints have been used as a form of identification for millennia. But 125 years ago an Argentine statistician started an experiment that led to the first murder being solved with these unique markers.

https://p.dw.com/p/1Jswp
Constitutional referendum in Egypt under Mursi, 2012. Image: AP

Fingerprints are like personal stories. Everyone's got a personal story. We like to think our story is unique. And most important: it's our version of events that is true. But what is true tends to be a matter of interpretation, and sometimes we humans get it wrong.

This goes for the story of fingerprinting, too.

Over the past few days German media has been awash with stories saying crime investigators used fingerprints to solve a murder for the first time 125 years ago. It's one version of events, all right. But it's not strictly true.

This is closer to the truth: Juan Vucetich was a fingerprint researcher and statistician. He became the head of the Bureau of Anthropometric Identification at the Central Police Department in La Plata, Argentina, and in 1891 created a classification system and method to "individualize" prisoners using fingerprints. These are considered the first uses of fingerprint science by law enforcement.

It was a year later, however, in 1892, that the first murder was solved using fingerprint evidence. And it was Inspector Eduardo Alvarez, who had learned to compare fingerprints from Vucetich, who slapped on the cuffs.

Since then, we have come to rely almost totally on fingerprinting as forensic evidence. And as digital technology becomes the norm, fingerprints also serve as biometric data to unlock our phones, authorize bank transfers and cross-reference the identities of refugees.

Fingerprints are (kind of) unique

There is no way of knowing for sure - you would have to check the fingerprints of every living being to be absolutely certain. But at some point in our world history we decided fingerprints were unique and a perfect means for identifying individuals. A way to tell us apart. There is evidence in China suggesting fingerprints were used for identification as early as 200-300 B.C., and in Japan from 702 A.D.

Fingerprints
With the advent of biometric passports, inky fingerprints have been phased out. Image: picture alliance/ZB

A millennium later, this knowledge slowly dawned on Europe and the Americas.

It's the "friction ridge skin" or "ridge characteristics" on the ends of a person's fingers and thumbs, palms of their hands and soles of their feet that make these prints - apparently - unique.

How can we be so sure that no two "latent prints" are the same? Statistics.

"[Sir William James] Herschel worked out the probability was something like one in every 86 billion, and the world population isn't that. Yet. So it's all done on statistics," says independent fingerprint expert, David Goodwin. "But it's born out ... millions and millions of comparisons are made everyday."

It's only recently, though, that there have been attempts to scientifically validate fingerprinting - to measure how well the technique performs.

"The first study that looks anything like a validation study of latent prints was published in 2011," writes Simon A. Cole, professor of criminology and law at the University of California at Irvine, in an email to DW. "As you note, that is more than 100 years after the technique first started being used. Whether one study constitutes scientific validation is a more difficult question, but most people seem to think probably not."

However, Professor Jennifer Mnookin, dean at the UCLA School of Law, says there have been new insights into this form of evidence that was originally "taken as utterly authoritative without the kind of careful, scientific study that we'd expect today."

"In the [PNAS] study, fingerprint evidence in a 'test' situation [...] does have quite a low false positive error rate, though not zero," says Mnookin in an email. "And the false negative rate - the rate at which an examiner says two prints don't match, when actually they do share a common source - is quite a bit higher. So any examiner who asserts that he or she has never made a mistake is almost certainly wrong."

Beware of humans

Crime scenes can be messy places. So while you may have a unique fingerprint from a crime scene, and global databases - including the European Union's EURODAC - stacked full of unique prints, comparing them to find a suspect can be tough.

"It's not the system, it's humans," says Goodwin, a former head of a United Kingdom Fingerprint Bureau who has also been contracted by the United Nations Office on Drugs and Crime and the International Criminal Court in The Hague.

Fingerprints - iPhone
Apple introduced Touch ID in 2013 - but Germany's Chaos Computer Club said the technology was not secure. Image: picture-alliance/dpa/K. Nietfeld

Humans can be real fools when it comes to fingerprints. "You'll get drug dealers who scar up their fingers deliberately to avoid capture, but they're only doing the tips of their fingers and they forget they've got palms and inter-digital bits with ridge detail on - and of course once they cut up their fingers that becomes a permanent scar and a permanent scar is also a form of identification," says Goodwin.

And then there are the examiners. They make mistakes too.

A print lifted from a crime scene may be smudged with blood, or at best partial. Police may find two matching ridge characteristics on a print in a database, and use interpretation and a form of "cognitive bias" to fill in the blanks for the rest.

This is in part what happened in a well-documented case involving Brandon Mayfield, a lawyer from the US, who was wrongfully identified by the FBI as a suspect in the Madrid terrorist bombings of 2004 based on a partial print from one of the scenes. Spanish authorities later linked the print to an Algerian man called Daoud Ouhnane.

Cognitive bias

The problem, says Dr Itiel Dror, a cognitive neuroscientist at University College London, is that crime scene investigators often work from the "suspect to the evidence, rather than from the evidence to the suspect." In other words, police may unconsciously - or consciously - interpret what evidence they have to fit the suspect. It's called cognitive bias.

"In any [forensic situation] that involves interpretation or subjective judgement, we need to worry about cognitive bias," says Dror. "That's where there is irrelevant contextual information - information that the forensic examiner does not need, for example, if the suspect has confessed, or if eyewitnesses have identified the suspect. That information can affect how they perceive and interpret what they are looking at."

Simon Bunter, a forensic scientist in the UK, even suggests personal ego plays a role. In a 2016 report, Bunter says "job satisfaction in solving high profile" cases can influence an outcome. Time and budgetary constraints can also play a part. It all means interpretations can vary and change.

"It's fascinating how [Dror and his associate David Charlton] manage to get the same experts five or 10 years on to look at a mark in a different context and come up with a different decision based on the severity of the crime," says Goodwin of Dror's research.

Dror has developed a method for limiting cognitive bias called "Linear Sequential Unmasking." He says it is being considered or adopted by many countries around the world, "even Italy," but Germany has been "less than enthusiastic." Dror says Germany seems to be in a "state of denial" about cognitive bias in forensics and warns "it would be a shame" if authorities here waited for a scandal like the Mayfield case to break before taking the risk of cognitive bias seriously.
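
At its core, Linear Sequential Unmasking is about the order in which information reaches the examiner: the crime-scene mark is analyzed and documented before the suspect's reference print or other case context is revealed. The sketch below illustrates only that ordering idea; the class, attribute and feature names are hypothetical, not part of any official implementation.

```python
# A rough sketch of the ordering idea behind Linear Sequential Unmasking:
# document the crime-scene mark first, only then unmask the suspect's print.
# Class, attribute and feature names are hypothetical.

class CaseFile:
    def __init__(self):
        self.latent_features = None      # features documented from the crime-scene mark
        self.reference_unmasked = False

    def document_latent(self, features):
        # Step 1: the examiner records what the latent mark shows, in isolation
        if self.reference_unmasked:
            raise RuntimeError("Latent mark must be documented before seeing the reference")
        self.latent_features = list(features)

    def unmask_reference(self, reference_features):
        # Step 2: only after the latent analysis is locked in is the suspect's print revealed
        if self.latent_features is None:
            raise RuntimeError("Document the crime-scene mark first")
        self.reference_unmasked = True
        return reference_features

case = CaseFile()
case.document_latent(["whorl pattern", "bifurcation near the core"])
suspect_print = case.unmask_reference(["whorl pattern", "bifurcation near the core"])
```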

Fingerprints - registering asylum seekers
Fingerprints go digital: there's little sign of their becoming obsolete in criminal investigations. Image: DW/S. Pabst

Asked whether German authorities are in a state of denial, Detective Chief Superintendent Bettina Neukamp of Berlin's state office of criminal investigations says: "Since we are all highly trained, we're not worried about it."

Fingerprint evidence was deemed admissible by West Germany's Federal Court of Justice in a 1952 ruling - a suspect could be convicted on this evidence alone. Since then, says Neukamp, there have been no cases of incorrect fingerprint interpretation in the country.

But all it would take is one case. Would removing humans from the process help?

"It is likely that eventually latent fingerprint examination will [...] be automated," says Mnookin, "but the technology isn't yet adequate for that."

An obsolete technology?

So after 125 years - and with new technology, such as DNA testing, and other biometrics on the block - how much longer will courts "blindly" accept - as Dror puts it - fingerprinting as a unique identifier? Isn't it at risk of becoming obsolete?

"I have long thought we will know it is obsolete when law enforcement agencies stop maintaining fingerprint databases and concentrate on DNA databases," says Cole. "That has not happened yet."

But could it? Even DNA testing could lose its significance if techniques such as human gene editing allow us to alter our genetic IDs - or our very understanding of identity.

"Some of these other technologies you mention," says Mnookin, "human gene editing or biochips - they may eventually change our idea of the human self in meaningful ways, but fingerprint evidence is about something much more mundane, [and it is] likely to remain important: was this body, this finger, this person present at the scene of the crime?"