In 1999, a woodcutter named James B. Grinder confessed to a 15-year-old murder, the death of a 25-year-old woman named Julie Helton. A short time later, he recanted, contradicting himself over and over. His blood had been taken and compared against the crime scene samples — but with such an old crime, local police were worried the case might fall apart, so the sheriff called in a doctor he had seen on the news. The doctor's name was Lawrence Farwell, and he was promoting a next-generation tool called brain fingerprinting. It was an advanced lie detector that claimed to look into a suspect's brain to see if they were familiar with the details of the crime. This was the first time the technique had been used in an active criminal case, although Farwell had tested the technique with scientists at the FBI. So far, he said, the accuracy rate was 100 percent.
The local police were thrilled to try it out, so Farwell set up his computer at the prison where Grinder was being held, along with the bundle of wires and electrodes used to take EEG readings of brain activity. Grinder sat in front of a screen in his orange jumpsuit, with a blue strap over his forehead to secure the device. Behind him, Farwell asked questions and monitored the results on a screen, grilling the convict about specific details of his crime. By the time the test was over, Farwell was convinced. Faced with overwhelming evidence, Grinder pled guilty and was sentenced to life without parole. "There is no question that J. B. Grinder raped and murdered Julie Helton," Farwell told a local paper after the plea. "The significant details of the crime are stored in his brain."
"A revolutionary scientific technology to detect whether specific information is in a person's brain or not."
Grinder's case was the first time Farwell's technique was used in an investigation, but it wasn't the last. Farwell tested the device on two other convicted murderers in the following years, Terry Harrington in Iowa and Jimmy Ray Slaughter in Oklahoma, before moving the technique to a bigger stage. Farwell founded a company, Brainwave Sciences, to build ready-made brain-reading rigs that could be deployed on a mass scale. Naturally charismatic, Farwell was a hit with the media, feted by major outlets from 60 Minutes to Time. In 2013, Brainwave Sciences made its first sale to Singapore's police force. Since then, the technique has also popped up in Indian courts, used with Farwell's blessing. In August, the Florida State Police bought another batch of the devices for everyday use. Krishna Ika, Brainwave's CEO, describes the product as "a revolutionary scientific technology to detect whether specific information is in a person's brain or not." Based on the company's testing, Ika says he's seen an accuracy rate of "close to 100 percent."
But as the technique has spread, it's also struggled to shake accusations of shoddy testing, inflated claims, or even outright pseudoscience. Farwell has conducted extensive testing, but it's all been behind closed doors, whether it's with government agencies or prospective clients. He maintains that his test is based on solid neuroscience and that it's never produced a false result, but academic neuroscientists complain that his methods are effectively secret and have never been subject to public review. As Farwell's technique arrives in police stations and courtrooms, it raises a serious question: when Farwell looked into J. B. Grinder's mind, how much could he really see?
Attach two pieces of conductive material to a person's scalp, forming a circuit, and you can begin to pick up on what's happening inside. Monitor the circuit, and you'll get a wave-like reading, a constant ebb and flow of electrical activity. Once you're accustomed to the rhythms, you can pick out disturbances, which often lead back to specific events in the brain. The test is called an electroencephalogram, or EEG, and it's one of the most common ways to check patients for seizures or abnormal brain activity.
In 1965, a group of scientists discovered a distinctive surge of electrical activity in the EEG wave when a person saw something familiar, usually arriving 300 milliseconds after the object was revealed. They called it the P300 response, and while the neurological origins of the surge are still unclear, the effect has been replicated over and over in the decades since. A scientist might trigger it with an unexpected low note after a series of high notes, or by showing a subject his best friend's face mixed in with pictures of strangers.
When the P300 is used in interrogations, the questions are more pointed. Was the victim killed with a knife? Was the victim shot? Was the victim strangled? It's called the Concealed Information Test, or CIT. There might be a dozen such questions, all covering a single concrete aspect of the crime, but only one of them describes what really happened. If the suspect knows the victim was shot, he'll show a P300 response when that specific question is asked. A conventional polygraph relies on flashes of sweat from the autonomic nervous system, a physiological panic brought on by lying, but the P300 is entirely confined to the brain, making it significantly harder to beat. Suspects can still ignore the questions entirely or try to trigger recognition through other means, but questioning protocols can be arranged to catch them in the act. The usual polygraph tricks won't work, and the system is far more resistant to countermeasures.
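The basic logic of a P300-based test can be sketched in a few lines of Python. This is a toy simulation, not Farwell's actual pipeline: the synthetic epochs stand in for real EEG recordings, and the item counts, time window, and decision threshold are illustrative assumptions. The core idea survives, though: average many trials per item to wash out noise, then ask whether the probe item (the true crime detail) draws a larger P300-window response than the irrelevant items.

```python
import numpy as np

rng = np.random.default_rng(0)
FS = 250                                        # sampling rate (Hz), assumed
EPOCH = FS                                      # 1-second epoch after each stimulus
P300_WIN = slice(int(0.3 * FS), int(0.6 * FS))  # 300-600 ms window after stimulus

def make_epoch(familiar: bool) -> np.ndarray:
    """Synthesize one EEG epoch: background noise, plus a P300-like bump
    around 400 ms if the item is familiar to the subject."""
    t = np.arange(EPOCH) / FS
    sig = rng.normal(0.0, 2.0, EPOCH)           # noisy baseline EEG (µV)
    if familiar:
        sig += 8.0 * np.exp(-((t - 0.4) ** 2) / (2 * 0.05 ** 2))
    return sig

def mean_p300_amplitude(epochs: np.ndarray) -> float:
    """Average across trials, then take the mean voltage in the P300 window."""
    return epochs.mean(axis=0)[P300_WIN].mean()

# One probe item vs. five irrelevant items, 40 presentations each.
probe = np.stack([make_epoch(True) for _ in range(40)])
irrelevants = {f"item_{i}": np.stack([make_epoch(False) for _ in range(40)])
               for i in range(5)}

probe_amp = mean_p300_amplitude(probe)
irr_amps = [mean_p300_amplitude(e) for e in irrelevants.values()]

# Crude decision rule (an assumption, not a published protocol): flag
# "recognition" only if the probe clearly exceeds every irrelevant item.
recognized = probe_amp > max(irr_amps) + 1.0
print(f"probe={probe_amp:.2f} µV, "
      f"max irrelevant={max(irr_amps):.2f} µV, recognized={recognized}")
```

Real systems replace the hard-coded threshold with statistical tests across many probe/irrelevant comparisons, which is exactly where the disputes over accuracy claims begin.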
But while Farwell has often used the science behind the P300 to justify his techniques, many scientists aren't happy about it. In 2012, a group of researchers struck back against Farwell's work, writing in Cognitive Neurodynamics that his research "is misleading and misrepresents the scientific status of brain fingerprinting technology." While the P300 can be triggered by any event that violates a subject's expectations, Farwell usually describes it as a direct view onto a suspect's memory. Farwell had boasted about 10 different field studies, but only two of the studies were peer-reviewed, totaling just 30 participants. The process still maintains a nominal 100 percent success rate, but it ends up duplicating some participants and excluding others to get there. "The review violates some of the cherished canons of science," the piece concludes, "and if Dr. Farwell is, as he claims to be, a 'brain fingerprinting scientist' he should feel obligated to retract the article."
"The review violates some of the cherished canons of science."
Even for those who believe in the science behind the P300, there are real questions about how it will stand up in an actual investigation. A recent study compared P300 with the skin conductance response, used in the common polygraph lie detector, and came away with mixed results. When innocent participants were included, the P300 far outpaced the traditional polygraph, resulting in far fewer false positives. But when the participants had to work through an actual crime scenario, stealing a purse and then being confronted with details, the P300 was often less reliable than the polygraph. The questioners were forced to guess at what actually happened, introducing new uncertainty and driving up error rates. Many guilty suspects ended up passing the test simply because they hadn't paid attention to the objects in question.
Another concern is that the real benefits are coming from the method of questioning rather than the brain measurements themselves. Traditional lie-detection protocols look for deception; if you say you didn't kill your wife and the machine spikes, that counts as evidence that you did. But the Concealed Information Test takes a different approach, looking to prove that a suspect knows things that only a guilty person would know. That means collecting specific information and keeping it out of the public eye until the information can be used in a test. It's a lot more work than the average polygraph, but in exchange, cops get a less error-prone test, and one that's particularly resistant to false positives that might put an innocent person in jail. The only real change here is the interrogation strategy. CIT appears to be more reliable, even used with traditional polygraph equipment, and it's already popular with Japanese law enforcement and the FBI. But if the benefit of the test comes from asking different questions, why bother with brain measurements at all?
What's left is a mix of proven techniques and dangerously exaggerated benefits, which has many observers worried about the effect it could have on the legal system. "It's not ready for the courtroom yet," says Jane Moriarty, a chaired professor at Duquesne's School of Law. "There's not enough testing behind it to say, yes, we should use this as evidence to convict people." In particular, Moriarty is concerned that laboratory testing can't reliably replicate the brain activity of a suspect being interrogated for an actual crime. It's easy for P300 researchers like Rosenfeld, Farwell, and others to interrogate student volunteers, but when it's an actual suspect being questioned about an actual murder, the neurological reactions may be significantly different. How does the test hold up on neurologically atypical suspects, like psychopaths or the mentally ill? It's not clear how scientists can control for those factors, and they could leave a dangerous loophole if the method is more widely adopted.
So far, brain fingerprinting has only been admitted into court evidence once: in Harrington v. State (2003), in which the court mentioned the finding but did not rely on it. But in the years since, courts have ruled against admitting fMRI lie detection evidence for the same concerns voiced by Moriarty, suggesting lab testing wasn't yet applicable to a real-life case. There's been no ruling on Farwell's method specifically, leaving it in a kind of legal limbo within the US. At the same time, the technique is already gaining adherents in Australia and India, playing off its early successes in American courts.
Still, the more judges let brain fingerprinting into court across the world, the more precedent it will have as evidence, and the harder it will be to dismiss. That vicious cycle has kept bad science in courtrooms before, whether it's bite-mark tests or handwriting analysis. In the Cameron Todd Willingham case, overly confident arson investigators turned an accidental fire into a triple murder, sending an innocent man to his death. Once a local analyst is seen as an expert in a given technique, it's often very hard to wean judges off the technique, even when the science has been largely discredited. "A lot of these kinds of forensic science — bite marks, arson experts, handwriting analysis — they testify as experts and they have for decades," Moriarty says. "One of my concerns is, if we let it in, will we ever be able to get it out again?"
But while researchers worry, Brainwave Sciences is busy spreading its wares. Ika says the company is in negotiations with six other countries, which are intrigued by the device's possible applications in counter-terrorism screening. "It's still close to 100 percent reliable, with no false positives or false negatives," he says. "We're very eager to see if someone can prove that wrong."