
As discussed in the comments of Does normal paper currency contain enough narcotics residue to attract a drug-sniffing dog?, and further explored in the 2017 NPR article, "Eliminating Police Bias When Handling Drug-Sniffing Dogs" on Dr. Lisa Lit's UC Davis paper, "Explosive- and drug-sniffing dogs' performance is affected by their handlers' beliefs", there is some evidence that drug-sniffing dogs may be unduly influenced by their handlers.

On the surface, the study tested the abilities of fourteen certified sniffer dogs to find hidden "targets." In reality, the dogs' human handlers were also under the magnifying glass. They were led to believe there were hidden target scents present, when in fact there were none. Nevertheless, the dogs "alerted" to the scents multiple times — especially in locations where researchers had indicated a scent was likely.

This was also noted in a Cracked.com article on crime-fighting tactics that don't work, which also quoted Dr. Lisa Lit's study:

So Lit set up a room complex where the dogs would be presented with multiple scents of interest (read: sausages everywhere), but no actual drugs or explosives. However, the handlers were told that they were looking for the real thing, and also that the areas with conflicting scents were marked in a certain way. The results were damning: Only 21 out of 144 searches accurately reported nothing of interest. There were a total of 225 alerts from the dogs, each one of them a false alarm. In areas with the fake marking that the trainers were told about (and were therefore extra wary of), the dogs were twice as likely to give a false positive.

Given that even the best-designed study can come up with bad results, have Dr. Lit's study's results been reproduced by other researchers?

Sean Duggan
  • Interesting. If, during training, the dogs legitimately picked up a positive hit and also noted subtle cues from the trainers, they'd associate those cues with the expectation of a "positive" reaction and the subsequent "reward" behavior. Looking forward to seeing the answers on this. +1 – PoloHoleSet Dec 06 '18 at 20:31
  • I just wanted to throw out there: this might be completely irrelevant. I mean, let's say it's 100% true, and the dogs give off a much higher rate of false positives if their handler thinks something is wrong. For that information to be relevant, you'd have to know the ratio of false positives to legit alerts, and the relative cost of a false positive. Realistically, a false positive doesn't cost much, and even if there were 50 false positives for every "I found something!", I'd imagine it'd still be worth it to use the dogs. – Kevin Dec 07 '18 at 15:58
  • 7
    @Kevin: The bigger concern, as I understand it, is that drug-sniffing dogs are basically used as a form of "probable cause" where a positive result from the dog gives the police the right to search a car, residence, etc. While frankly, we probably get biased results to justify "probable cause" all the time ("I noticed that the black suspect looked suspicious and thought I saw a gun, so I entered his residence. No gun was found, but I did find this baggie of drugs"), it's still worthwhile to force it out into the daylight. – Sean Duggan Dec 07 '18 at 16:10
  • @SeanDuggan I'm not saying that it's not a concern. I just get irritated when an issue is presented without the information to know whether the presented info is actually relevant. Or to put it another way: "the dogs were twice as likely to give a false positive." If the false positive rate rose from 0.001% to 0.002%, then, yeah, it doubled... but it's still not worth worrying about. Again, not saying it's *not* a concern - I don't know enough to say one way or another. – Kevin Dec 07 '18 at 16:15
  • 3
    Or to put it another way: there's a critical piece missing from Dr Lit's study: how many false alarms were there from the *control* group (the one where the handlers weren't told anything special)? Sure, there were 225 false alarms from the study group, but what about the control? If there were 5, yeah, there's probably a problem. If there were 220... not so much. – Kevin Dec 07 '18 at 16:18
  • @Kevin: Ah. Yeah, that would be additional information that would be useful, which might also come from additional study of the situation. – Sean Duggan Dec 07 '18 at 16:55
  • 1
    The actual paper is at https://link.springer.com/article/10.1007%2Fs10071-010-0373-2 – Keith Morrison Dec 12 '18 at 16:10
  • 1
    "even if there were 50 false positives for every "I found something!", I'd imagine it'd still be worth it to use the dogs" - this would almost certainly be a sufficient false positive rate to rule the use of dogs unconstitutional based on the US Constitution, hardly irrelevant. In other jurisdictions the rules may be different. – Bryan Krause Dec 12 '18 at 17:46

0 Answers