The Dark Scanner and Your Rights

by Paul D. Sellers on Aug. 31, 2021

Civil & Human Rights | Civil Rights | Constitutional Law | Criminal

Summary: The expansion of facial recognition technology is outpacing the law's ability to keep up.

Rolling Stone magazine once called Philip K. Dick "the most brilliant sci-fi mind on any planet." In one of his most famous works, A Scanner Darkly (published in 1977), the author imagined a computer that "projected every conceivable eye color, hair color, shape and type of nose, formation of teeth, configuration of facial bone structure."

Of course, the American novelist imagined this computer as part of a dystopian hellscape, one that would have been unimaginable to readers in 1977. That is because there has always been a "ubiquitous presence of privacy in both literature and law," which does more than suggest that "there is a generally accepted definition of this right." Jonathan Turley, Anonymity, Obscurity, and Technology: Reconsidering Privacy in the Age of Biometrics, 100 B.U. L. Rev. 2179, 2220 (2020).

Yet as you read this post, law enforcement facial recognition networks include over 117 million American adults. At least twenty-six states allow law enforcement to run or request searches on their databases of driver's license and identification photos, even though these databases primarily contain information about law-abiding Americans.

How often are our faces being compared? And how is law enforcement handling this biometric information? Frankly, we don't know. The incomparable Tony Webster tried to provide some answers, at least for Minnesota. But the deeper he dug, the harder it became to get answers. In fact, when he took his request all the way to the Minnesota Supreme Court in 2018 (and won), it should have resulted in a treasure trove of information. Instead, the data he requested about the use of "facial recognition OR face recognition OR face scan OR face scanner" has yet to appear. Webster v. Hennepin County, 910 N.W.2d 420, 424 (Minn. 2018).

Which might not be all that scary, really, unless you're "highly melanated," i.e., not white. See, one thing these facial recognition algorithms are really bad at is accurately identifying people of color. The National Institute of Standards and Technology (NIST) conducted a study that evaluated 189 algorithms from 99 developers and found that the algorithms generated higher rates of false positives for Black faces (sometimes up to one hundred times more false identifications) than for white faces. As this technology pervades our society, its embedded bias compounds the harm to minorities who have historically been disproportionately targeted by law enforcement.
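
To make the NIST finding concrete, here is a minimal sketch of how a per-group false positive rate is computed. The counts are invented purely to illustrate what a hundredfold disparity would look like; they are not figures from the NIST report.

    # Illustrative only: how a per-group false positive rate is computed.
    # The counts below are invented to show what a 100x disparity means;
    # they are NOT the NIST study's numbers.
    def false_positive_rate(false_matches, true_non_matches):
        """Share of searches of innocent (non-matching) people that the
        system nonetheless flags as a match."""
        return false_matches / (false_matches + true_non_matches)

    groups = {
        "Group A": (1, 9_999),     # 1 innocent person falsely matched per 10,000 searches
        "Group B": (100, 9_900),   # 100 falsely matched per 10,000 searches: a hundredfold disparity
    }

    for name, (fp, tn) in groups.items():
        print(f"{name}: false positive rate = {false_positive_rate(fp, tn):.2%}")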

To be clear, facial recognition is probabilistic; the technology produces more or less likely matches, not definitive identifications. That should have led law enforcement to rely on corroborating evidence rather than blindly trusting this new "dark scanner." Alas, that didn't happen. Add in lazy police work, and the result is a recipe for disaster.
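
For readers who want to see what "probabilistic" means in practice, here is a minimal sketch of a generic embedding-based matching pipeline. The function names, the toy gallery, and the 0.6 similarity threshold are all illustrative assumptions; they do not describe any particular vendor's system or the software Detroit used.

    # Minimal sketch of why face "matches" are probabilistic, not definitive.
    # Assumes a generic embedding-based pipeline; the embeddings, gallery,
    # and the 0.6 threshold are illustrative placeholders, not any vendor's API.
    import numpy as np

    def cosine_similarity(a, b):
        """Similarity score in [-1, 1]; higher means the faces look more alike."""
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def rank_candidates(probe, gallery, threshold=0.6):
        """Return gallery entries whose similarity to the probe clears the threshold,
        best score first. Nothing here is a positive identification; every result
        is only 'more or less likely,' and the threshold choice drives how many
        innocent people surface as false positives."""
        scored = [(name, cosine_similarity(probe, emb)) for name, emb in gallery.items()]
        scored.sort(key=lambda pair: pair[1], reverse=True)
        return [(name, score) for name, score in scored if score >= threshold]

    # Toy example: a blurry security-camera probe compared against a small "gallery"
    # of license-photo embeddings (random vectors standing in for real face data).
    rng = np.random.default_rng(0)
    gallery = {f"license_photo_{i}": rng.normal(size=128) for i in range(5)}
    probe = gallery["license_photo_2"] + rng.normal(scale=0.8, size=128)  # degraded image

    for name, score in rank_candidates(probe, gallery):
        print(f"{name}: similarity {score:.2f} (a lead to investigate, not proof)")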

Such was the fate of Robert Williams, who was falsely accused of being a watch thief. Why did officers arrest Mr. Williams? Because, according to the Detroit Police Department's "all-knowing" facial recognition software, the security camera footage of the pennyweighter matched a photograph in Michigan's driver's license database. Literally, that was the only evidence the police had to suggest Mr. Williams had any involvement in the heist. Kashmir Hill, Wrongfully Accused by an Algorithm, N.Y. TIMES (June 24, 2020).

"They never even asked him any questions before arresting him. They never asked him if he had an alibi. They never asked if he had a red Cardinals hat. They never asked him where he was that day. They just arrested him and hauled him off to jail." -- Michigan ACLU

So how did these algorithms get so powerful and so trusted? That's a question for... the Algorithmic Justice League. Hahaha. No, really. Sounds made up, doesn't it?

Founded by computer scientist Joy Buolamwini in 2016, after her own encounter with biased facial detection software, the Algorithmic Justice League (AJL) aims to raise awareness of the social implications of artificial intelligence through art and research, and to expose the ubiquity of algorithmic bias and the threat it poses to civil rights.

"While law enforcement agencies are increasingly integrating facial recognition technology into their daily operations, they are not implementing sufficient safeguards to ensure the accuracy of their systems." Gabrielle M. Haddad, Confronting the Biased Algorithm: The Danger of Admitting Facial Recognition Technology Results in the Courtroom, 23 Vand. J. Ent. & Tech. L. 891, 898 (2021).

In his prescient and profound 1998 Village Voice article, Spycam City, Mark Boal noted that "[a] hundred bucks at a computer store already buys face-recognition software that was classified six years ago, which means that stored images can be called up according to biometric fingerprints." AND THAT WAS 23 YEARS AGO(!)

Can the Courts protect us? Not through baby steps. "Continuing to redefine decisional and informational privacy is unlikely to offer the broad foundation needed to fully protect biometric interests." Turley, supra, at 2217. But if people are getting arrested based solely on the use of this technology, then the Courts must strike back at the executive branch when "studies reveal that racial and gender biases permeate facial recognition technology: some commercial algorithms misclassify white women as men at a nineteen percent error rate and women of color as men as often as thirty-five percent of the time." Matthew Doktor, Facial Recognition and the Fourth Amendment in the Wake of Carpenter v. United States, 89 U. Cin. L. Rev. 552, 557 (2021).

As our watchman Philip K. Dick once said, "The true measure of a man is not his intelligence or how high he rises in this freak establishment. No, the true measure of a man is this: how quickly can he respond to the needs of others and how much of himself he can give."

This is exactly how Paul D. Sellers approaches defending your rights: he responds quickly and gives everything he can.

If you believe that you have been caught up in the sci-fi dystopian dragnet created by law enforcement or government use of facial recognition technology, call Minnesota Legal Defense as soon as possible to protect your rights.
