
The Blacklist

Face identification software is quick to judgment, but things aren't always what they seem.

Porcha Woodruff, 2023
Sunday Morning / August 13, 2023

Porcha Woodruff filed a lawsuit against the City of Detroit this week alleging false arrest, false imprisonment, and violation of her Fourth Amendment right to be protected from unreasonable seizures. Woodruff joins a growing number of plaintiffs suing Detroit police over erroneous arrests resulting from facial recognition software.

Woodruff, who was eight months pregnant, was dressing her two children for school in February when she noticed six Detroit police officers approaching her front door. She was arrested, detained, processed, jailed, and released after 11 hours on a $100,000 bond. She went straight to an emergency room, where she suffered a panic attack and was diagnosed with dehydration; her unborn child was found to have a low heart rate.

Woodruff, 32, who’d been detained nearly a decade earlier for driving with an expired license, had been matched by Detroit’s facial recognition software to recent video footage of a carjacking suspect. Her 10-year-old picture was merely one of many the software had presented for consideration.

There is no comprehensive data on how many law enforcement agencies use facial recognition technology, according to the Pew Research Center. However, a single provider reports 3,100 U.S. law enforcement agencies among its clients.

While the case against Woodruff was dropped in March, Woodruff’s lawsuit claims that “facial recognition technology has long been known for its inherent flaws and unreliability, particularly when attempting to identify black individuals. It should be understood that facial recognition alone cannot serve as probable cause for arrests.”

Algorithm Inequity

Hundreds of millions of DMV and mugshot photos make up the facial recognition databases used by U.S. law enforcement. Those databases were assembled without consent or legislative oversight, and they compound racial bias.

Beginning in 18th century New York, “Lantern Laws” required Black people to carry lanterns after dark to be publicly visible. Today, that visibility is enforced by algorithms. In 2022, Black Americans were four times more likely to be arrested and jailed than White Americans, and they were incarcerated on average 12 days longer, according to the Pew Research Center. Consequently, Black people are overrepresented in the mugshot data that facial recognition uses to make predictions, a feedback loop that leads to disproportionate arrests.

In 2016, for example, Project Green Light installed high-definition cameras throughout the city of Detroit. The footage, which streams directly to Detroit PD, is checked by facial recognition against criminal databases, driver’s licenses, and state ID photos of nearly every adult in Michigan. However, the PGL stations were not distributed equally: surveillance cameras avoided White and Asian enclaves and were concentrated in Detroit’s communities of color. Moreover, Automated Emotion Recognition (AER) research, which uses video to capture facial expressions and speech, was surveilling something more than images or location. AER was scoring personality factors to predict criminal behavior.

Man Machine

In the 18th century, the belief that facial expressions revealed our morality and mental health was accepted science in the Western world. In 1827, the world's oldest surviving photograph was captured by Joseph Nicéphore Niépce using a technique called heliography. Eerily, that first image, “View from the Window at Le Gras,” was a study of man in his environment.

“View from the Window at Le Gras,” 1827

Automated Facial Recognition (AFR) came along in the 1960s when Woody Bledsoe, Helen Chan Wolf, and Charles Bisson began teaching computers to recognize human faces. Their early facial recognition project was dubbed the “Man-Machine.”

In 1970, Takeo Kanade publicly demonstrated a face-matching system, and the Defense Advanced Research Projects Agency (DARPA) and the Army Research Laboratory (ARL) established the face recognition technology program FERET in 1993. Its automatic face recognition capabilities were designed to be deployed in real time “to assist security, intelligence, and law enforcement personnel in the performance of their duties.”

Today, companies like Clearview AI scrape more than 10 billion images from websites like YouTube and Facebook and sell access to their database to the FBI. Vigilant Solutions captures license plates from billions of cars parked outside residential homes and commercial and office buildings; it sells access to its database to the FBI and to 1 in every 4 police stations in the United States. Finally, ODIN Intelligence uses facial recognition to identify the homeless, collecting sensitive personal information such as age, arrest history, and public assistance and welfare records. REAL IDs will become mandatory for every U.S. citizen by May 7, 2025. Each will point to a personal file held by the FBI.

The FBI is also spotlighting physiognomic analysis, still in development, in certain cities to offer clues beyond identity. By engaging private companies that track our texts, IMs, video calls, GPS, search habits, and spending sprees, the FBI, it seems, is no longer content with knowing our identity. It now wants to predict our future behaviors, too. In the spirit of BRCA gene-mutation testing, we may conceivably see a day when six white police officers preemptively approach our door for a crime we’ve been 'predetermined' to commit.

A.I. Bill of Rights

Among the great challenges of our time is the use of technology, data, and automated systems in ways that threaten American liberty. In America and around the world, algorithms used to track new customers and criminals are in fact creating harmful bias and discrimination while promulgating inequity. In brief, surveillance and data collection undermine privacy and call into question a democracy that purports to guarantee personal freedom.

The White House Office of Science and Technology Policy has identified five principles that can and should guide the design, use, and deployment of automated systems to protect the American public in the age of artificial intelligence: Safe and Effective Systems; Algorithmic Discrimination Protections; Data Privacy; Notice and Explanation; and Human Alternatives, Consideration, and Fallback.

While the Police Reform Bill may contain stipulations to restrain the use of face recognition technologies and AFR algorithms, more pointed is the tech response: IBM discontinued its system; Amazon announced a one-year freeze on police use of Rekognition; and Microsoft has suspended sales of its face recognition technology to police until federal regulations are in place.


Because we all should be able to unlock our phones without an algorithm there to greet us. Or, in Porcha Woodruff’s case, six white police officers. “I don’t feel like anyone should have to go through something like this,” Woodruff says, in a prescient warning about facial recognition software. “We all look like someone.”


Make sense of the week's news. Charlatan reviews the world's show & message.


