News Room

The Promise and Peril of Facial Recognition

Flawed Police Use Leads to Wrongful Arrests

By Shant Karnikian

Most of us use facial recognition technology regularly, barely giving it a second thought. It helps us unlock personal devices like phones and laptops, organize photos, retrieve favorites from our overflowing online albums and social media accounts, and make payments from digital wallets. It’s also a valuable assistive technology that helps visually impaired individuals navigate their surroundings, identify familiar faces, and smooth social interactions.

Broader applications of this technology include security systems and safety measures in airports or at borders, scanning for known criminals at large concert venues or crowded arenas, and monitoring a store for potential shoplifters. Businesses can also use it to replace employee ID badges.

So what is it exactly? Facial recognition technology is software that uses AI biometrics to detect and analyze a face in an image (photo or video). It maps the individual’s features, creating a “facial signature.” The system then compares that facial signature against a database, which may contain millions of images, to look for a match.
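To make the matching step concrete, here is a minimal, purely illustrative sketch. It assumes a “facial signature” is just a vector of numbers (real systems derive high-dimensional embeddings from deep neural networks) and that matching means finding the database entry most similar to the probe; the names, vectors, and threshold below are all made up for illustration.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two feature vectors; 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_match(probe, database, threshold=0.9):
    """Return the name of the closest signature above the threshold, or None."""
    best_name, best_score = None, threshold
    for name, signature in database.items():
        score = cosine_similarity(probe, signature)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Toy database of "facial signatures" (hypothetical three-number vectors).
db = {
    "person_a": [0.9, 0.1, 0.4],
    "person_b": [0.2, 0.8, 0.5],
}
print(best_match([0.88, 0.12, 0.41], db))  # closest to person_a
```

The key point the sketch captures: the system never returns certainty, only the closest candidate above some similarity cutoff, which is why its output is treated as an investigative lead rather than an identification.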

Although the origins of facial recognition systems date back to the 1960s—and many fictionalized versions have been depicted in sci-fi and adventure films—the technology has exploded over roughly the last 15 years.

Facial recognition software and the cameras that feed it are already all around us, whether we like it or not. Some of the potential problems—for example, broad privacy concerns—are obvious. But there are other, even more complex and pressing, ethical and civil rights issues.

The algorithm’s accuracy depends on the volume and diversity of images the system has trained on. There are serious fallibilities in AI, including well-documented racial and gender bias. Research has demonstrated that the technology is at its least reliable when identifying women, people of color, and non-binary individuals.

This means, as the New York Times pointed out in 2020, that it can produce both false positives (identifying and incriminating an innocent person) and false negatives (not identifying a guilty person), both of which can be dangerous.

In June 2020, after widespread protests against police brutality, IBM, Microsoft, and Amazon put a moratorium on selling their facial recognition technology products to law enforcement. However, as the Times explained, “The gestures were largely symbolic, given that the companies are not big players in the industry.”

In recent years, local police departments have often purchased proprietary facial recognition systems from smaller private companies that develop their own algorithms and databases. Those law enforcement agencies have little oversight on how an off-the-shelf system has been trained and tested or what sources its database draws on. “The facial recognition software that law enforcement agencies use isn’t currently available for public audit,” the Times reported, and the public cannot be sure of the accuracy or appropriateness of its use. Some cities, including San Francisco, Boston, and Minneapolis, have banned the use of the technology in police work—other cities have embraced it.

Michigan law enforcement agencies, in particular, have come under fire numerous times for flawed use of facial recognition systems, resulting in wrongful arrests. Three lawsuits targeting Detroit police reflect the growing problem.

In January 2020, the first known U.S. case of wrongful arrest based on facial recognition technology occurred in Farmington Hills, Michigan. While his wife and daughters looked on, Robert Julian-Borchak Williams, a heavyset Black man, was arrested on his front lawn as he arrived home from work. Accused of stealing nearly $4,000 worth of timepieces from an upscale Detroit boutique, he was held in a detention center for 30 hours. The arrest was based solely on facial recognition technology that put Mr. Williams in a six-person photo lineup, from which a single witness misidentified him. The photo file sent to investigators read: “This document is not an identification…. It is an investigative lead only and is not probable cause for arrest.” The ACLU took his case, which was later dismissed.

According to the AP, “Another Black man, Michael Oliver, sued the city in 2021 claiming that his false arrest because of the technology in 2019 led him to lose his job.”

In August 2023, a lawsuit was filed on behalf of Porcha Woodruff. On February 16, 2023, Woodruff, a 32-year-old Black woman who was eight months pregnant, was getting her two children ready for school when six Detroit police officers arrived at her home with an arrest warrant for robbery and carjacking. Again, in this case, facial recognition technology had placed her in a lineup. According to the suit, Woodruff’s case was dismissed by the Wayne County Prosecutor’s Office a few weeks later due to lack of evidence.

Now, LaDonna Crutchfield, 37, has filed suit against the Detroit police, alleging that officers wrongly used facial recognition technology to identify her as the prime suspect in a shooting. On Jan. 23, 2024, Crutchfield, a Black mother, was at home in Detroit with her children when police arrived and took her away in handcuffs. Her name did not match the suspect’s, and she was significantly younger and five inches shorter than the woman police were seeking. She was released only after providing fingerprints and a DNA sample. Detroit police deny using facial recognition in Crutchfield’s case.

Although none of them was ultimately convicted, these individuals and their families nonetheless suffered devastating consequences: acute emotional distress, humiliation, the threat (or, in Mr. Oliver’s case, the reality) of losing their jobs, and even the possibility of losing their children due to shoddy police work.

KBK is dedicated to protecting consumer rights by holding public and private entities accountable for privacy violations and other wrongful conduct. If you believe you and many others like you have been somehow injured, cheated, or otherwise harmed by unfair business practices, give us a call and let us help you protect your rights.