The Double-Edged Sword of Facial Recognition Technology

On the television crime drama FBI, Special Agent Jules Valentine brusquely orders an underling to run a photograph through facial recognition software to identify a suspect. And ~boom~ after a rocket-speed search, the computer spits out a name and address. Field agents get to work, and in no time the bad guy is under arrest.

I’m here to tell you it is not that easy and it’s not that accurate.

Facial recognition programs are notoriously error-prone, often misidentifying an innocent person as potentially guilty. At the same time, these programs have proven wildly successful in catching criminals, both minor offenders and more violent ones like child rapists.

Like most controversies these days, there is a wide chasm of opinion. Facial recognition is either a great gizmo in law enforcement's tool belt or it is another means to perpetuate racial inequity in the name of public safety.

Here are some facts.
A recent federal study of 189 different facial recognition algorithms confirmed previous research showing that these systems produce shockingly wrong identifications when searches involve people of color (especially women of color), the very young and the elderly. Native Americans had the highest rate of misidentification. Asian and African American faces were up to 100 times more likely to be incorrectly identified than those of white male suspects. Pacific Islanders are also frequently misidentified.

Photo from Piqsels - public domain photograph
Features Algorithms Search For – Space Between Features
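The caption above hints at the basic idea behind these systems: reduce a face to a set of measurements, such as the spacing between features, then compare those measurements numerically against a database. The toy sketch below is purely illustrative; the numbers, names and the threshold are invented, and real systems use high-dimensional embeddings produced by neural networks rather than four hand-picked measurements.

```python
import math

# Hypothetical feature vectors: normalized measurements of the space
# between facial features (eye spacing, nose-to-mouth gap, jaw width,
# etc.). All values are made up for illustration.
suspect   = [0.42, 0.31, 0.58, 0.27]  # face pulled from surveillance video
candidate = [0.44, 0.30, 0.57, 0.29]  # a close look-alike in the database
stranger  = [0.61, 0.18, 0.40, 0.35]  # an unrelated face

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def is_match(a, b, threshold=0.1):
    """Declare a 'match' when two faces are closer than the threshold.
    Loosening the threshold catches more true matches -- and also more
    innocent look-alikes."""
    return distance(a, b) < threshold

print(is_match(suspect, candidate))  # similar measurements -> "match"
print(is_match(suspect, stranger))   # dissimilar measurements -> no match
```

Everything hinges on that threshold: set it loosely and the system flags more genuine matches, but also more innocent people who merely resemble the suspect, which is one way the misidentifications described in this column can happen.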

Case in point: Last January, Robert Williams, a gainfully employed, married father in Michigan, was shocked when he was handcuffed on his front lawn by Detroit police. Surveillance video showing a heavyset Black man shoplifting expensive watches had been run through a facial recognition program, and Williams' face came up as a match. Williams had no police record and repeatedly proclaimed his innocence, and had the arresting officers asked, he could have proven he was at work that day. Instead, the humiliated Williams was arraigned on charges of first-degree theft and held for 30 hours. Ultimately, the charges were dropped, but at last report his record had still not been expunged.

Robert Williams Maintained Innocence – Cops Didn’t Listen

One big complaint is that police are not required to reveal that they zeroed in on a suspect via a facial recognition program. As Senior Public Defender Aimee Wyant of Pinellas County, Florida, put it: "Once the cops find a suspect, they're like a dog with a bone. That's their suspect. So, we've got to figure out where they got that name to start."
It's reported that one in four U.S. law enforcement departments has used facial recognition in the search for suspects, but no definitive statistics are kept on the error rate. We know minorities and women are prone to misidentification, but just how frequently does that happen?
The FBI, for example, runs more than 4,000 checks per month using a nationwide hodgepodge of photographs of nearly 120 million Americans. These photos come from state driver's licenses, mug shots, juvenile records and other databases. Cooperating states, in turn, get access to the FBI's system.
Since half of all American adults are in the FBI system, chances are high that your photo is in that database. Could you become another Robert Williams?
There is little oversight of the nation's facial recognition systems even though more and more organizations are using the technology – from surveillance at airports and border crossings to corporate and community security. Its use is more widespread than you can imagine.

Members of Congress Falsely Identified by Facial Recognition Software!

A couple of years ago, the American Civil Liberties Union ran a test on Amazon's Rekognition facial recognition program. Photos of every member of the U.S. Congress were scanned for possible matches against a vast array of mug shots. Astonishingly, 28 members were falsely identified as matching someone in the database.
Still, cop shops across the country can list all sorts of closed cases that began with a trip through facial recognition software. Convictions have been secured for child sex abuse, property crimes, credit card fraud, burglaries, robberies and car theft. Suspects have been identified in cold-case shootings and incidents of road rage.
Good detectives know that a facial recognition photo match is only the beginning. Further investigation of alibis, witness statements and forensic evidence is always required before an arrest. Has that always happened in the past? No. Do cops learn from their mistakes? Let's hope so.



  1. Diane Dimond on July 13, 2020 at 12:59 pm

    Reader Paul Binotto writes:

    These findings are even more alarming when the implications are considered against the widespread usage of FRT by the CCP to track, coerce, and control their people. A misidentified face can literally mean the subsequent missing of that face in society ever after.

  2. Diane Dimond on July 13, 2020 at 1:01 pm

    Reader Press On … writes:

    Like overcrowded rats in a cage, they begin to attack the innocent. Police use of such technology must be stopped forthwith. Law enforcement and the judicial system must disclose all exculpatory evidence at time of arrest. Yes, we have police and judicial misconduct. So-called circumstantial evidence must not be used to arrest, only questioning. You've got to have real evidence in fact, not stipulations that can't be established as proofs. This injustice and police misconduct must not stand!

    Bravo, well stated. Thank you for your work that helps all of us.

  3. Diane Dimond on July 13, 2020 at 1:01 pm

    Reader D J writes:

    The justification for usage of this technology for policing is the 1st step to becoming a mass surveillance state to intrude on our freedom and rights

    This is how the CCP exerts its control. It has emboldened the CCP to act with impunity persecuting the people of HK, Falun Gong, Uyghurs, and their own people.

    We must not allow this to happen on our shores.

  4. Diane Dimond on July 13, 2020 at 1:02 pm

    Reader Boz writes:

    Just say no. Use a dumb phone with no camera. Do not post pictures on social media. Vote against cameras in your local community.

  5. Diane Dimond on July 13, 2020 at 1:03 pm

    Reader jaydan.kisinger writes:

    because there are more white males in the US, there is a more diverse array of faces and more variety in facial structures so that enables a more complex and accurate system. if there were more black, asian, and definitely native american faces on the database, that would allow for more specific identification.

  6. Diane Dimond on July 13, 2020 at 1:26 pm

    Reader torpido@edgecitykid writes:

    Somewhere, perhaps deep underground or on an inaccessible mountain top, someone is frantically working on mask recognition software. Could be the next Zuck.

  7. Diane Dimond on July 13, 2020 at 1:30 pm

    Reader Abe Jackson@appplejack003 writes:

    The Chinese government has done this for years. They don’t see people as individuals but just a herd to control like in a fuckin’ video game.

  8. Diane Dimond on July 13, 2020 at 1:50 pm

    Reader Cliff Darnell writes:

    The future nothing will be private ….

    Oh that’s now

  9. Diane Dimond on July 13, 2020 at 2:01 pm

    Reader Jonathan Swartz writes:

    I think that it’s pretty scary that someone we don’t know can have access to our information through facial technology.

    • Diane Dimond on July 13, 2020 at 2:01 pm

      DD replies to Jonathan:

      Yet in ‘normal’ times you are likely out and about in your community where there are countless video cameras capturing your likeness, Jonathan. And if you share any photos on social media of yourself – bingo – your face is out there and could already be included in the FBI’s major database.
      Real privacy is now a thing of the past I’m afraid.

  10. Diane Dimond on July 13, 2020 at 5:25 pm

    Reader Joya Colucci Lord writes:

    I’m not sure how I feel about it. My first thought is, “if you’re innocent, you have nothing to worry about.” My second thought is, “I’ve had so many technological screw ups, what happens if you’re innocent and it screws up?” There is so much refining before the technology is reliable enough for everyday use.

  11. Diane Dimond on July 13, 2020 at 5:25 pm

    Reader William Drummond writes:

    When law enforcement gets a new tool, it becomes more effective in making arrests, and prosecutors get more ammo to gain a conviction, which then sends the offender to prison. We have 2.3 million souls behind bars now. The Covid 19 epidemic is running rampant inside the prisons and jails and making its way back into the larger society. If facial recognition makes the prison population swell, it does not improve our lives.

  12. Diane Dimond on July 14, 2020 at 4:28 pm

    Reader Klexius Falcon Kolby Dupaix writes:

    Research, and Laws need to be implemented prior to releasing this technology, like everything else.

  13. Diane Dimond on July 14, 2020 at 4:31 pm

    Reader Richard Hydell writes:

    It’s going to happen as well as chips implants etc . People born in the future will not even second guess this .
    A few years ago I was sitting with my brother's 92-year-old mother-in-law. I asked, in all your years, what is the worst thing that's happened to the planet? She pointed to the TV.

  14. Diane Dimond on July 14, 2020 at 4:32 pm

    Reader Nancy Spieker Robel writes:

    We’ve already entered the twilight zone of technology. While we were sleeping, these technological intrusions have been surreptitiously implemented. Good or bad they are here to stay. But my guess is, like everything else man creates, the abuses will quickly follow the original good intention.

  15. Diane Dimond on July 14, 2020 at 4:32 pm

    Reader Marjorie Bard writes:

    Orwellian. We saw it coming. It will get “worse.”

  16. Diane Dimond on July 14, 2020 at 4:39 pm

    Reader yatesjude writes:

    You think it's bad until it saves your life or the life of a loved one.
