Flawed Facial Recognition Software Leads to Wrongful Felony Charge
Do you know this man? He is alleged to have stolen an 82-year-old woman's wallet right out of her purse at the Commerce Township Meijer's store in September. Please contact our law firm if you recognize this individual so we can get this information to the proper channels.
The Oakland County Sheriff used their new facial recognition software to run this photo, taken from the Meijer's security camera, through their database. Apparently, the Sheriff's database included our client's long-ago mug shot. The software suggested a match, and the detective in charge submitted the case to the Oakland County Prosecutor, resulting in a felony charge against our client: larceny in a building.
The person identified in the photo above by Meijer's loss prevention manager allegedly stalked an elderly shopper and removed her wallet from her purse when she left it unattended; the photo depicts him exiting the store.
Problem: the man in this photo is not our client. We know this because we know our client; we represented him in a long-ago misdemeanor case in which he pled guilty, was sentenced, and successfully completed his probation. Nevertheless, our client was charged with a felony in Oakland County based solely on this flawed identification. His preliminary examination was conducted yesterday in Novi's 52-1st District Court. We convinced Judge Robert Bondy to dismiss the charge for the prosecutor's failure to properly identify our client.
Now, our client realizes that there is a hidden cost to pleading guilty to any charged offense: the prosecutor may like you for charges down the road, years later, even when it's not you. Because you have a photo in the system, you are apparently fair game for new charges if the software suggests it's you.
How Does Facial Recognition Work?
Several years ago, we blogged about the use of this technology in the law enforcement context when it was first emerging. Here is a link to our post in The Law Blogger. Real-time facial recognition technology allows law enforcement to match any face, captured on an ever-expanding number of networked cameras, against an extensive and growing database. Facial recognition works by analyzing and comparing patterns in facial features to identify or verify individuals. Here is a basic overview of how it functions:
Image Capture: A camera captures an image or video of a person's face. This can be done through standard photos, surveillance cameras, or even smartphone cameras.
Face Detection: The system uses algorithms to detect the presence of a face in the image. This involves identifying facial landmarks, such as the eyes, nose, mouth, and the shape of the face. Apparently, the security cameras at Meijer's are of notoriously poor quality.
Feature Extraction: Once a face is detected, the system extracts key features, such as the distances between eyes, the shape of the jawline, and the contours of the face. These features are then transformed into a unique mathematical representation or "faceprint."
Comparison: The extracted faceprint is compared against a database of known faces. In this case, the database was the Oakland County mug shots. If the system is set up for recognition (identifying someone), it searches for a match; if it's for verification (confirming identity), it compares the faceprint against a single stored reference.
Matching and Decision: The system determines whether the new faceprint matches any stored faceprints. The likelihood of a match is given as a confidence score. Based on the score, the system will either verify the identity (if the confidence is high enough) or conclude that there is no match. We did our own analysis using our AI program, and the likelihood of a match between the photo above and our client was only 8.21%.
Continuous Learning (Optional): Some systems use machine learning models to improve over time. The system may learn from new faces or experiences to become more accurate at recognizing faces under different conditions (lighting, angles, etc.).
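The matching steps above can be sketched in a few lines of code. This is a minimal, hypothetical illustration only: it assumes faceprints are simple fixed-length feature vectors and uses cosine similarity as the confidence score, whereas real systems extract much higher-dimensional embeddings with deep neural networks. The names, vectors, and threshold below are all invented for illustration.

```python
import math

def cosine_similarity(a, b):
    """Confidence score: how similar two faceprints are (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def identify(probe, database, threshold=0.90):
    """Recognition mode: search a database of known faceprints for the best match.

    Returns (name, confidence) if the best score clears the threshold,
    otherwise None (no match confident enough).
    """
    best_name, best_score = None, -1.0
    for name, faceprint in database.items():
        score = cosine_similarity(probe, faceprint)
        if score > best_score:
            best_name, best_score = name, score
    if best_score >= threshold:
        return best_name, best_score
    return None

# Hypothetical 4-dimensional faceprints, purely for illustration.
mugshots = {
    "person_A": [0.9, 0.1, 0.3, 0.7],
    "person_B": [0.2, 0.8, 0.6, 0.1],
}
probe = [0.88, 0.12, 0.31, 0.69]  # faceprint extracted from a surveillance frame

print(identify(probe, mugshots))
```

Note that everything turns on the threshold: set it too low and the system confidently "matches" people who merely resemble each other, which is exactly the failure mode at issue in our client's case.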
Facial recognition technology can be used in various applications, such as unlocking devices, security surveillance, identity verification, and even emotion analysis. However, this technology is not always accurate.
Can Facial Recognition Technology Get It Wrong?
Facial recognition technology can and does make mistakes. From the outset, women, African Americans, and other dark-skinned individuals have complained about the limitations of this technology. For example, a recent case filed by the ACLU against the City of Detroit resulted in a confidential cash settlement.
While facial recognition has steadily improved, several factors can lead to errors. Some common reasons for these mistakes include:
Poor Image Quality: Low-resolution images, poor lighting, or blurry photos can make it difficult for the system to detect and accurately analyze facial features. This can result in a higher likelihood of incorrect matches or misidentifications. Again, the Meijer's security camera is not high-resolution.
Angle and Orientation: Facial recognition systems typically perform best when the face is directly facing the camera. If the person is looking at the camera from an angle, or their face is partially obscured (e.g., by glasses, masks, or hair), the system may struggle to identify key features and may fail to make an accurate match.
Aging and Physical Changes: Changes in appearance over time, such as aging, weight gain or loss, haircuts, or the addition of facial hair, can affect the system's ability to match a face accurately to an existing database. Even accounting for this, we could easily tell that the person in the photo above was 50 pounds heavier than our very thin client; dieting could not explain the difference. We were prepared to retain an expert to highlight this flaw in the Oakland County Sheriff's software.
Environmental Factors: Variations in lighting conditions, shadows, or reflections can make it harder for the system to correctly analyze the face. For example, bright sunlight, dim lighting, or glare can obscure key facial features.
Biases in Training Data: If the data used to train facial recognition algorithms is biased, the system may perform less accurately for certain groups of people. Studies have shown that facial recognition systems can be less accurate at identifying women, people of color, and individuals from certain demographic groups, especially when the system has been trained on predominantly white male faces. This is another area our expert witness was prepared to address.
Duplicates and False Positives/Negatives: False positives occur when the system incorrectly matches two different people, while false negatives happen when the system fails to recognize the same person. These errors are particularly problematic in security or law enforcement applications, where incorrect identification can have serious consequences.
Manipulation or Deception: Some individuals might attempt to deceive facial recognition systems by using makeup, photos, or even 3D models to spoof the system. Not the guy in the photo above; according to the prosecutor, he simply stalked the poor 82-year-old woman doing her shopping at Meijer's, stole her wallet, and waltzed out of the store without a care in the world; he almost smiles when you watch the video.
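The false positive/false negative tradeoff described above comes down to where the confidence threshold is set. The toy sketch below, with invented similarity scores for "genuine" pairs (two photos of the same person) and "impostor" pairs (photos of different people), shows how raising the threshold trades false positives for false negatives; no real system's numbers are represented here.

```python
# Hypothetical similarity scores, for illustration only.
genuine  = [0.97, 0.92, 0.88, 0.95, 0.90, 0.85]  # same-person pairs
impostor = [0.40, 0.55, 0.62, 0.71, 0.86, 0.50]  # different-person pairs

def error_rates(threshold):
    """Count errors at a given match threshold.

    False positive: different people, but score >= threshold (wrong match).
    False negative: same person, but score < threshold (missed match).
    """
    fp = sum(1 for s in impostor if s >= threshold)
    fn = sum(1 for s in genuine if s < threshold)
    return fp, fn

for t in (0.60, 0.75, 0.90):
    fp, fn = error_rates(t)
    print(f"threshold={t:.2f}: false positives={fp}, false negatives={fn}")
```

In a consumer setting a false positive means a phone unlocks for the wrong person; in law enforcement it can mean an innocent person is charged with a felony, which is why agencies relying on this technology cannot treat the threshold as a tuning detail.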
While the accuracy of facial recognition technology is improving with advancements in artificial intelligence and machine learning, it’s important to recognize that it’s not infallible and can still make mistakes, especially in challenging or complex scenarios. When it comes to law enforcement, however, it's got to be right.
Our Client's Case Was Dismissed at the Preliminary Exam for Lack of Identification.
When a client is charged with a felony, the prosecutor must establish probable cause: a) that a felony was committed; and b) that our client committed that felony. Accused individuals are afforded a statutory right to a preliminary examination in the district court before the case is bound over to the trial court; in this case, the Oakland County Circuit Court.
In our case, we knew that the Oakland County Sheriff's software erred by suggesting the person depicted above was our client. Because the prosecutor, understandably, did not know our client, she insisted that the exam proceed. Nor was Judge Bondy about to delay the prosecution of this case.
Our strategy was to challenge the prosecutor's attempt to prove the perpetrator's identity through the proffered video evidence, and to challenge the software itself with an IT expert. On the day of the exam, the prosecutor's only witness was the officer in charge. The detective had no way to tie the facial recognition process to our client; with no way to identify our client, the prosecution was "dead in the water."
Notice of Alibi.
Because we knew the photo of the individual depicted above was not our client, we were certain that he was not in the Meijer's store in Commerce Township on the Sunday afternoon in question. In fact, our client had half a dozen witnesses ready to testify on his behalf that he was in his home with his wife watching the Detroit Lions host the Tampa Bay Buccaneers.
So our defense in this case, were it to proceed to the trial court, was: a) that the prosecutor could not identify, through facial recognition or any other method, our client as the one who committed the larceny; and b) that we could prove with several corroborating witnesses that our client was at home with his wife watching the Lions, not out robbing old ladies.
To properly assert an alibi defense, the Michigan Court Rules require that the accused provide the prosecutor with adequate notice. Fortunately, our case was dismissed at the preliminary examination stage so we did not even get to the point of providing an alibi notice. The Oakland County Prosecutor just plain got it wrong in this case.
The Cost of Not Charging the Right Person.
Identification of the right person is critical for any prosecution. Proper identification is an element contained in the standard jury instructions for all felonies; it is really the first step in any prosecution. Getting it wrong costs our society in a few ways.
The primary victim in this case was the 82-year-old woman who was out doing some Sunday shopping only to have her wallet stolen from her purse; so far, this crime has gone unsolved due to the prosecutor's missteps. The secondary victim was our client and his family, who endured the stress of having the head of their household charged with a felony.
While this case wound through the system, we learned of another person charged with a similar crime involving the Meijer's in Grand Blanc. Hopefully, publishing the photo of the person whom the Oakland County Prosecutor suspects of stealing wallets from purses will help bring him to justice.
While we here at Clarkston Legal applaud the advance of technology, we caution law enforcement to be sure to get it right. When they came to arrest our client, half a dozen units came to his home. The embarrassment and disruption of a public arrest followed by false charges is significant; in some cases, unbearable.
Again, if any of our readers recognize the person in this photo, please contact our law firm. We know where to direct the information so that it will actually do some good and hopefully solve this crime.
We Can Help.
If you or someone you know needs legal assistance defending against false criminal charges, give us a call to schedule a free consultation.