The well-documented wrongful arrest of Robert Williams is symptomatic of what happens when human beings delegate decision-making to technology that has not been well proven or vetted for accuracy. This is especially harmful to Black communities, because these algorithms are demonstrably less accurate on Black faces.
Robert Williams was arrested and hauled away by the police in broad daylight in front of his home, his wife, and two little daughters. He was locked up for nearly 30 hours. He did nothing wrong, but the cops wouldn’t listen. They only cared about what a flawed computer algorithm had to say.
One MIT study demonstrated that facial recognition software had an error rate of 0.8% for light-skinned men but 34.7% for dark-skinned women. This is a massive disparity, and it can produce dangerously biased results when the technology is used by law enforcement.
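To put that disparity in perspective, a quick back-of-the-envelope calculation (a sketch using only the two error rates cited above; no other data is assumed) shows just how lopsided those numbers are:

```python
# Error rates reported in the MIT study cited above.
light_skinned_men_error = 0.008   # 0.8%
dark_skinned_women_error = 0.347  # 34.7%

# How many times more often the software errs for dark-skinned women
# than for light-skinned men.
ratio = dark_skinned_women_error / light_skinned_men_error
print(f"Roughly {ratio:.0f}x higher error rate")  # roughly 43x
```

In other words, by these figures the software misidentifies dark-skinned women at roughly forty-three times the rate it misidentifies light-skinned men.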
Robert is likely not the first person to be wrongfully arrested or interrogated based on a bogus face recognition hit. There are likely many more people like Robert who don't even know that a flawed technology is what made them appear guilty in the eyes of the law.
When you add racist and broken technology to a racist and broken criminal legal system, you get racist and broken outcomes. Face recognition technology should play no role in the over-policing of Black and Brown communities.