A High-Tech Bias in Policing, Something Blue That’s Not New

Stop, Frisk, and Fix Facial Recognition

The verdict is in on facial recognition technology, and it is biased in favor of white men.

This tool of the trade was shown to be 99% accurate for white men but up to 35% inaccurate for Black women.

Facial recognition technology in the United States is yet another tool in the toolbox that prefers white men. This technology has been calibrated and cued to whiteness and maleness, and it’s wrong.

It can do the bidding of white supremacy with a click of a button. It can misidentify Black people, it can erase us by not seeing us, and it can falsely accuse us.

It’s all very simple: if you put bias in, bias will come out.

And one place that bias is running rampant and ruining lives is law enforcement.

Use by Police Departments

One report says as many as 25% of police departments, from coast to coast, have this technology.

But the truth is, we don’t know how many police departments use facial recognition technology. Police departments aren’t forthcoming with this information.

But with this technology, police departments can now live out J. Edgar Hoover’s wildest wet dreams.

Right now, some police departments have cameras in neighborhoods to take snapshots of faces.

The police can also use their cellphones to take pictures and compare faces to databases.

This may all look like a good upgrade, but it has real risks and side effects.

Some critics say this technology should not be used for arrests. And yet the police have misidentified and arrested people.

The critics say this technology should supplement other evidence, not replace it, and to me, that makes sense.

So, I have no choice but to issue an All-Points Bulletin to all branches of law enforcement.

They must get serious and proactive about rooting out, and not rooting for, racial bias in their ranks and in their technology.

And, with the margin of error, public concern, and possible lawsuits, this technology does not make anyone’s job easier.

Unregulated Racial Bias

It’s a fact that Black people are more likely to be subjected to this technology, a technology that misidentifies or doesn’t recognize Black faces, especially those of Black women.

Given the racial bias already in law enforcement, all of this should be great cause for alarm and a reason to hit the pause button.

This technology takes the phrase “you match the description” to a new and automated level.

The fact is, this technology is not spot-on; it has many spots and blemishes.

It’s less accurate than other biometric identifiers like fingerprinting and iris scanning, and yet it’s used.

And the police departments, the programmers, and the manufacturers aren’t concerned with the racial bias.

There’s nothing to force them to care because the technology is hardly regulated.

No one is required to perform tests to detect bias, reduce bias, or perfect the technology.

If left unmanaged, this technology poses real dangers to privacy.

Now, this technology has potential, but it must be right and it must be regulated. Otherwise, this technology is wrong for right now.

What do you think the chances are that the current Justice Department or White House will provide any safeguards or oversight?

I’m doubtful, especially since the White House recently said police shootings are local matters for local authorities.

How to Fix the Face

And, the thing is, facial recognition technology doesn’t have to be this way.

In Asian countries, this technology is better at recognizing Asian faces because that’s how it’s trained and coded.

So, in the multiracial, gender-fluid, melting pot of the United States, we must do better. The problem is people think this is a white country and they want to use white things on subjects of color. But, all of that has to stop now.

It’s been said over and over that the tech industry must hire more people of color, women included.

The industry must conduct outreach and do everything possible to recruit and provide incentives to add talent.

Put simply, this technology responds to what it’s fed.

And despite the fact that Black people in this country are fed up, the tech industry hasn’t had enough.

So, I’ll say it again — we are fed up with being misidentified.

We are fed up with being confused with other people of color.

We are fed up with being repeatedly forced to insist on our humanity.

So, stop inventing new ways to wrong people and get people wrong!

I will repeat — technological advancements do not leave bias, racism, misogyny, and white supremacy behind. None of those issues are in need of more sophistication.

Does anyone have room on their list of societal isms and ills for ‘cyborg racism,’ ‘digital profiling,’ or ‘artificial racism’? No, no, no, I’m out of space and they’re out of their minds.

Keeping Our Eyeballs on the Ball

I’d rather we stop, fix, and regulate this technology to reduce the bias. The makers must go back to the drawing board and let us approve their image of us. These are our faces! We can’t just let any fool fool with our faces.

Our nation needs a full-court press and all hands on deck to eradicate racism, bias, and white supremacy from every industry and facet of life.

But at the top of the list must be the very area the Trump administration wants to let loose like a dove in the sky — law enforcement.

Both the police departments and software makers must be scrutinized and held accountable to a rigorous standard.

Because until then, this isn’t facial recognition technology. It’s a technology that’s unfair racially, it’s in our faces, and it must be recognized.

People must be seen and recognized by people first, then we can add technology.

In this case, technology will not help people to see better. This is machinery; it’s not a miracle, magic, or medicine.

There’s no LASIK for bias and racism, not yet, not today. So, people must correct these astigmatisms themselves, not build them into machines.

But, we all know, people will never see what they don’t want to see. And, I know they say repeating yourself and expecting a different result is the definition of insanity, but I guess we have to keep trying.

For my latest content, sign up for my newsletter, On Equal Terms.
