Using AI to analyze prison phone calls could amplify racial biases in policing

Plans to expand use of the tech have provoked alarm

AI biases

Speech recognition software is notorious for misinterpreting Black speakers.

A 2020 Stanford University study found that speech-to-text systems used by Amazon, Google, Apple, IBM, and Microsoft had error rates that were almost twice as high for Black people as they were for white ones.

These errors could reinforce racial disparities in the criminal justice system. Research shows that Black men in the US are six times as likely to be incarcerated as white men.

Albert Fox Cahn, the executive director of the Surveillance Technology Oversight Project, told Reuters that the tech could “automate racial profiling” and violate privacy rights:

“Prisons aren’t exactly designed to protect the privacy of inmates, but the AI system is a worrying addition to the surveillance regime.”


Story by Thomas Macaulay

Thomas is a senior reporter at TNW. He covers European tech, with a focus on AI, cybersecurity, and government policy.
