How Truleo Achieves Unbiased Analysis
What is Truleo?
Truleo is a body camera audio analytics platform that surfaces risky officer behavior. By analyzing BWC audio, we create "baseball card stats" for police that show the percentage of an officer's interactions that are neutral, positive, or negative. For interactions flagged as negative, Truleo's reports explain why.
What is bias in AI?
Humans often have trouble understanding unfamiliar accents, and that difficulty is itself a form of bias. In numbers, listeners misunderstand 5%-15% of another person's speech simply because the accent is unfamiliar to them.
Machines can learn from a wide variety of accents, but many speech technologies have not been trained on diverse data and therefore exhibit bias. When bias is present in AI, voice recognition systems show higher error rates when transcribing speakers who fall into less common categories (age, gender, race). For a model to be unbiased, there must be no significant difference (less than 5% variance) in transcription accuracy across voice characteristics.
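The fairness criterion above can be sketched in code. The snippet below is a minimal, hypothetical illustration (not Truleo's actual pipeline): it computes word error rate (WER) per speaker group and flags any group whose error rate deviates from the overall mean by more than a 5% threshold. Group labels and transcripts are invented for the example.

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance divided by reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming Levenshtein distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + sub)  # substitution
    return d[len(ref)][len(hyp)] / max(len(ref), 1)


def bias_report(samples, threshold=0.05):
    """samples: list of (group, reference, hypothesis) triples.

    Returns per-group mean WER plus the groups whose WER deviates
    from the overall mean by more than `threshold` (5% by default).
    """
    groups = {}
    for group, ref, hyp in samples:
        groups.setdefault(group, []).append(wer(ref, hyp))
    group_wer = {g: sum(v) / len(v) for g, v in groups.items()}
    mean = sum(group_wer.values()) / len(group_wer)
    biased = {g: w for g, w in group_wer.items() if abs(w - mean) > threshold}
    return group_wer, biased
```

In practice the comparison would run over thousands of human-verified transcripts per demographic group; the structure, though, is the same: measure accuracy per group, then check that no group's error rate diverges materially from the rest.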
In two recent studies*, widely used speech recognition systems were shown to be significantly less likely to understand non-native accents than those of native-born speakers, largely due to a lack of diverse training data. If left unmitigated, this error rate disparity can have serious consequences.
Why is bias in AI problematic?
Bias in AI can have many consequences. In public safety in particular, it can lead to inconsistent and unfair evaluations of officers and civilians based on their race, gender, or age. Non-white speakers can be deemed less respectful or less compliant simply because voice recognition makes more errors on their speech.