Autocorrected bias –
Error rate on African American speech is nearly double that for others.
John Timmer
Microphones are how our machines listen to us.

We're outsourcing ever more of our decision-making to algorithms, partly as a matter of convenience, and partly because algorithms are ostensibly free of some of the biases that humans suffer from. Ostensibly. As it turns out, algorithms trained on data that's already subject to human biases can readily recapitulate them, as we've seen in places like the banking and judicial systems. Other algorithms have simply turned out to be not especially good.
Now, researchers at Stanford have identified another area with potential issues: the speech-recognition algorithms that do everything from basic transcription to letting our phones fulfill our requests. These algorithms seem to have more issues with the speech patterns used by African Americans, although there’s a chance that geography plays a part, too.