After she and a friend took a child’s bike and scooter that were sitting outside, Sade Jones—who had never been arrested before—was rated a “medium risk” for future crime by AI software

By any measure, we are living through a golden age of journalism, and the impact is palpable. But as we enter the third era of computing—otherwise known as artificial intelligence—it’s more important than ever for journalists to push beyond our misplaced optimism and fears and uncover the hard truths about the future we’re building. This is why ProPublica’s “Machine Bias” series was both fascinating and fundamental. Investigating the algorithms, the data sets, the artificial intelligence (AI) frameworks, and the people whose values already live inside our machines is impossibly difficult work. ProPublica was able to show what we all knew anecdotally: that our AI systems are riddled with bias, that automated bias leads to real-world social and economic injustices, and that there is no real effort in place to fix these fundamental flaws. This team proved why transparency is paramount as we ask machines to do our thinking for us.

Machine Bias

By Julia Angwin, Jeff Larson, Surya Mattu, and Lauren Kirchner
ProPublica, May 23, 2016

Excerpt

On a spring afternoon in 2014, Brisha Borden was running late to pick up her god-sister from school when she spotted an unlocked kid’s blue Huffy bicycle and a silver Razor scooter. Borden and a friend [Sade Jones] grabbed the bike and scooter and tried to ride them down the street in the Fort Lauderdale suburb of Coral Springs.

Just as the 18-year-old girls were realizing they were too big for the tiny conveyances—which belonged to a 6-year-old boy—a woman came running after them saying, “That’s my kid’s stuff.” Borden and her friend immediately dropped the bike and scooter and walked away.

But it was too late—a neighbor who witnessed the heist had already called the police. Borden and her friend were arrested and charged with burglary and petty theft for the items, which were valued at a total of $80.

Compare their crime with a similar one: The previous summer, 41-year-old Vernon Prater was picked up for shoplifting $86.35 worth of tools from a nearby Home Depot store.

Prater was the more seasoned criminal. He had already been convicted of armed robbery and attempted armed robbery, for which he served five years in prison, in addition to another armed robbery charge. Borden had a record, too, but it was for misdemeanors committed when she was a juvenile.

Yet something odd happened when Borden and Prater were booked into jail: A computer program spat out a score predicting the likelihood of each committing a future crime. Borden—who is black—was rated a high risk. Prater—who is white—was rated a low risk.

Two years later, we know the computer algorithm got it exactly backward. Borden has not been charged with any new crimes. Prater is serving an eight-year prison term for subsequently breaking into a warehouse and stealing thousands of dollars’ worth of electronics.
