Later On

A blog written for those whose interests more or less match mine.

When an algorithm taps you on the shoulder


Matt Stroud reports in the Verge:

ROBERT MCDANIEL’S TROUBLES began with a knock on the door. It was a weekday in mid-2013, as he made lunch in the crowded three-bedroom house where he lives with his grandmother and several of his adult siblings.

When he went to answer the door, McDaniel discovered not one person, but a cohort of visitors: two police officers in uniform, a neighbor working with the police, and a muscular guy in shorts and a T-shirt sporting short, graying hair.

Police officers weren’t a new sight for McDaniel. They often drove down his tree-lined street in the Austin neighborhood of Chicago making stops and arrests. Out of the 775 homicides tracked by the Chicago Sun-Times in 2020, 72 of them happened in Austin. That’s almost 10 percent of the city’s murders, in a region that takes up just 3 percent of its total area. The City of Chicago puts out a “heat map” of where gun crimes occur, with areas of moderate shooting numbers shaded in blue or green. Red splotches represent large numbers — and hottest concentrations — of shootings. On the map, Austin is the color of a fire engine.

Still, this visit from authorities caught McDaniel off guard: at that point in time, he had nothing remotely violent on his criminal record — just arrests for marijuana-related offenses and street gambling. And despite two officers showing up at his front door with the cohort, neither of them, nor anyone else in the cohort, accused McDaniel of breaking the law. They were not there to arrest him. No one was there to investigate a crime. They just wanted to talk.

“I had no idea why these cops were here,” McDaniel says, recounting it to me years later. “I didn’t do shit to bring them here.”

He invited them into his home. And when he did, they told McDaniel something he could hardly believe: an algorithm built by the Chicago Police Department predicted — based on his proximity to and relationships with known shooters and shooting casualties — that McDaniel would be involved in a shooting. That he would be a “party to violence,” but it wasn’t clear what side of the barrel he might be on. He could be the shooter, he might get shot. They didn’t know. But the data said he was at risk either way.

McDaniel was both a potential victim and a potential perpetrator, and the visitors on his porch treated him as such. A social worker told McDaniel that he could help him find assistance, such as securing a job or mental health services, if he was interested. And police were there, too, with a warning: from here on out, the Chicago Police Department would be watching him. The algorithm indicated Robert McDaniel was more likely than 99.9 percent of Chicago’s population to either be shot or to have a shooting connected to him. That made him dangerous, and top brass at the Chicago PD knew it. So McDaniel had better be on his best behavior.

The idea that a series of calculations could predict that he would soon shoot someone, or be shot, seemed outlandish. At the time, McDaniel didn’t know how to take the news.

But the visit set a series of gears in motion. This Kafka-esque policing nightmare — a circumstance in which police identified a man to be surveilled based on a purely theoretical danger — would seem to cause the thing it predicted, in a deranged feat of self-fulfilling prophecy. . .

Continue reading. There’s much more, and the story gets even more interesting. The “help” offered causes the problem it was intended to prevent.

Later in the article, Stroud points out one weakness built into the system:

Forecasting isn’t magic; it’s an educated guess about what might happen based on things that have already occurred. The data feeding forecasting software for police are typically built around police stops and arrests. That might sound straightforward and unbiased, but consider that US Department of Justice data show that African Americans are more than twice as likely to be arrested as white people. And if you’re Black, your likelihood of being stopped by a police officer can be nearly four times higher than if you’re white, depending on which city you live in, according to the Stanford Open Policing Project.

Building a forecasting model around data like these can run the risk of stigmatizing entire populations based on discriminatory data; a 2017 study from the Journal of Statistics and Public Policy found that arrests doubled in a quadrant of Los Angeles where its police department tested forecasting software. Another problem — exacerbated when forecasting programs do not disclose their sources of data — is that of “dirty data” being mixed with more straightforward crime reports: a 2019 study out of New York University’s AI Now Institute identified jurisdictions where inaccurate or falsified records were directly fed into the data. Chicago’s one of them.

Which is all to say that forecasting can put entire populations at risk of over-policing — which has led to countless unnecessary police killings for relatively insignificant infractions. (Think George Floyd. And Michael Brown. Twelve-year-old Tamir Rice. Sandra Bland, Philando Castile, Walter Scott. Thirteen-year-old Adam Toledo, this year, in Chicago. Alton Sterling, Breonna Taylor, Ahmaud Arbery. The list goes on.)
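The feedback loop Stroud describes is easy to see in a toy model. The following Python sketch is purely illustrative (it is not a description of the CPD’s Strategic Subject List or any real forecasting product, and every number in it is made up): two districts have the same underlying offense rate, but one starts with more recorded arrests; a naive risk score is computed from past arrests, patrols are allocated by that score, and the extra police presence generates the extra recorded arrests that confirm the score the following year.

import random

random.seed(42)

TRUE_OFFENSE_RATE = 0.05          # identical underlying offense rate in both districts
recorded_arrests = [40, 10]       # district 0 starts out over-policed (illustrative numbers)

for year in range(5):
    total = sum(recorded_arrests)
    # "Risk score" is simply each district's share of past recorded arrests.
    risk_score = [a / total for a in recorded_arrests]

    # Patrol hours are allocated in proportion to the risk score,
    # so the higher-scoring district gets watched more closely.
    patrol_hours = [200 * s for s in risk_score]

    # Recorded arrests depend on both the (equal) offense rate and how much
    # police presence there is to observe and record offenses.
    new_arrests = [
        sum(random.random() < TRUE_OFFENSE_RATE for _ in range(int(hours * 10)))
        for hours in patrol_hours
    ]
    recorded_arrests = [old + new for old, new in zip(recorded_arrests, new_arrests)]

    print(f"year {year}: risk scores {[round(s, 2) for s in risk_score]}, "
          f"recorded arrests {recorded_arrests}")

Run it and the initially over-policed district’s share of recorded arrests, and with it its “risk score,” climbs year after year even though the underlying offense rate never differs between the two districts; the forecast manufactures its own confirming evidence, which is the over-policing dynamic the studies quoted above warn about.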

Later still:

IN MCDANIEL’S VIEW, the heat list caused the harm its creators hoped to avoid: it predicted a shooting that wouldn’t have happened if it hadn’t predicted the shooting.

As the heat list continued to operate, researchers tore it to shreds. A 2016 paper published in the Journal of Experimental Criminology came to some troubling conclusions about the list that had, by then, been rebranded as the “Strategic Subject List,” or SSL. Among them: “The individuals on the SSL were considered to be ‘persons of interest’ to the CPD,” meaning that McDaniel’s description of being routinely targeted for surveillance and searches matched what researchers discovered. “Overall,” the report goes on, “there was no practical direction about what to do with individuals on the SSL, little executive or administrative attention paid to the pilot, and little to no follow-up with district commanders.”

The heat list wasn’t particularly predictive, it turned out. It wasn’t high-tech. Cops would just use the list as a way to target people.

There was another problem, too. . .

Written by Leisureguy

24 May 2021 at 1:41 pm
