News  |  February 6, 2020

An Algorithm That Grants Freedom, or Takes It Away

News article by Cade Metz and Adam Satariano.
Published in The New York Times.

Excerpt:

Across the United States and Europe, software is making probation decisions and predicting whether teens will commit crime. Opponents want more human oversight.

PHILADELPHIA — Darnell Gates sat at a long table in a downtown Philadelphia office building. He wore a black T-shirt with “California” in bright yellow letters on the chest. He had never been to the state, but he hoped to visit family there after finishing his probation.

When Mr. Gates was released from jail in 2018 — he had served time for running a car into a house in 2013 and later for violently threatening his former domestic partner — he was required to visit a probation office once a week after he had been deemed “high risk.”

He called the visits his “tail” and his “leash.” Eventually, his leash was stretched to every two weeks. Later, it became a month. Mr. Gates wasn’t told why. He complained that conversations with his probation officers were cold and impersonal. They rarely took the time to understand his rehabilitation.

He didn’t realize that an algorithm had tagged him high risk until he was told about it during an interview with The New York Times. [ . . . ]