
appalachiablue

(41,146 posts)
Thu May 27, 2021, 03:33 AM

Judges Now Rely On Secret AI Algorithms To Guide Their Sentencing of Criminal Defendants

Daily Kos, May 25, 2021. 'Sentenced By Algorithm.'

Imagine for a moment that you’ve been convicted of a crime, and are awaiting sentencing. The prosecutor hands a computer-generated analysis to the judge that declares, based on a secret analysis performed by a complex algorithm, that you should receive the harshest possible sentence, since according to the algorithm you are highly likely to commit future crimes.
Your attorney, hoping to rebut this conclusion, asks how the report was prepared, but the judge rules that neither you nor your attorney is entitled to know anything about its preparation, only the results. The judge then proceeds to impose the maximum sentence, based on this secret calculation.

If that sounds like something out of a dystopian science fiction novel, well, it’s going on right now in several states throughout this country.

Jed Rakoff is a federal district judge for the Southern District of New York. A former federal prosecutor appointed to the bench in 1996, Rakoff has presided over some of the most significant white-collar crime cases in this country. He is generally recognized as one of the leading authorities on securities and criminal law, and as a regular contributor to the New York Review of Books he often writes about novel and emerging criminal justice issues. His latest essay addresses criminal prosecutors’ increasingly widespread use of artificial-intelligence-based (AI) computer programs, or algorithms, to support sentencing recommendations for convicted defendants.

These programs, which rely on a variety of controversial sociological theories and methods, are primarily used to assess recidivism (the propensity of a defendant to commit future crimes), and judges often give their scores heavy weight in determining the length of the sentence to be imposed. The scores also factor into decisions about setting bail or bond. The consideration of potential recidivism rests on the theory of “incapacitation”: the idea that criminal sentencing should serve the dual purpose of punishing a defendant and preventing him from committing future crimes, in order to protect society.
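To give a concrete sense of the mechanics, here is a minimal sketch of what such a risk score can boil down to. Everything in it is invented for illustration; the real tools are proprietary, and their actual features, weights, and cutoffs are exactly what defendants are not allowed to see:

    import math

    # Hypothetical defendant profile; a real tool collects dozens of such inputs
    defendant = {"age": 24, "prior_convictions": 3, "employed": 0}

    # Invented weights; in a real tool these are fit to historical data,
    # which is exactly where bias can enter
    WEIGHTS = {"intercept": -1.0, "age": -0.03,
               "prior_convictions": 0.45, "employed": -0.6}

    def risk_probability(d):
        # Logistic combination of features -> probability of "reoffending"
        z = WEIGHTS["intercept"]
        for name in ("age", "prior_convictions", "employed"):
            z += WEIGHTS[name] * d[name]
        return 1.0 / (1.0 + math.exp(-z))

    def risk_band(p):
        # The low/medium/high label is what the judge actually sees
        if p < 0.3:
            return "low"
        if p < 0.6:
            return "medium"
        return "high"

    p = risk_probability(defendant)
    print(f"score={p:.2f} -> {risk_band(p)} risk")

A small change in those hidden weights can move a defendant from one risk band to another, and under the secrecy described above, no one outside the vendor can check why.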

Rakoff finds the use of these predictive algorithms troubling for a number of reasons, not least their demonstrated error rates and propensity for racial bias. He notes that the theories by which they purport to predict a person’s likelihood of committing future crimes are often untested, unreliable, and otherwise questionable.
However, his most recent essay for the NYRB, titled “Sentenced by Algorithm” and reviewing former district judge Katherine Forrest’s When Machines Can Be Judge, Jury, and Executioner, raises even more disturbing questions about the introduction of artificial intelligence technology into our criminal justice system.

Is it fair for a judge to increase a defendant’s prison time on the basis of an algorithmic score that predicts the likelihood that he will commit future crimes? Many states now say yes, even when the algorithms they use for this purpose have a high error rate, a secret design, and a demonstrable racial bias...

More,
https://www.dailykos.com/stories/2021/5/25/2031872/-Judges-now-rely-on-secret-AI-algorithms-to-guide-their-sentencing-of-criminal-defendants


regnaD kciN

(26,044 posts)
1. In a similar vein...
Thu May 27, 2021, 04:02 AM

…there’s now an insurance company, Lemonade, that requires clients filing a claim to provide a video statement about the incident, which the company then analyzes with its top-secret algorithm to determine whether the client is lying. The company claims this is of great benefit in preventing losses due to fraud; I’m wondering why anyone would choose an insurer that announces in advance that it will bend over backwards to find any basis for denying your claim.

appalachiablue

(41,146 posts)
2. That's wild, but I can see companies claiming
Thu May 27, 2021, 04:14 AM

they have to follow suit in order to ‘remain competitive’ and not be at a disadvantage, or some other cover.

It's fast becoming a very strange world...

FBaggins

(26,748 posts)
5. How is this different from the existing method?
Thu May 27, 2021, 05:57 AM

Sentencing has long included a point system of aggravating factors and sentencing guidelines. Using a computer (silly to call it AI) to do the math doesn’t change that.
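For instance, the traditional guideline math looks roughly like this (invented numbers, not any real jurisdiction’s table):

    # Aggravating factors add points to a base offense level
    base_offense_level = 12
    aggravating = {"weapon_used": 4, "leadership_role": 2}
    offense_level = base_offense_level + sum(aggravating.values())

    # Criminal-history points map to a category (capped at 6)
    history_points = 5
    category = min(history_points // 3 + 1, 6)

    # A published table maps (level, category) to a range in months
    GRID = {(18, 2): (27, 33)}
    low, high = GRID[(offense_level, category)]
    print(f"Guideline range: {low}-{high} months")

Whether a clerk or a computer runs those numbers, it’s the same arithmetic.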

Such calculations are intended to make sentencing less open to racial bias... not more.

There are no doubt “errors” - but judges make errors too...

Eugene

(61,900 posts)
8. More traditional sentencing guidelines are often set by statute or other public policy.
Thu May 27, 2021, 07:21 AM

Computer algorithms are built on the expectations and assumptions of their designers. If the data are faulty or the assumptions are prejudiced, the garbage results can now be cloaked as proprietary “trade secrets.”

Facial recognition is an example of a technology that is poorly tuned to persons of color and sometimes incompetently used. Garbage in, garbage out.
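A toy example of how that plays out with risk scores: suppose two neighborhoods offend at the same rate, but the historical data used to train the score come from uneven policing. The model faithfully reproduces the bias in its inputs (all numbers invented):

    # Both neighborhoods offend at the same rate, but B is patrolled twice as hard
    true_offense_rate = 0.10
    patrol_intensity = {"A": 1.0, "B": 2.0}

    def recorded_arrest_rate(neighborhood):
        # Arrests scale with patrols, not just with offending
        return true_offense_rate * patrol_intensity[neighborhood]

    # A score trained on arrest records "learns" that B is twice as risky
    for n in ("A", "B"):
        print(f"Neighborhood {n}: recorded risk {recorded_arrest_rate(n):.2f}")

The score is “accurate” with respect to the arrest records it was fed; the records themselves were the garbage.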



bucolic_frolic

(43,182 posts)
6. Secret design
Thu May 27, 2021, 06:00 AM

Based on historical statistics? That would be a formula for racial bias. AI rests on theory and numbers somewhere, and programmers write the code. If I were a judge, this would be an interesting sidelight, an awareness only.

So we have driverless cars that crash, AI mutual funds and asset allocation based on indexing and rebalancing, and now AI judgments to make “safe” judicial decisions. Oh yeah, and we have sports management based on metrics that strips the heart out of sports.

BUT, elections are rigged by bamboo sticks in the paper ballots.
