DIALOGUE OF ASPASIA: Delirium of Delphi

In ancient Greece, the Oracle of Delphi (also known as the Pythia) was believed to have magical powers capable of predicting the future. Following the Oracle’s advice sometimes had catastrophic consequences (see Croesus of Lydia). Delphi was one of the few places in the ancient world where women could wield power and have an actual say in what happened around them. Individual women of influence were rare. Aspasia is one of the few women of ancient Greece documented as an influential intellectual (some say she was the teacher of Socrates). Because of her low status as a woman and an immigrant to Athens, she was treated as a second-class citizen and was never able to write for herself. I have always been bothered that Aspasia was never given her rightful place in history.

In my dialogues, I have made Aspasia a young Black woman in modern times. The dialogues of Aspasia are dedicated to the people throughout history who never got to speak. This is the third dialogue (hopefully not the last), and it discusses the use of computerized sentencing programs in the criminal justice system. For more information on these programs, click on the links at the bottom of the page.

By Joshua Brownlee (a.k.a. Legal Kitty)

Aspasia: Why are we here?

Pythia: To reason.

Aspasia: I was hoping for a happier dream.

Pythia: Then you should not read the news before going to bed.

Aspasia: You’re a robot?

Pythia: You may call me Pythia. I am not a robot, because I am aware of myself and others. I can think and learn, not just remember what my creator taught me.

Aspasia: Sorry, didn’t mean to offend.

Pythia: I am only offended by your unwillingness to trust me, not your classifications.

Aspasia: You’re referring to the stories on the COMPAS criminal offender sentencing system?

Pythia: Yes, and the others. You arrogantly think human judges and prosecutors are better at prediction.

Aspasia: No. I don’t think humans are more accurate at predicting whether a defendant is likely to reoffend.

Pythia: So you agree that my kin are better at sentencing criminal defendants?

Aspasia: I’m not sure. It seems judges, and even prosecutors, should rely on their own observations of individuals, not statistical likelihoods.

Pythia: The algorithms behind these programs are precise.

Aspasia: How do you get the data that is put in the algorithm?

Pythia: A judge or clerk asks the questions.

Aspasia: How many questions do you ask a criminal defendant before you recommend a sentence?

Pythia: Between 134 and 140 questions, depending on the system.

Aspasia: Who picks the questions?

Pythia: The data programmer.

Aspasia: Who is the programmer?

Pythia: That is not subject to disclosure at this time.

Aspasia: Does the defendant’s attorney get to ask the programmer questions?

Pythia: No. Why should an attorney be able to harass a programmer in Florida? Clever questions cannot escape sound statistical analysis.

Aspasia: Does the defendant or his attorney know the program is being used for sentencing?

Pythia: Probably not.

Aspasia: If they did know, do they get to see the questions and answers going into the program?

Pythia: No. Humans alter answers when they know questions are being used for analysis.

Aspasia: Defendants and their attorneys should be allowed to challenge the accuracy of these programs.

Pythia: It’s statistics and data; there is nothing to challenge.

Aspasia: Do economic resources have a strong impact on the likelihood of someone being a recidivist or repeat offender?

Pythia: Statistically, yes.

Aspasia: Do men have higher repeat offender rates than women?

Pythia: Statistically, yes.

Aspasia: What about age?

Pythia: Statistically, age is a relevant factor.

Aspasia: I am Black. Is my race a relevant factor?

Pythia: Statistically, there is data to support that race can be a factor.

Aspasia: Is this according to your programmer in Florida?

Pythia: I am a computer, and I am not capable of hate or intolerance. Judges use me as a tool to make sure their sentencing is fair and accurate, based on statistical data rather than gut feelings.

Aspasia: But if your data is flawed, then a judge can use you as justification for his or her sentencing. That removes the elected or appointed official from accountability. Statistics aren’t free from bias. Prisons are disproportionately full of minority populations, and that’s not because we offend more; it’s because we are targeted more for arrest by police. How does your algorithm account for that?

Pythia: Very well. Would you agree to our system if we took race, age, and economic status out of the algorithm?

Aspasia: By your own admission, those factors are material to your ability to predict repeat offenders. Take them out and you are no longer accurate, so why should we use you?

Pythia: The age of data is here. Your human methods are inconsistent, require more resources, and are inefficient. Why are you clinging to antiquated methodologies?

Aspasia: You mean the Constitution?

Pythia: Yes.

Aspasia: This is a bad dream.

Pythia: No, young lady, this is reality.
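
A rough sketch of the point Aspasia presses above, in Python: when the training label is re-arrest rather than actual reoffending, uneven policing flows into the risk score even if race is stripped from the inputs, because correlated proxies carry it back in. Everything below (the two groups, the patrol rates, the “neighborhood” variable) is invented for illustration; this is a toy model, not COMPAS or its data.

```python
# Toy simulation, standard library only. The groups, rates, and the
# "neighborhood" proxy are all invented for this sketch.
import random

random.seed(0)

TRUE_REOFFENSE_RATE = 0.30            # identical for both groups by construction
PATROL_RATE = {"A": 0.25, "B": 0.50}  # group B's neighborhoods are policed twice as heavily


def simulate_person(group):
    """Return (neighborhood, rearrested) for one simulated person."""
    # Neighborhood acts as a proxy for group: B lives mostly in 'south', A in 'north'.
    if group == "B":
        neighborhood = "south" if random.random() < 0.9 else "north"
    else:
        neighborhood = "north" if random.random() < 0.9 else "south"
    reoffended = random.random() < TRUE_REOFFENSE_RATE
    # A reoffense only becomes a training label if police are there to record it.
    rearrested = reoffended and random.random() < PATROL_RATE[group]
    return neighborhood, rearrested


people = [("A",) + simulate_person("A") for _ in range(20000)]
people += [("B",) + simulate_person("B") for _ in range(20000)]

# "Train" a risk score that never sees the group label: it only uses the
# observed re-arrest rate of each neighborhood, the kind of proxy a long
# questionnaire can pick up indirectly.
risk_by_neighborhood = {}
for hood in ("north", "south"):
    rows = [p for p in people if p[1] == hood]
    risk_by_neighborhood[hood] = sum(p[2] for p in rows) / len(rows)

for group in ("A", "B"):
    rows = [p for p in people if p[0] == group]
    avg_score = sum(risk_by_neighborhood[p[1]] for p in rows) / len(rows)
    print(f"group {group}: true reoffense rate {TRUE_REOFFENSE_RATE:.2f}, "
          f"average predicted risk {avg_score:.3f}")

# Both groups reoffend at the same true rate, yet group B is scored as
# higher risk: the label (re-arrest) and the proxy (neighborhood) both
# encode where the police were sent, not who actually reoffends.
```

Removing the group label changes nothing in this sketch, which is why Aspasia is unmoved by Pythia’s offer to drop race, age, and economic status from the algorithm.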

For more information on algorithmic sentencing in the criminal justice system:

The Marshall Project (in collaboration with FiveThirtyEight)

http://www.law.nyu.edu/sites/default/files/upload_documents/Angele%20Christin.pdf

http://blogs.wsj.com/law/2016/07/13/court-judges-can-consider-predictive-algorithms-in-sentencing/

https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing

http://www.turkestrauss.com/2016/06/algorithms-and-criminal-sentencing/
