Teaching an Old A.I. New Critical Race Theory
“With the current climate and rapid pace of machine learning, it really was only a matter of time,” said Carnegie Mellon statistician and legal scholar Randall DiMaggio when asked about the new critical race theory (CRT)-trained AI, “Amy.” “Amy,” the pet name given to AMiRaSyst, which stands for “Adaptive Multiend Recursive System,” is the culmination of over three years of research and $200 million in funding. In essence, Amy is a machine learning model that firmly believes race is a construct, racism goes beyond individual prejudice, and that Tyler Perry is a gratuitous shill.
“I suppose my first question is how you came up with the name, 'Amy.'”
I take a seat in Dr. DiMaggio’s office, eager to begin picking the brain behind this new engine of equality. Looking around, I conclude that I am perhaps out of my depth, every shelf stuffed with a dazzling array of literature. “Neural Nets: A History” leans comfortably against “Audre Lorde: A Herstory.” “Anomaly Detection in Climate Data” sublets shelf room to “Can’t I Have A Chocolate Shake in Peace?: The Power of Sit-Ins.” In an academic era of specialization, DiMaggio has proven himself a Renaissance man at first glance.
“That’s an easy one. As you know, 'Amy' is short for 'AMiRaSyst,' which came to me once I realized I had yet to use my statistical knowledge for the betterment of minorities. I asked myself over and over, 'Am I racist?' The shame pushed me to start this program, and to name it AMiRaSyst.”
I blink a moment. Was Randall working out of white guilt? I realize I actually missed the fact he was even Caucasian, so fixated was I on his office library and the lilting tones of Alicia Keys coming from his desktop.
“What’s the main purpose of Amy? What sort of capabilities does it have? I know AI can be helpful with housing prices-”
DiMaggio leaps from his seat to cut me off.
“Yes! Amy is uniquely purposed to counteract all sorts of housing inequities. She can scan petabytes of real estate metrics, neighborhood shapes, and pricing data. In fact, just the other week, she was able to successfully determine that Lawrenceville gentrification was on a trajectory that would have every non-condemned building form a swastika with a Scotty dog in the center. She swears it was just a statistical fluke, but I got Farnam to pull the plug on that really fast.”
My phone buzzes in the middle of his story. I look down. YouTube notification. New John Oliver out today. It’s about redlining.
“What about redlining? Can she prevent that?”
Randall, or “Randy,” as I’ve started calling him in my mind, cools down and sighs.
“Well, to be honest, (air quotes) 'redlining' as we remember it historically doesn’t really happen anymore. At least, not explicitly. What we have to deal with more now is its political cousin, gerrymandering. In fact, AI experts at Ole Miss just built their own 'social AI' that focuses on making gerrymandering harder and harder to detect. His name is Jerry.”
Amy and Jerry. Two peas in a cyberpunk pod. I wonder if they know each other. Can these AI programs fall in love? What would that even look like? Can neo-con automata feel shame? Do woke-bots dream of electric sheep?
“Before I head out, is it too soon to ask if you’re working on any more projects?”
“Yes. We’ve just developed record-breaking visual analysis software to prevent racial profiling!”
“Name?”
“Regression Using Bi-convolutional Legal Analytics.”
“...the other name, please.”
“Oh. Sorry. RuBLAc.”
“Half, actually.”