Published on MediaPost, 22 November 2019.
Two weeks ago, David Heinemeier Hansson, the creator of the web application framework Ruby on Rails, tweeted about a concerning incident. He and his wife, Jamie, had both applied for the Apple Card. But despite the fact that they file joint tax returns and live in a community-property state, and despite the fact that she has a higher credit score than he does, David was given a credit limit 20 times that of his wife.
His tweetstorm set off a bit of a kerfuffle, with plenty of others sharing similar experiences. One man replied, “Just read this thread. My wife has a way better score than me, almost 850, has a higher salary and was given a credit limit 1/3 of mine.” A woman shared, “I’m a physician with good income, 45 years of excellent credit, score in the mid 800’s, and was DENIED.” And a guy by the name of Steve Wozniak tweeted, “I'm a current Apple employee and founder of the company and the same thing happened to us (10x) despite not having any separate assets or accounts.” (Emphasis mine.)
Lena Felton wrote a story about it for The Lily called “Can a credit card be sexist?” Her answer should not evoke even the slightest bit of surprise: of course it can.
It’s an algorithm. And algorithms carry the biases of the data they are fed. Credit cards can be sexist. So can hiring algorithms and image-search algorithms. Facial recognition algorithms and recidivism algorithms can be racist.
So, yeah, it’s totally unsurprising that the Apple Card is sexist. Felton references Meredith Whittaker, a research scientist at New York University and co-founder of the AI Now Institute: “The first step in combating these patterns is acknowledging that bias does exist in algorithms — that they’re not simply ‘neutral and objective.’”
Bias does exist in algorithms, but it doesn’t exist because of algorithms. Algorithms themselves aren’t biased; they embody the biases of the system that produces them. As one of the replies to David’s original tweetstorm said, “I remember learning at Amex Corp (re creating credit scores) that ‘being female’ was an automatic downtick which required several alternate category upticks to get the same ranking as a male with the same initial entries. When I questioned it they said it was based on ‘research.’”
The “research” informs the sexism, the sexism informs the algorithms, the biases reinforce themselves, and the system becomes ever more entrenched.
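To make the mechanism concrete, here is a deliberately simplified sketch, in Python, of the kind of rule that reply describes. Everything in it is invented for illustration: the field names, the weights, the baseline, and the 40-point “downtick” are hypothetical, not Amex’s or Apple’s actual model.

```python
# Hypothetical credit-scoring rule with bias written directly into it.
# All names and numbers are invented for illustration.

def credit_score(applicant: dict) -> int:
    score = 600  # arbitrary baseline for the sketch
    score += applicant["years_of_credit_history"] * 5
    score += applicant["on_time_payment_streak"] * 2
    if applicant["sex"] == "female":
        score -= 40  # the "automatic downtick": several upticks needed just to break even
    return score

# Two applicants who are identical on every financial input:
her = {"sex": "female", "years_of_credit_history": 20, "on_time_payment_streak": 60}
him = {"sex": "male", "years_of_credit_history": 20, "on_time_payment_streak": 60}

print(credit_score(her))  # 780
print(credit_score(him))  # 820
```

The modern version is usually subtler. No engineer writes that if-statement; instead, a model trained on decades of decisions shaped by rules like it can learn the same downtick from the data, with no if-statement anywhere to point at.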
Except…
Right now, we might have the best opportunity in history to interrupt the cycle.
A July 2018 article in the Harvard Business Review challenged the critics of algorithmic decision making on exactly this point, asking not whether algorithms are biased but whether they are more biased than the humans they replace:
“[T]here is a pattern among these critics, which is that they rarely ask how well the systems they analyze would operate without algorithms… Rather than simply asking whether algorithms are flawed, we should be asking how these flaws compare with those of human beings.”
It goes on to say, “There is a large body of research on algorithmic decision making that dates back several decades. And the existing studies on this topic all have a remarkably similar conclusion: Algorithms are less biased and more accurate than the humans they are replacing.” (Emphasis mine.)
Calling out algorithmic bias, as David did, performs the essential function of highlighting where historic injustices are being sustained and amplified by algorithms. But the good news is that it is possible to correct algorithms. It is much harder to correct human biases. And once we correct an algorithm, every subsequent decision inherits the fix: the new normal gets perpetuated.
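What does “correcting an algorithm” look like in practice? One minimal version is an audit that runs on every new model before it ships. The numbers and the 0.8 threshold below are made up for illustration, and real fairness metrics and legal standards are far more involved; the point is simply that a bias check on an algorithm can be written down and run automatically, which is not something you can do to a committee of human underwriters.

```python
# Minimal sketch of an algorithmic bias audit over a log of past decisions.
# The data and threshold are invented for illustration.

decisions = [
    {"sex": "female", "credit_limit": 5_000},
    {"sex": "female", "credit_limit": 7_500},
    {"sex": "male", "credit_limit": 15_000},
    {"sex": "male", "credit_limit": 20_000},
]

def mean_limit(group: str) -> float:
    limits = [d["credit_limit"] for d in decisions if d["sex"] == group]
    return sum(limits) / len(limits)

ratio = mean_limit("female") / mean_limit("male")
print(f"female/male mean credit-limit ratio: {ratio:.2f}")  # 0.36 on this toy data

THRESHOLD = 0.8  # illustrative, not a regulatory standard
if ratio < THRESHOLD:
    print("FAIL: disparity exceeds threshold; do not ship this model")
else:
    print("PASS")
```

Run the same check after every retraining and the fix stays fixed; that is the sense in which the new normal gets perpetuated.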
Forget trying to make our algorithms generate the same answers that would have been generated without them. Let’s take advantage of this once-in-a-lifetime opportunity — and make them aspirational.
Ngā mihi mahana (warm regards),
Kaila
Certified Dare to Lead™ Facilitator
Co-founder, Boma Global // CEO, Boma NZ