It is a story that has made its way into all the major news outlets. When tech entrepreneur David Heinemeier Hansson applied for an Apple Card, his credit limit was 20 times higher than the one extended to his wife Jamie, even though they share all their financial accounts, she has lived in the US longer, and she has a higher credit rating.

Not long afterwards, Apple’s co-founder Steve Wozniak tweeted that he too had received an Apple Card credit limit 10 times higher than his wife’s. After the story went viral, Goldman Sachs, the bank underwriting the credit card, intervened and adjusted the credit limits, while denying that the algorithm that determines the limits is biased. Despite these assurances, New York’s Department of Financial Services (DFS) has now initiated a probe into alleged gender discrimination. It took DFS only two days to announce the investigation!

These are the facts. What makes this case special is how clear they are. No notion of fairness can justify offering a woman 5% of the credit offered to a man (a limit 20 times higher means she received one twentieth of his) when they are a married couple with shared finances. Usually algorithmic bias works in more subtle ways. Anecdotal evidence is often suggestive, but only statistical evidence is convincing. For example, the recidivism risk assessment tool COMPAS was used by courts for years until a statistical analysis by ProPublica showed that it is biased against black defendants.
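A toy version of such a statistical check illustrates the point. Everything below is invented for illustration (the records, the field names, the numbers); it merely shows the kind of disparity, a gap in false positive rates between groups, that was at the heart of the ProPublica analysis:

```python
# Toy disparity check in the spirit of the ProPublica COMPAS analysis.
# All records and numbers here are invented for illustration.

def false_positive_rate(records, group):
    """Share of people in `group` who did NOT reoffend but were
    nevertheless labelled high risk by the tool."""
    negatives = [r for r in records if r["group"] == group and not r["reoffended"]]
    flagged = sum(1 for r in negatives if r["high_risk"])
    return flagged / len(negatives)

records = [
    {"group": "A", "high_risk": True,  "reoffended": False},
    {"group": "A", "high_risk": False, "reoffended": False},
    {"group": "A", "high_risk": False, "reoffended": False},
    {"group": "A", "high_risk": False, "reoffended": True},
    {"group": "B", "high_risk": True,  "reoffended": False},
    {"group": "B", "high_risk": True,  "reoffended": False},
    {"group": "B", "high_risk": False, "reoffended": False},
    {"group": "B", "high_risk": False, "reoffended": True},
]

# A single pair of anecdotes proves nothing; rates over many records do.
for group in ("A", "B"):
    print(group, false_positive_rate(records, group))  # A: 0.33, B: 0.67
```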

Apple Card is not the only technology with a bias problem. Amazon spent two years trying to build an AI that could sort the CVs of job applicants, but it couldn’t manage to make the AI fair and abandoned the effort in 2017. Its AI penalised CVs containing the word “women’s,” as in “women’s chess club captain,” and preferred verbs such as “executed” and “captured” over mentions of actual coding skills. Other companies, however, are undeterred. HireVue is marketing AI-based software that analyses videos of candidates for verbal and non-verbal cues to their suitability for the role. It is hard to imagine this software being unbiased.

At the same time, not all discrimination is algorithmic. Lena Felton’s article puts the incident into historical context. The Apple Card is not the first time women have been discriminated against when it comes to obtaining credit; it is merely the most recent incident. In the 1970s a woman in the UK needed the signature of her father or husband to obtain credit. While those times are gone, past discrimination still affects women today. A credit score is influenced by past credit history, and so a reduced ability to obtain credit in the past will lead, statistically, to lower credit scores today. Yesterday’s discrimination creates today’s discrimination. And since today’s credit data is used to train the AI models that will make credit decisions tomorrow, unless our approach to AI changes, today’s discrimination will create tomorrow’s discrimination.
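A minimal sketch makes that loop concrete. Everything here is invented for illustration: a “model” that sets credit limits proportional to observed credit history, retrained each generation on the data its own decisions produced.

```python
# Toy feedback loop, with invented numbers: the "model" sets credit limits
# proportional to observed credit history, and history grows with the
# limit granted. Group B starts lower only because of past discrimination.

history = {"group_A": 100.0, "group_B": 50.0}  # past access to credit

for generation in range(3):
    # "Train" on yesterday's data: limit is proportional to history.
    limits = {g: 0.5 * h for g, h in history.items()}
    # Today's limits become tomorrow's history: usage tracks the limit.
    history = {g: history[g] + limits[g] for g in history}
    print(generation, limits)

# The initial 2:1 gap never closes: each generation of the model
# re-learns yesterday's discrimination from the data it produced.
```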

Being fair is hard. Apple is one of the big tech companies with deep expertise in AI, and Goldman Sachs is a sophisticated bank, and yet they got this wrong. Why? In cases such as this it is worth remembering Hanlon’s razor: “Never attribute to malice that which is adequately explained by stupidity.” Depending on the situation, “stupidity” can be replaced by “neglect”, “ignorance” or “laziness”, and here we see signs of all three. Goldman Sachs claims that its algorithm looks only at personal credit history, without taking marital status into account. But in a marriage finances are shared, and a credit card can be issued in the name of one partner while the other partner uses a supplemental card on the same account. This leaves one partner with a more limited credit history without making them a higher-risk borrower. And according to an insider: “Goldman was aware of the potential issue before it rolled out the Apple Card in August, but it opted to go with individual accounts because of the added complexity of dealing with co-signers or other forms of shared accounts […]”
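To see how this plays out mechanically, consider a hypothetical scorer. The formula and its inputs are invented for illustration, not Goldman Sachs’s actual model; the point is only that a scorer which sees nothing but individual history will treat the supplemental-card spouse as a thin-file applicant.

```python
# Hypothetical limit formula that sees only individual credit history.
# An invented illustration, not Goldman Sachs's actual model.
def credit_limit(years_of_history, own_accounts):
    return 1000 * years_of_history * own_accounts

# Same household, same shared finances, same actual risk.
primary = credit_limit(years_of_history=15, own_accounts=4)      # card in their own name
supplemental = credit_limit(years_of_history=2, own_accounts=1)  # spent via a supplemental card
print(primary, supplemental)  # 60000 vs 2000: the gap reflects the data, not the risk
```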

Where does this leave us? Apple has released a shiny new credit card and packaged it together with a sexist algorithm for determining the credit limit. No doubt it will serve as a cautionary tale for future students of data science. Both Apple and Goldman Sachs will scramble to fix the problem, and they will succeed, at least as far as publicised egregious cases are concerned. Whether they will eliminate bias at the statistical level is both more doubtful and more difficult to verify. The news cycle will move on. There will be another biased AI to write about.

But one thought remains. What if David hadn’t applied for an Apple Card? Jamie would still have received a low credit limit, but she would not have known that her limit was too low, because what would she have compared it against? Thus, in a world where the decision process comes down to “It’s just the algorithm,” women are not only discriminated against, the discrimination is also hidden from them, unless they can enlist the help of men to uncover it. Besides adding insult to injury, this also illustrates Caroline Criado Perez’s thesis in her book Invisible Women: not only is the world biased against women, often enough the data necessary to show the bias is not even being collected.