Jamie Heinemeier Hansson had a better credit score than her husband, tech entrepreneur David. They have equal shares in their property and file joint tax returns.
Yet David was granted a borrowing limit on his Apple Card 20 times higher than the one offered to his wife.
The situation was far from unique. Even Apple’s co-founder Steve Wozniak tweeted that the same thing happened to him and his wife despite having no separate bank accounts or separate assets.
The case has caused a stink in the US. Regulators are investigating. Politicians have criticised Goldman Sachs, which runs the Apple Card, for its response.
What the saga has highlighted is concern over the role of machine learning and algorithms – the rules that govern computer calculations – in making decisions that are clearly sexist, racist or discriminatory in other ways.
Society tends to assume – wrongly – that computers are impartial machines that do not discriminate because they cannot think like humans.
The reality is that the historic data they process, and perhaps the programmers who feed or create them, are themselves biased, often unintentionally. Equally, machines can draw discriminatory conclusions without ever asking explicit questions: a system can end up treating men and women differently even though it never requests gender information, because other data it does see acts as a stand-in for gender.
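To make the proxy problem concrete, here is a minimal illustrative sketch using entirely invented, simulated data (nothing here reflects how Goldman Sachs or any real lender actually scores applicants). The "model" is never shown gender, only a hypothetical feature that happens to correlate with gender in the historic data, yet its outputs still skew by gender:

```python
import random

random.seed(0)

# Simulated applicants. Gender is recorded only so we can audit the
# outcome afterwards; the scoring function below never sees it.
people = []
for _ in range(10_000):
    gender = random.choice(["M", "F"])
    # A hypothetical proxy feature (e.g. a spending-pattern score) that,
    # in this invented historic data, correlates with gender.
    proxy = random.gauss(1.0 if gender == "M" else 0.0, 0.5)
    people.append((gender, proxy))

def credit_limit(proxy_score):
    # A naive scoring rule based only on the proxy feature.
    return 10_000 + 5_000 * proxy_score

# Audit: average limit per gender, even though gender was never an input.
avg = {}
for g in ("M", "F"):
    limits = [credit_limit(p) for gg, p in people if gg == g]
    avg[g] = sum(limits) / len(limits)

print(avg)  # limits skew higher for "M" despite gender never being asked for
```

The point of the sketch is that removing the sensitive attribute from the inputs does not remove the bias: as long as some other feature encodes it, the outcome can still split along gender lines.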
How are our lives affected?
A whole range of issues in our daily lives have been…