
Does Goldman Sachs' online bank Marcus have an Apple Card gender issue?

Apple and Goldman Sachs face accusations that the algorithms behind the companies' joint iPhone-based credit card can discriminate against women. But the Apple Card isn't the only Goldman venture that could be ripe for claims of gender bias.

The investment bank's online banking platform, Marcus, which the Wall Street firm launched a few years ago to cater to middle-income millennials, parses the personal information that goes into its lending algorithm in much the same way the Apple Card does.

That's not a surprise. Goldman developed the technology used to approve borrowers for the tech giant's Apple Card, which launched in mid-August. But problems soon cropped up. Tech entrepreneur David Heinemeier Hansson tweeted that he was offered a borrowing limit 20 times higher than the one his wife received, despite her higher credit score. Even more embarrassing, Apple co-founder Steve Wozniak then tweeted that his wife had encountered a similar problem.

Presidential hopeful Senator Elizabeth Warren jumped into the fray, saying Goldman's proposed remedy — that women who believe they have been discriminated against should contact the bank — fell short. The onus should be on Goldman to explain how its algorithm works, and if that's not feasible, "they need to pull it down," Warren said.

The state of New York is also investigating. Linda Lacewell, superintendent of the New York Department of Financial Services, said in a post on Medium that she would examine whether Goldman's algorithm violated state bias laws in how it makes credit limit decisions. 

"It's a problem," said University of Berkeley law professor Robert Bartlett, who has studied the issue. "Clearly there is legal risk, even though it's possible that those credit decisions — if ultimately rooted in income and credit scores — are entirely legal."

Apple Card doesn't fall far from lending tree

The controversy comes at a time when a number of tech giants are jumping into the consumer finance industry. Last week, Google announced it would soon begin offering checking accounts.

It also comes as more research suggests that the algorithms these new lenders use do not eliminate, and in some cases may even amplify, traditional biases against minorities and other groups.

Earlier this month, Bartlett and four Berkeley economics professors released a revised version of their research paper on bias and fintech lenders. The paper found that lenders relying on an algorithm rather than traditional loan underwriting charged African-American and Latino borrowers interest rates that were 0.05 percentage points higher. Overall, that difference cost minority borrowers $765 million in extra interest per year, the researchers said.
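The size of that gap is easier to grasp with a rough, back-of-the-envelope calculation. In the sketch below, the $300,000 balance is purely illustrative and not a figure from the study; only the 0.05-percentage-point gap comes from the paper.

    # Back-of-the-envelope illustration of a 0.05-percentage-point rate gap.
    # The $300,000 balance is a hypothetical example, not a study figure.
    balance = 300_000          # hypothetical outstanding mortgage balance
    rate_gap = 0.05 / 100      # 0.05 percentage points, expressed as a decimal

    extra_interest_per_year = balance * rate_gap
    print(f"Extra interest per year: ${extra_interest_per_year:,.0f}")   # -> $150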

"The problem is not exclusive to Apple," said Adair Morse, one of the paper's co-authors. "Apple and Goldman are not the only ones who have built their algorithms in ways that result in this exact type of disparate treatment by gender." 


The study focused on mortgage lending and didn't look at either Apple Card or Marcus. But the researchers cite Marcus as a lending platform that could run into the same problems of bias documented in their study.

Goldman said potential bias concerns about Marcus, which has almost $5 billion in loans outstanding, are unfounded.

"Goldman Sachs has not and will never make decisions based on factors like gender, race, age, sexual orientation or any other legally prohibited factors when determining credit worthiness," a Goldman spokesman said in an emailed statement.

Goldman's explanation

Goldman maintains that the allegations of bias stem not from its algorithm but from a legitimate business decision to allow only individual accounts when applying for loans.

Marcus, like Apple Card, does not allow joint borrowers or any form of co-borrower or co-signer on a loan. Unlike Apple Card, however, Marcus does allow individuals to list their total household income on their loan application. It's just not easy for applicants to find that option.

A reference to household income on the Marcus website is buried in the sixth slot of a drop-down menu of income sources. The eligibility of household income is also disclosed in the 24th question on the online bank's FAQ page, "What types of documents are accepted to demonstrate proof of income?" The third paragraph of the answer notes that applicants who include household income must provide documentation; individual income requires no such verification.

Hansson, who wrote the original tweet that sparked the Apple Card controversy, said his wife was a stay-at-home mom with no direct source of income. 

A Goldman spokesperson explained to CBS MoneyWatch: "We look at an individual's income and an individual's creditworthiness, which includes factors like personal credit scores, how much debt you have and how that debt has been managed. Based on these factors, it is possible for two family members to receive significantly different credit decisions." 
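One way to see how strictly individual underwriting can split a household is with a deliberately simple sketch. The function, thresholds and weights below are invented for illustration and are not Goldman's actual criteria; they only mirror the kinds of inputs the spokesperson describes.

    # Toy sketch of an individual-only credit decision. The inputs mirror the
    # factors Goldman describes (individual income, credit score, existing debt),
    # but the formula and cutoffs are invented for illustration.
    def toy_credit_limit(individual_income, credit_score, monthly_debt):
        if credit_score < 660 or individual_income == 0:
            return 0  # declined: weak score or no individual income
        debt_to_income = (monthly_debt * 12) / individual_income
        if debt_to_income > 0.4:
            return 0  # declined: too much existing debt
        return round(individual_income * 0.2)  # limit scaled to individual income

    # Two members of the same household, evaluated separately:
    print(toy_credit_limit(individual_income=150_000, credit_score=700, monthly_debt=2_000))  # 30000
    print(toy_credit_limit(individual_income=0, credit_score=780, monthly_debt=0))            # 0

Even with the higher credit score, the second applicant is declined, because household income never enters the calculation.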

Indeed, in reporting this story, the reporter and his wife each applied individually for a $40,000 loan from Goldman's Marcus online platform. The reporter's wife was approved for a three-year, $20,000 loan at an annual interest rate of 7.99%. The reporter, a man, was rejected.

Single borrowers face a disadvantage

The larger problem, experts said, is that offering only individual accounts, based heavily on individual incomes, would likely lead to men being issued higher credit limits at lower interest rates. A 2006 study from the National Community Reinvestment Coalition found that joint male and female borrowers "enjoyed more favorable outcomes than either male or female borrowers" on their own. What's more, NCRC found that individual borrowers were more likely to end up in pricier subprime loans than joint borrowers.

"Typically, the fact that joint borrowers are treated differently than single borrowers creates a disparate impact on African-American women because they are statistically more likely to be single mothers," said NCRC chief executive Jesse Van Tol.

According to the Berkeley study, it's illegal for lenders to create a lending algorithm that results in disparate treatment of minority groups, even if that disparate treatment was not intentional. The one exception, the study notes, is when the decision was made for a legitimate business reason.

Biased data in, bias out

A source informed of Goldman's thinking said the reason for not allowing joint Apple Card accounts is that the account is tied to an individual's iPhone, and the bank and Apple thought tying the card to two phone accounts could create a cybersecurity risk. Last Wednesday, though, in the wake of the controversy, Goldman said it would soon introduce the ability for family members to share an Apple Card credit line.

Back in the 1970s, banks were criticized for forcing women to sign credit card applications with their husbands. Sarah Harkness, a University of Iowa sociologist who has studied issues of gender and credit, said that either forcing women to borrow with a male partner or barring joint borrowing altogether can result in gender bias.

"There is a strong historical component in acknowledging the family unit as source of financial support," said Harkness, adding that the bigger issue with lending algorithms is they tend to reflect a society's historical biases. "If the algorithm is based on credit history data that are biased, then you are going to get disparate treatment no matter what algorithm is used."
