Apple Card is facing a formal investigation by Wall Street regulators over gender discrimination allegations made in a viral tweet

Nov 10, 2019, 21:40 IST

Hollis Johnson/Business Insider

  • A viral tweet has prompted a formal investigation of the newly launched Apple Card for alleged gender discrimination in the way it sets and determines credit limits.
  • Web programmer and author David Heinemeier Hansson shared on Twitter that he was offered 20 times the credit limit of his wife, and when he confronted customer service representatives, they were dismissive of the issue.
  • In response, a Wall Street regulator is opening a probe into Goldman Sachs Group Inc.'s algorithmic practices around the Apple Card, Bloomberg reported.
  • Visit Business Insider's homepage for more stories.

A viral tweet thread has prompted a formal investigation into alleged gender discrimination in the Apple Card's credit limit algorithm.

Apple's new credit card, which operates almost entirely from the iPhone's Wallet app, officially launched in September to significant fanfare. However, on Thursday, web programmer and author David Heinemeier Hansson shared a series of tweets detailing how he was offered 20 times the credit limit of his wife, despite the couple filing joint tax returns and his wife having a higher credit score.

Tweet (https://twitter.com/mims/statuses/1192540900393705474): "The @AppleCard is such a fucking sexist program. My wife and I filed joint tax returns, live in a community-property state, and have been married for a long time. Yet Apple's black box algorithm thinks I deserve 20x the credit limit she does. No appeals work."

After contacting customer service, Hansson said, representatives insisted the discrepancy was the result of an opaque algorithm and insinuated there was an issue with his wife's application.

The tweets - the first of which currently has more than 12,000 likes and 5,000 retweets - have since garnered the attention of notable figures, including Apple co-founder Steve Wozniak, who shared that he had a similar experience when he applied with his wife.


Tweet (https://twitter.com/mims/statuses/1193330241478901760): "The same thing happened to us. I got 10x the credit limit. We have no separate bank or credit card accounts or any separate assets. Hard to get to a human for a correction though. It's big tech in 2019."

In response, a Wall Street regulator is formally launching an investigation into Goldman Sachs Group Inc.'s algorithmic practices around the Apple Card, Bloomberg reported.

"The department will be conducting an investigation to determine whether New York law was violated and ensure all consumers are treated equally regardless of sex," a spokesman for Linda Lacewell, the superintendent of the New York Department of Financial Services, told Bloomberg. "Any algorithm, that intentionally or not results in discriminatory treatment of women or any other protected class of people violates New York law."

However, in a statement, Goldman denied engaging in discriminatory practices when determining and setting credit limits.

"Our credit decisions are based on a customer's creditworthiness and not on factors like gender, race, age, sexual orientation or any other basis prohibited by law," Goldman spokesman Andrew Williams told Bloomberg.

Hansson said on Twitter that consumers deserve more insight into the credit-limit process and that companies like Goldman should delineate their methodologies.


"It should be the law that credit assessments produce an accessible dossier detailing the inputs into the algorithm, provide a fair chance to correct faulty inputs, and explain plainly why difference apply," Hansson wrote in a tweet. "We need transparency and fairness."
