I suspected the root cause of this situation was the income signal, not credit score. I bet that Goldman Sachs is not correctly accounting for California community property laws, where 50% of the household income is hers. This seems like the exact type of oversight that:
- Is not “technically” using gender as a signal
- Ends up practically causing gender-based unfairness
- Is simply bad data, and I would consider it a bug
- Is something customer service is not going to be helpful with
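A minimal sketch of the suspected oversight (the function and numbers here are hypothetical illustrations, not Goldman's actual model): if an underwriter ignores community property rules and scores the non-earning spouse on solo income alone, her income signal collapses to zero.

```python
# Hypothetical sketch of the suspected bug: income attributed to a
# non-earning spouse when community property rules are ignored vs. applied.
# The function name and numbers are illustrative only.

def attributed_income(solo_income, household_income, community_property):
    """Income an underwriting model might credit to one spouse."""
    if community_property:
        # In a community property state, half of all marital income
        # legally belongs to each spouse, regardless of who earned it.
        return household_income / 2
    return solo_income

# A "homemaker" spouse in a household earning $400k:
print(attributed_income(0, 400_000, community_property=False))  # 0
print(attributed_income(0, 400_000, community_property=True))   # 200000.0
```

Under the first branch the model sees a $0 earner; under the second, a $200k earner — easily enough to explain a 20x credit-limit gap without gender ever appearing as an input.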
I’m glad this issue came to light. I hope it leads to productive conversations about black-box algorithms and underwriting.
I'm pretty sure household income is a federal thing coming from the Credit Card Act of 2009, not just CA community property laws. Every time I apply for a credit card it's been pretty clear they want household and nonwage income, not a solo salary.
Either way, it's baffling and I think this explains why my wife got such a piss poor credit limit on her Apple Card, something like 1/5 of another line of credit she has. I didn't even bother applying after seeing what she got.
The joke, I guess, is on us, because we're both female.
I assume they pull income from a third party source to supplement the self-reported income. The third party income would be the source of the discrepancy.
> I bet that Goldman Sachs is not correctly accounting for California community property laws
So what you are saying is that GS is conducting banking without following all regulations.
Color me surprised.
I am so tired of financial companies, big and small, flouting regulations because they think they know better. Regulations, especially financial ones, exist for a reason.
You make fair points, but the conclusive "There's no endpoint to meet your request" is simply unacceptable. Are profit margins maximized by inconveniencing real humans due to a lack of endpoint management/assessment? Just because a computer stands between me and your business doesn't mean your business can slack where classical models may not. I find that all the large tech companies have an appalling lack of technical support or assistance available, and not only that, but there's no way for a smart person to reach someone and file a real complaint. You have to tweet about it to get any fairness. Apple decides to become a bank. How about deciding to become a customer-facing organization first? /rant
I’m curious how much income really matters for credit decisions since I thought it was typically self-reported and not verified (although perhaps they also use third party non-verified sources to increase confidence), but to the extent it is a factor, I wouldn’t give much weight to living in a community property state either:
1. You can effectively opt out of community property rules via a pre-nup.
2. You can move states to a non-CP state.
Those are two things a lender can't easily verify or control for, so it makes sense to just discount them.
If that is what’s going on here, it’s not really gender-biased in my mind; especially in today’s world, an increasing number of women earn more than their male counterparts, so it’s just biased towards those with high income.
> I bet that Goldman Sachs is not correctly accounting for California community property laws, where 50% of the household income is hers.
Note that this is just in the absence of a prenuptial agreement that states a different allocation.
I've never been married, nor helped any married person apply for any kind of loan or credit, so have no idea what extra information is asked for if you check the "married" box instead of the "single" box on an application. Do the financial institutions ask married people about any prenups they are under?
Pre-nups in the US (maybe elsewhere too, but I am not knowledgeable enough to speak about other countries in this respect) are not magic wands that can stop the income earned by the household from being shared. The best use case for pre-nups is protecting the assets you had before marriage. Of course, you can write whatever you want in one, including something like "in case of a divorce, person X gets all of the earnings they accumulated during the marriage, while person Y gets theirs, no 50/50 split", but it is all but guaranteed to be thrown out by courts during the divorce proceedings.
tl;dr: pre-nups won't allow you to avoid splitting your earnings made during the course of marriage 50/50.
I have just read through 3 out of 4 links you posted, and I cannot find anything that disproves my earlier point. Can you cite a specific part that you feel supports your point?
The closest I could think of was from LegalZoom (3rd link), and it says:
"In equitable distribution states, property acquired during the marriage belongs to the spouse who earned it. In a divorce, the property will be divided between the spouses in a fair and equitable manner. There is no set rule for determining who receives what or how much, but a variety of factors are considered. For example, the court may look to the relative earnings contribution of the spouses, the value of one spouse staying at home or raising the children, and the earning potential of each. Often, each spouse will receive one-third to two-thirds of the marital property."
But that doesn't seem to disprove my point, because it still has the "In a divorce, the property will be divided between the spouses in a fair and equitable manner". So even if technically yes, the property you individually acquired during the course of marriage is considered yours, it will still be divided between both spouses in a "fair and equitable manner".
It seems weird that a firm like Goldman could miss the details of a California law. California has a bigger population than the 21 smallest states combined.
That's her point about diverse teams building better products. Bugs that affect the team building a product are much more likely to be found and fixed than bugs that do not affect anyone in the builders' social circle.
We're all very smart here for considering justifications as to why this happened, but please don't get lost in the weeds. The financial calculation here was ridiculously broken, and it needs to be fixed at almost all costs. As folks who use algorithms as tools, we should use this story as a reason to be more accountable. It's tone deaf to publicly postulate a lack of sexist intention for the sake of women reading this. We should solely be exploring how to fix this massive error.
Maybe millionaires aren't the favored customers of credit card companies. They'd get a bit more on interchange fees than the average customers but they'd never run a balance and pay interest or late fees.
Completely gender-oblivious algorithms, keyed to actual profitability differences between the two distinct borrowers, could've caused the discrepancy, leaving nothing at all "ridiculously broken".
That super-wealthy married partners don't always get the exact same credit line, when applying separately, is not particularly an issue of justice that "needs to be fixed".
(If their finances are truly completely merged, the simple fix is: whoever gets the bigger credit line asks for a 2nd card on their account – though I see Apple Card may not yet offer that traditional option.)
The mechanical assumption that married partners who apply separately must get the same credit line as each other is more patronizing, simple-minded, and disrespectful of their separate life-paths than the alternative.
If wife JHH were not just rich but also famously so (within a certain industry), while conversely husband DHH was an "extremely private person" whose stated profession for the last few years (or maybe decade?) was "homemaker", I'd not be surprised for her to get (& deserve!) a credit line that's 20x his. And there’d be nothing “ridiculously broken” about that, either.
Given community-property laws, yes, married partners more than a couple of years into their marriage ought to always get the same results. If not, then something is broken, because things are being taken into account that shouldn't be, by law.
Whether they practically share all finances on a daily basis or not, legally all of their property and income is shared.
> married partners more than a couple of years into their marriage ought to always get the same results
Credit card approvals are based on reported income to a large degree. It's likely that she put down a different number than him and/or the algorithm didn't properly account for their joint income.
Also, if they were to divorce, given she's not working and he's a famous partner at a profitable company, his earning power is significantly higher than hers. It's not unreasonable to set his limit higher based on that.
That's not sufficient, unless community-property laws also perfectly predict both:
(1) likelihood of repayment & ease of collection; and
(2) net-profitability of a customer, after all fees, charge-reversals, support-costs, etc are considered.
You're under the hypnotic sway of a legal fiction – "community property" – that paints an attractive, childlike simplicity over the situation that doesn't exist in reality.
On the other hand, credit companies have real data about repayment rates, collection yield, & lifetime profitability.
And even with the strongest possible community property laws, in the event of disputes (& divorces!), there's no magic wand a creditor can wave to costlessly collect from DHH if JHH defaults, or vice-versa.
Even if the creditor is completely in-the-right, essentially, all they can do is send threatening letters & make adverse credit reports – which will have differential effectiveness against individuals despite any legal fiat claiming they're the same financial unit.
And whether DHH would be on the hook for JHH's debts is murky & jurisdiction-dependent. Only some US states use "community property" rules to make spouses jointly responsible for each others' debt – and notably, Illinois (where DHH's partnership/CTO position in 37signals is based) isn't one of them. (See: <https://www.nolo.com/legal-encyclopedia/am-i-liable-my-spous...>)
So what's the legal or statistical basis for your implication that "community property" makes them identical credit- and profitability- risks? (Has the couple even revealed that they're in a "community property" jurisdiction?)
This paints a great picture, thank you. You're not arguing whether she deserves it, but whether the law actually ties the two together properly. My point here however was to not only fix the algorithms but the laws too. Idealistic, I know.
But what ideals require that spouses' assets & liabilities be even more strongly linked, and any offered credit always identical, and that it should be even easier for creditors to pursue people for their spouses' debts?
(As you've not described these ideals, just asserted them, is the rationale religious in nature?)
The fact that jurisdictions as varied as California, Illinois, most other US states, Spain, & much of the EU don't yet do that is evidence to me that any "idealistic" case driving such a determination mustn't be too widely held.
And many couples prefer the separate treatment that's allowed them by current law. (Can you imagine the rage from a DHH-type, in mirror-universe where he and his wife had worked to maintain separate finances, but the law clipped his credit based on her income & repayment history? It'd be pretty similar!)
Finally, even if the laws of every jurisdiction, from towns to the UN, required such a link – it still wouldn't be identically costly/profitable to lend to each spouse of a pair, or equivalently costly/easy to collect in difficult situations.
Married partners don't become identical in earning power, conscientiousness, concern about their credit-rating, profitability, etc by either the act of marriage or any legislative fiat by people who wish that might be so.
In matters of the heart, sure, defer to romantics who see marriage as a total, unquestionable unification of two people into one eternal unit. In actuarial matters of lifetime probability and credit risk, defer to the people who have the real data & money on-the-line: creditors.
I've always assumed the origins of marriage were religious in nature and that the only legal purpose was to join finances. But after doing a google search to not look like an idiot, it seems like marriage was started to legitimize children, and that religion didn't officially get involved in marriage until around the 8th century. So I do have to say, my "idealistic" side is falling apart at this point. I still have my concerns about her having 5% of the credit limit of her husband based on current laws. The difference in credit limit here is so great that it seems at first glance to be a general calculation mistake. I don't think we can safely take sexism off the table (if their lives were swapped but not their gender, would DHH have received a credit limit that was 5% of his wife's?). Unfortunately, it sounds like we'll never know what variables were at play here and how they impacted the calculation.
The crux of this issue is that Apple Card is only for individuals:
>“As with any other individual credit card, your application is evaluated independently,” Williams said in a statement. “We look at an individual’s income and an individual’s creditworthiness, which includes factors like personal credit scores, how much personal debt you have, and how that debt has been managed.”
That setup makes it “possible for two family members to receive significantly different credit decisions,” Williams said. He added that the bank is actively exploring ways to allow users to share their Apple Card with family members.
Goldman was aware of the potential issue before it rolled out the Apple Card in August, but it opted to go with individual accounts because of the added complexity of dealing with co-signers or other forms of shared accounts, according to a person with knowledge of the matter.
She mentions she has a better and longer credit history than David, and since they share all financial accounts, she likely reported her income as their joint income. I sincerely doubt they'd be raising this much fuss if she'd put $0 in the income box and was surprised.
Her point is that all those factors showed her as more credit worthy with the exception of 'Homemaker' and 'Female'.
That's not exactly the way credit works. For one, you're supposed to report your own income, not joint income, when filling out applications. You can however list joint assets, or at least that's much fuzzier.
EDIT: Apparently, you can use household income. However, if two parties are applying for a credit card and trying to secure both lines with the same assets, you should expect one to be lower (if not outright declined).
In this case, there's a FICO score to consider, but there's also other components. For instance, you can be declined from receiving a credit card if you are a "churner", you can also be declined for not making the bank enough money (aka you never miss payments and the card has no fees).
Finally, and importantly, the bank has the right to set whatever bar it wants for your credit limit. If they feel giving you a high limit will make them money, they do it; if they don't, they won't ("they" here being encoded in an algorithm). They can even legally assign credit at random, provided they have controls to ensure the bank doesn't go bankrupt.
In this case, the only thing the bank has to prove is that they didn't use signals illegally (usually this means protected class information, e.g. gender, among others). Many banks don't even keep a record of that information, and definitely not as part of the algorithm. I suspect Goldman wouldn't be that stupid.
Meaning: whatever the reason for the discrepancy, gender was very, very likely not the culprit here.
Before 2013, the CARD Act required that card issuers take into account only the individual card applicant’s income or assets. However, the Consumer Financial Protection Bureau put in an amendment in 2013 that allowed card issuers to also consider any third-party income and assets that an individual card applicant of at least 21 years of age has a “reasonable expectation of access to.”
That's fair, some places are explicit and still lay out "individual", but legally you don't have to.
In either case, I'd also note that, because they are a married couple, they are counting that income twice. Which, for offering credit, is a bit dangerous, as this income "secures" the credit line. I'm more surprised it didn't result in an outright decline if they didn't separate their income when applying.
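A quick sketch of that double-counting hazard (made-up numbers, not anyone's real finances): if both spouses report the full household income on separate applications, the lender underwrites the same dollars twice.

```python
# Sketch of the double-counting hazard: two separate applications,
# each reporting the same full household income. Numbers are fabricated.

household_income = 300_000  # the only income that actually exists

applications = [
    {"applicant": "spouse_a", "reported_income": household_income},
    {"applicant": "spouse_b", "reported_income": household_income},
]

# Total income the lender thinks is backing the combined credit lines:
underwritten = sum(app["reported_income"] for app in applications)
print(underwritten)  # 600000 -- twice the income that really exists
```

If the lender's model detects this (e.g. via shared address or joint tax filing), cutting the second applicant's limit, or declining outright, would be a rational response rather than a gendered one.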
> For one, you're supposed to report your income not joint income when filling out applications.
Not necessarily, American Express promotes using household income on their card applications.
Edit: From an American Express application:
"Include all income available to you. If under age 21, include only your own income. Income includes wages, retirement income, investments, rental properties, etc."
> she likely reported her income as their joint income
It seems incredibly unlikely that she reported a substantial joint income and received a $57 credit limit. Doesn't it seem far more likely that he reported their joint income and she reported $0 income?!
I don’t equate hypothesizing about the contents of a black-box algorithm with conspiracy-mongering. We’re looking at all possible inputs.
To add some anecdata, my relative in the financial software business has been telling me for years that they’ve been looking at mining public social media data to enhance risk management. I wouldn’t put it past Goldman (and their recent push into public banking) to consider this.
And the Chinese have just come right out and done it.
If Goldman is doing this, they would have been caught by now with real data, not some entitled asshole whining about how his wife isn't getting the same exact credit limit as he is. Someone would be citing apples-to-apples comparisons by now. Yet nobody has. Hmm. Gee. I wonder why?
I have no idea how this works, but wouldn't they look at past income history to handle the possibility of divorce and being able to repay the debt? Or is that illegal?
This would make sense to me, as DHH probably has an insanely high individual income history.
From what I understand of the original post, they filed taxes jointly and his wife had a higher credit score (though there isn't information about the credit history details, just on the overall score). This would suggest she should have received about the same credit amount, or more. To be fair, it isn't clear if she had more personal debt, which based on the official response, might have caused the difference in credit limits.
Seriously. My credit score took a hit after I paid off my car and my credit cards. The justification I was given is that they penalize you if you don’t have some kind of ongoing debt you’re servicing.
Going off on Twitter seems to be a good way to get attention but how does an average Joe get help for company "policies"?
We need a service where companies and their customer service, their policies and their practices are openly shamed, feedback gathered and voices heard without the need for twitter (and being famous enough on it to get world-wide attention). An issue board of the sorts that companies can go to and see what their users are complaining about and it needs to be an independent service.
Corporations have no one person that can answer, enforce, amend or question their protocol, so this whole thing becomes an attention contest on twitter or HN or whatever. It is an uphill battle until that "one person", may be a CEO or VP gets informed about it from their PR team. They don't have infinite bandwidth to listen to hundreds of complaints and many important ones go amiss. Even if the leaders of the corporations have aligned intentions.
I can't remember how many times I've seen major issues about a product or service raised through Twitter and posted on HN. This needs to change somehow, since we are only hearing from the top 0.01% of people who took the chance to speak up on Twitter.
> We need a service where companies and their customer service, their policies and their practices are openly shamed, feedback gathered and voices heard without the need for twitter (and being famous enough on it to get world-wide attention).
I mean, the point of government agencies and watchdogs is to do just this. However, most of them have been bought off or had their teeth pulled by greedy capitalists.
We don't need a "service" - you're just putting more privilege behind a paywall. We need to hold legislators and regulators accountable to do their damn job.
Nowhere did I mention that we need a paid or free service.
The argument at hand is that we need something other than Twitter to blast out a message that most likely won't be heard. A service (again using the most general definition) that collects feedback from their users and provides feedback to large corporations that can look up the most pressing issues.
Today, that's through feedback surveys, support forums, BBB/Consumer Reports and worse, Twitter.
Could you provide a more substantive argument? I like the idea of a government website service that collects feedback en masse and shows the top-voted concerns for each corporation. What gets done about them, and the "enforcement" part, is I think an entirely different problem, and that's what laws are for. I am simply talking about the fragmentation of the ways the public complains to corporations. A .gov website where the public can vote would be amazing. And truly funded by the public through taxes.
Wonderful post, so glad that Mrs. Hansson stepped out for this cause, as uncomfortable as it was.
I've personally wondered at credit scores + credit offered for a long time. The amount of regulation provides wonderful cover for these big institutions to really do whatever they want and lean back on "algorithms" / that they can't override the policy due to regulations, etc. Meanwhile, it's becoming more clear that the algorithms have encoded biases. In addition, the whole credit system is not one of those things you can opt out of (if you ever want a home or car, which I get, not everyone needs), and the whole thing is based on previous borrowing, so responsible (using cash) or underprivileged (never had the chance to start the credit bootstrap) folks are extremely disadvantaged.
The interesting twist in this story that I find damning is that Apple actually overrode the algorithm due to pressure -- this pokes a HUGE hole in this "cover" these institutions have created to date.
There are a lot of comments here pointing out that this was probably triggered by her presumably having no/little income in comparison to her husband, despite a higher credit score... and that this is a mistake on Goldman Sachs' part not taking into account their married status and income sharing.
BUT every credit card I've ever applied for has asked for my self-reported income. Because there's no official source a credit card can use to verify your income. Unlike debts, your income doesn't get reported to any agency. It's private.
And since 2013, a "homemaker" can and should put down their entire household income, not their individual income:
> The Credit Card Act of 2009 requires credit card companies to take “the ability of the consumer to make the required payments” into account when deciding whether to approve an application... A 2013 amendment to the federal regulations surrounding the Card Act expanded the definition of one’s ability to pay so that people 21 and older can include any income to which they have a “reasonable expectation of access.” This can include income from a spouse, partner or other member of your household. [1]
So I'm guessing perhaps she simply made the mistake of not reporting entire household income?
No -- that's my point -- there are no third party data sources on income, as far as I'm aware.
This is why credit cards ask your income in the first place, and why landlords will often ask for a copy of your tax return from last year as proof, along with your last two pay stubs. Because there's nowhere else for them to get it or verify it, certainly not at scale.
Technically, background reports and credit screenings can include income (when running them for apartment rentals, they usually come back with the applicant's income), but it's almost always off by some amount and can also come back with a big "UNKNOWN".
Kind of tangential, but wow, great writing. I knew nothing about the issue before this post, but her text was honest, smart, self-aware, generous, emotional, inspiring, informative, all that and still very concise (something that I value a lot). I will bookmark this for future reference to great writing about uncomfortable situations.
> I had a career and was successful prior to meeting David, and while I am now a mother of three children — a “homemaker” is what I am forced to call myself on tax returns — I am still a millionaire who contributes greatly to my household and pays off credit in full each month.
Could it be as simple as being listed as a “Homemaker” vs direct employment? Seems plausible to me.
> But AppleCard representatives did not want to hear any of this. I was given no explanation. No way to make my case.
I’d be impressed if you could get anyone on the phone at any financial institution to explain the proprietary inputs to any calculation. Not only would they not have access to that information, it’d be so off-script to reveal it that there’s no flow chart of comments you could make to get it.
> I’d be impressed if you could get anyone on the phone at any financial institution to explain the proprietary inputs to any calculation. Not only would they not have access to that information, it’d be so off-script to reveal it that there’s no flow chart of comments you could make to get it.
As blackbox machine learning algorithms are increasingly being used for these proprietary calculations, people are starting to realize that this sort of explanation probably does need to be made available.
It's okay to use machine learning to do real-time image segmentation that can't be done robustly any other way.
But it probably shouldn't be okay to use machine learning as "bias laundering" for business decisions that legally should be challengeable. You shouldn't be allowed to hide behind "it's not me, it's the algorithm!" if your algorithm is causing you to look like you're discriminating against a protected class.
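As a toy illustration of this "bias laundering" (entirely fabricated data, not any real lender's model): a scoring function that never reads gender can still produce a gendered disparity through a correlated proxy like occupation.

```python
# Toy demonstration of proxy discrimination: the model never consults
# gender, yet a feature correlated with gender in the population
# ("homemaker") reproduces a gendered outcome. All data is fabricated.

applicants = [
    {"gender": "F", "occupation": "homemaker", "score": 780},
    {"gender": "F", "occupation": "engineer",  "score": 740},
    {"gender": "M", "occupation": "engineer",  "score": 750},
    {"gender": "M", "occupation": "lawyer",    "score": 720},
]

def credit_limit(applicant):
    # Gender is never read here...
    base = applicant["score"] * 20
    # ...but penalizing "homemaker" (an occupation held far more often
    # by women) creates a disparate impact on gender anyway.
    if applicant["occupation"] == "homemaker":
        base *= 0.05
    return base

def mean_limit(gender):
    limits = [credit_limit(a) for a in applicants if a["gender"] == gender]
    return sum(limits) / len(limits)

print(mean_limit("F"))  # 7790.0 -- dragged down by the proxy feature
print(mean_limit("M"))  # 14700.0
```

An auditor looking only at the feature list would see no protected class used; only an outcome analysis by group reveals the disparity, which is exactly why "it's just the algorithm" shouldn't end the conversation.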
This isn’t really about Apple Card, or Goldman Sachs, or even credit reporting. It’s broader than sexism.
This is about ”the algorithm.” Not this specific algorithm, but the black boxes that we increasingly entrust with positions of power over our lives, to the point where the best anyone can do is throw their hands up and say “it’s the algorithm!”
Algorithms are just business logic and math. They can be very complicated instances of both, but they’re not magic. Humans are capable of explaining them and understanding them. But it takes investment to communicate these things, and until now, there’s been no motivation in the industry to invest in algorithmic transparency.
Is it legal to model that type of dynamic into credit decisions? I've never heard of being able to factor that type of social dynamic into a very regulated financial product.
Even if it is not legal to do so explicitly, there are probably a number of things which correlate with it that it is legal to use, which end up with the same effect. And indirect (disparate impact) discrimination without intent to discriminate on a prohibited basis isn't generally illegal in the US except in employment law.
40%. That's 60% who don't, which is why the banks might assume that the men are primarily the breadwinners. Also, banks are slow to change, and I'm positive those numbers have shifted fairly recently.
Came here to post the same. If my wife had an income and I didn’t then I would expect her to be offered a higher limit even if everything goes into household income. Does the US credit system not differentiate between household and individual income?
They have covered this numerous times. In certain states, there is something called community property law (which I found out about in this debacle), which holds that for a married couple who file jointly, their incomes are considered the same.
So at best, this is shitty software design that doesn't take your state's community property laws into account, and at worst, it is a pretty clear cut case of sexism of the "algorithm."
Update: corrected name of law based on child posts.
Even with community property, surely the occupation is a valid signal? If I was a teacher and my wife was a CEO again I think it’s reasonable she is offered more credit even in a community property state. She would simply have more earning power than me.
I can picture all kinds of bugs coming from the fact that A+B=A under this setup... might be one of the reasons the software can't handle it, if it's something specific to a couple of states' tax systems. Income being represented twice (instead of split equally) seems quite unexpected. Would it make more sense to consider both as a single "individual"? Recipe for disaster either way.
The fact that nobody can actually explain in a mathematical and deterministic way to a card holder exactly what goes into their creditworthiness is very problematic, given that creditworthiness is a foundational aspect of our economy.
And in many cases, these algorithms are deemed proprietary. So there's no accountability AT ALL. How we were somehow ok with trusting 3 private for-profit companies with this much power always baffles me.
Can you show a screen shot of the page that the “what does annual income include?” text links to? Otherwise we are just debating the semantics of the word annual in this context.
Wait, so she has left the workforce, is a millionaire as far as savings, and pays all her debts on time. Of course she's going to have a lower credit line! She's the worst type of person to loan money to: people who pay everything on time. She's going to cost the bank money and gain them very little.
Not much is said about her husband, but if he carries any balances or has had a shorter credit history or has any higher risk, aren't the algorithms going to be weighted to give people like him way more credit? That's who they need to make money off of.
There are a lot of unknowns here and we're guessing at a lot of information. Many of these algorithms are also closed so we can't be sure how they're weighting things. We can guess, reasonably, that they're weighted toward making the most money for the banks as possible. They probably balance risks with ability to pay back on those risks.
Jumping to conclusions like "the algorithms are sexist" is way too simplistic. It could be that the major weight was gender in this case, but if that's true, it's probably because the number crunching revealed men in her husband's demographic were most likely to be unable to pay off all purchases at once and earn them more money via interest. More likely, it's way, way more complicated than that.
I took a look at the husband's Twitter, and I want to correct a big misconception: the credit score we as consumers can look at isn't actually the same one the bank used, and it can be quite different. We can't access the score the bank gets.
I don't remember the news article, but if really needed, I'll try to find it again. They tried it with 4 individuals, and one of them scored much lower than they expected, considering what he was saying. They tried to find out why, which is how they learned the scores they could see weren't the ones the bank was using, and thus tried to get hold of the actual score. The one from the bank was nearly 200 points better. I don't remember if they tried with the other 3.
I don't believe this was actually her issue. The simplest explanation is that the income they got wasn't high enough, which has nothing to do with her gender.
I agree that credit shouldn't be a black box, and that they should be able to say: your credit score allows you this interest rate, but your income only allows you this credit amount. I know it's kept a black box to avoid abuse, but that's only security by obscurity.
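A transparent scheme like the one the comment above asks for could be as simple as the following sketch. The tier cutoffs and the income multiple are made up for illustration; no bank publishes its real rules:

```python
# Illustrative sketch (not any bank's actual logic): a transparent
# underwriting rule where each output traces to exactly one input,
# so a rep could say precisely which factor capped the offer.

def underwrite(credit_score: int, annual_income: float):
    # Interest rate driven by score alone (hypothetical tiers).
    if credit_score >= 760:
        apr = 0.13
    elif credit_score >= 660:
        apr = 0.18
    else:
        apr = 0.24
    # Credit limit driven by income alone (hypothetical multiple).
    limit = round(annual_income * 0.15, -2)
    return apr, limit

apr, limit = underwrite(credit_score=780, annual_income=40_000)
# A high score with a low reported income yields a great rate but a
# small limit -- explainable to the customer in one sentence.
print(f"APR {apr:.0%}, limit ${limit:,.0f}")  # APR 13%, limit $6,000
```

The point isn't that underwriting is this simple, only that "the algorithm said so" is a choice, not a technical inevitability.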
The reason it has to do with Apple Card is because it's being marketed as something that rejects the status quo. Their headline literally says "Created by Apple, not a bank" – and promises more transparency.
I do agree that this is probably not exclusive to Apple Card. It exposes larger issues with our financial and credit system, and it highlights another thing I've had an issue with for years: people hiding behind machines. "Oh, the system doesn't allow me" or "The algorithm said so" is a VERY troubling trend that removes humans from any kind of equation.
DHH (& now JHH) haven't made a strong prima facie case that any gender discrimination is involved.
Credit lines, especially those from differentiated lenders (which Apple & Goldman Sachs definitely aspire to be), will not be a simple function of easily observed factors like income, assets, and debts – nor even the credit reports and simple FICO-style "credit scores" of the oligopoly bureaus.
Those will be inputs, sure, but lots of other behavioral history could be included – anything Apple & GS can get their hands on, really – and the lender will be estimating not just "ability to repay" but "expected net lifetime profitability across all services", as a function of the granting of particular credit lines. Has DHH spent more with Apple over the past 20 years? That could do it.
As JHH notes, she is "an extremely private person". That right there is also sufficient to explain a 5x, 10x, or 100x difference in credit-granted. Along with, say, her partner being publicly-known for buying custom supercars and Italian vacation homes to park them in.†
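To make the "expected net lifetime profitability" framing above concrete, here is a toy model. Every number and the risk curve are my own assumptions, not anything Apple or GS has disclosed; it only shows how a modest difference in predicted default risk can swing the profit-maximizing credit line dramatically:

```python
# Toy model (my assumptions, not Apple/GS's): a lender granting the
# line that maximizes expected profit will hand out very different
# limits for even modest differences in predicted default risk.

def expected_profit(line, p_default_base, utilization=0.3, apr=0.18):
    # Illustrative assumption: default probability climbs as the
    # line grows (bigger limits invite bigger losses).
    p_default = min(1.0, p_default_base + line / 100_000)
    interest = line * utilization * apr
    expected_loss = p_default * line * utilization
    return interest - expected_loss

def best_line(p_default_base):
    # Search candidate limits in $500 steps for the profit maximum.
    candidates = range(500, 50_001, 500)
    return max(candidates, key=lambda l: expected_profit(l, p_default_base))

print(best_line(0.03))  # lower predicted risk -> much larger line
print(best_line(0.10))  # modestly higher risk -> much smaller line
```

Under these made-up parameters, moving base default risk from 3% to 10% nearly halves the optimal line, even though nothing about the applicant's income changed.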
I've been reading the original thread on Twitter, and saw a few other testimonies of couples experiencing the same thing. However, what I don't recall seeing (and maybe I missed it because Twitter is hard to follow) is an example of a wife signing up before her husband. All the cases were about the husband signing up first, and then his wife. I wonder if this could be part of the explanation. Like the second card opened for the same household is seen as a little more risky or something.
Imagine having a brand so strong that rich people write think pieces framed in terms of social justice when they have trouble getting access to your credit products.
For any other credit card a wealthy couple in their shoes would have written off the credit card company as idiots and applied elsewhere.
I have no issue with the argument against unauditable credit offerings that disproportionately affect protected classes. I’m pointing out that but for the brand being Apple you’d never have heard this story.
Or perhaps they are leveraging Apple's known sensitivity to this kind of publicity to highlight a serious issue with equity of credit access and opaque algorithms running our lives.
There is already a law that allows you to include any source of income in your household on a credit card application. Is Apple in violation of that law?
I'm all for improving transparency in the credit system. Woz had/has the same situation as DHH apparently. My suspicion is that the algorithm takes into consideration factors such as "are you the co-founder of a highly successful software business", or maybe it's something like income or total assets you're legally responsible for or some such.
That said I do wish Apple and all other companies issuing credit would just come out and say exactly how the algorithm works. Would make a lot of these conversations easier and less annoying for everyone.
What is more likely: an algorithm (or their statistical model) rewarding an exclusive category occupied by 0.0001% of the population, or penalizing the fact that she is a "homemaker" and a woman, categories occupied by millions and millions of people?
As someone who worked on these models in the consumer credit industry, it is possible that there isn't any discrimination. The only thing that comes to mind is recent inquiries, which have a minimal effect on credit score but are highly predictive of default. If she applied for a few credit cards in the previous half year and DHH did not, it would explain the difference without being discriminatory.
Much more likely, in my view, is that the algorithm is looking at something that is both highly correlated with being female (e.g., homemaker as career) and predictive of default. This would almost surely fail existing regulatory tests against discrimination. Since most credit applications ask for household income, it is doubtful their applications otherwise looked meaningfully different.
Edit: Checked the application, and you are indeed required to enter household income rather than your individual income if you share a checking account.
> The only thing that comes to mind is recent inquiries, which have a minimal effect on credit score but are highly predictive of default. If she applied for a few credit cards in the previous half year and DHH did not, it would explain the difference without being discriminatory.
That would have been visible in the credit score they pulled afterward.
I'm curious to see a collection of reports of cases like this (large difference in credit limits of two persons who are married / have same financial situation).
I've read about two husband-wife pairs so far. What else has been reported? What about single women vs single men who both have similar financial history?
Is there a "share my salary" spreadsheet but for Apple Card limits?
Also, I doubt the new Apple Card is totally unique, and this isn't financier Goldman Sachs's first foray into the personal credit business. Has anyone encountered this issue with older credit cards and loan products?
GDPR has given some good thought to automated decision making and has guidance that I think all companies should follow even if they don’t need to follow the GDPR regulations.
“We regularly check our systems for accuracy and bias and feed any changes back into the design process.
As a model of best practice...
We have signed up to [standard] a set of ethical principles to build trust with our customers. This is available on our website and on paper.” [1]
> It’s why I was deeply annoyed to be told by AppleCard representatives, “It’s just the algorithm,” and “It’s just your credit score.”
Surely AppleCard reps have deeper insight into what caused the $57 credit limit, no? Either they really do use a black-box DLNN, their algorithm literally has no non-boolean output or other logging (unlikely), or the author here is omitting the explicit reason for the denial.
The Apple Card Privacy Policy states that part of the creditworthiness algorithm checks your Apple ID for Apple purchases. There is a good chance that DHH makes all the Apple purchases in his family and thus received a higher limit. https://thetapedrive.com/apple-card-onboarding
Her post was well thought out and her points make sense. His tweet had the right intention, but when people tried to offer an explanation (not a justification) as to why this happened, he said they were “mansplaining.” I don’t understand why he needed to go that route.
They offered to solve the problem just for her because it was attracting bad PR, and she turned them down because they need to solve it for everybody. She didn’t have to do that. She’s choosing to stand in line even though they already waved her through the velvet rope.
She’s leveraging privilege here, but in the opposite direction from what you seem to think.
Did you read what she wrote? She literally calls out as ridiculous the fact that she's able to get her problem fixed by virtue of who she is while countless others have no recourse. It is the opposite of tone deaf.
I like DHH for the most part, but on stuff like this he seems so out of touch. This isn't an Apple-created problem - credit scores and the like are deeply flawed. We all know this. Throwing Apple under the bus is silly, but it's a big target, I guess, and a juicier headline. Nothing to see here. If you really want change, go after the credit bureaus and the banks backing the cards themselves.
Claiming an algorithm/process is sexist without specific evidence is also problematic, along with claiming this is a "justice for all issue"-- how exactly is anyone/any corporation required to loan you money in any capacity?
Side note - are folks trying to make this like a bigger debate about "algorithms" and "machine learning" in general? They do realize they're different things, right? We're not that dumb as a society--- I hope?
Credit access is definitely a "justice for all" issue. Access to credit is a huge determinant of social success, and there are entire government departments dedicated to making it more equitable.
In both the interview and the post linked here, there is an explanation of why opaque algorithms are a problem. So yes, this is about algorithms (if not specifically machine learning).
Generic non-sequiturs like your last sentence don't really add to the discussion.
There is nothing new here, though. Apple isn't the one using "the algorithm", the banks are. I watched the interview, and it didn't address why they're targeting Apple other than it's the "Apple card" they're having problems with. I'm fine with fighting the good fight about transparency on credit factors, etc - so do that! DHH knows what he's doing poking the new shiny target on the block that's super recognizable and hot.
Agree to disagree on the credit issue - I don't believe any corporation or establishment owes anyone a loan or credit line it doesn't feel like backing. There's an indeterminate number of lenders out there, though; I guarantee at least one of them will cut you a deal, but on their terms. If you're talking about government lines of credit or loans, that's an entirely different matter.
Yes, he's poking a shiny target, because that's the only way things will change. Just to repeat what is said in the interview: they are targeting Apple because they otherwise respect Apple as a company and consider it responsible for the decisions behind its card.
One thing the people complaining about sexism regarding their wives' credit applications seem to have in common is that the men applied for and received their cards first, and then the wives did.
Why are the men always applying first for the credit cards?
Maybe it is not so much sexism as the order in which they are applying? Then again, maybe it is subtle sexism on the part of the men by always having to be the first one to try something new?