
About the Apple Card - aarestad
https://dhh.dk/2019/about-the-apple-card.html
======
teej
I suspect the root cause of this situation is the income signal, not the
credit score. I bet that Goldman Sachs is not correctly accounting for
California community property laws, under which 50% of the household income is
hers. This seems like the exact type of oversight that is:

- Not “technically” using gender as a signal

- Practically causing gender-based unfairness

- Simply bad data, and what I would consider a bug

- Something customer service is not going to be helpful with

I’m glad this issue came to light. I hope it leads to productive conversations
about black-box algorithms and underwriting.

~~~
silencio
I'm pretty sure household income is a federal thing coming from the Credit
Card Act of 2009, not just CA community property laws. Every time I apply for
a credit card it's been pretty clear they want household and nonwage income,
not a solo salary.

Either way, it's baffling and I think this explains why my wife got such a
piss poor credit limit on her Apple Card, something like 1/5 of another line
of credit she has. I didn't even bother applying after seeing what she got.

The joke, though, is on us, because we're both female.

~~~
teej
I assume they pull income from a third party source to supplement the self-
reported income. The third party income would be the source of the
discrepancy.

------
shanemlk
We're all very smart here for considering justifications as to why this
happened, but please don't get lost in the weeds. The financial calculation
here was ridiculously broken, and it needs to be fixed at almost all costs. As
folks who use algorithms as tools, we should use this story as a reason to be
more accountable. It's tone deaf to publicly postulate a lack of sexist
intention for the sake of women reading this. We should solely be exploring
how to fix this massive error.

~~~
jimmaswell
Maybe millionaires aren't the favored customers of credit card companies.
They'd get a bit more on interchange fees than the average customer, but
they'd never run a balance and pay interest or late fees.

~~~
shanemlk
True. But the point of the article was hoping it doesn't happen to regular
folk.

------
Despegar
The crux of this issue is that the Apple Card is only for individuals:

>“As with any other individual credit card, your application is evaluated
independently,” Williams said in a statement. “We look at an individual’s
income and an individual’s creditworthiness, which includes factors like
personal credit scores, how much personal debt you have, and how that debt has
been managed.”

That setup makes it “possible for two family members to receive significantly
different credit decisions,” Williams said. He added that the bank is actively
exploring ways to allow users to share their Apple Card with family members.

Goldman was aware of the potential issue before it rolled out the Apple Card
in August, but it opted to go with individual accounts because of the added
complexity of dealing with co-signers or other forms of shared accounts,
according to a person with knowledge of the matter.

[https://www.cnbc.com/2019/11/11/goldman-wants-to-fix-the-app...](https://www.cnbc.com/2019/11/11/goldman-wants-to-fix-the-apple-card-flaw-that-has-users-claiming-bias.html)

~~~
mmanfrin
She mentions she has better and longer credit history than David, and since
they share all financial accounts, she likely reported her income as their
joint income. I sincerely doubt they'd be raising this much fuss if she'd put
$0 in the income box and was surprised.

Her point is that all those factors showed her as more credit worthy with the
exception of 'Homemaker' and 'Female'.

~~~
citilife
That's not exactly the way credit works. For one, you're supposed to report
_your income_, not _joint income_, when filling out applications. You can,
however, list _joint assets_, or at least that's much fuzzier.

EDIT: Apparently, you can use household income. However, if two parties are
applying for a credit card and trying to secure both lines with the same
assets, you should expect one to be lower (if not outright declined).

In this case, there's a FICO score to consider, but there are also other
components. For instance, you can be declined a credit card if you are a
"churner"; you can also be declined for not making the bank enough money (aka
you never miss payments and the card has no fees).

Finally, and importantly, the bank has the right to set whatever bar it wants
for your credit limit. If they feel giving you a high limit will make them
money, they do it; if they don't, they won't ("they" here being encoded in an
algorithm). They can even legally assign credit at random, provided they have
controls to ensure the bank doesn't go bankrupt.

In this case, the only thing the bank has to prove is that they didn't use
signals illegally (which usually means not using protected class information,
e.g. gender, among others). Many banks don't even keep a record of that
information, and definitely not as part of the algorithm. I suspect Goldman
wouldn't be that stupid.

Meaning: whatever the reason for the discrepancy, gender was very likely not
the culprit here.
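The expected-profit logic described above (the bank sets whatever limit it thinks will make it money) can be sketched as a toy calculation. Every rate, ratio, and function name below is invented for illustration; no real issuer's model looks this simple:

```python
# Toy sketch of setting a credit limit by expected profit.
# All rates and ratios below are hypothetical, for illustration only.

def expected_annual_profit(limit, p_default, p_revolve, apr=0.25,
                           interchange_rate=0.015, spend_ratio=4.0):
    """Rough expected yearly profit from one cardholder at a given limit."""
    spend = limit * spend_ratio                # assume spend scales with limit
    interchange = spend * interchange_rate     # fee income on every purchase
    avg_balance = 0.3 * limit                  # balance carried, if revolving
    interest = p_revolve * avg_balance * apr   # interest comes only from revolvers
    expected_loss = p_default * limit          # charge-off risk grows with limit
    return interchange + interest - expected_loss

def best_limit(p_default, p_revolve, candidates=(500, 2000, 10000, 25000)):
    """Pick the candidate limit with the highest expected profit."""
    return max(candidates,
               key=lambda l: expected_annual_profit(l, p_default, p_revolve))
```

Under these made-up numbers, a transactor who never revolves still earns the bank interchange, so "never pays interest" doesn't automatically mean a low limit; the limit only drops when expected losses dominate the other terms.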

~~~
cjbprime
> _your income_

This is no longer true.

==

Before 2013, the CARD Act required that card issuers take into account only
the individual card applicant’s income or assets. However, the Consumer
Financial Protection Bureau put in an amendment in 2013 that allowed card
issuers to also consider any third-party income and assets that an individual
card applicant of at least 21 years of age has a “reasonable expectation of
access to.”

~~~
citilife
That's fair; some places are explicit and still specify "individual", but
legally you don't have to.

In either case, I'd also note that because they are a married couple, they are
counting that income twice. For offering credit that is a bit dangerous, as
this income "secures" the credit line. I'm more surprised it didn't result in
an outright decline if they didn't separate their income when applying.

------
spectramax
Going off on Twitter seems to be a good way to get attention but how does an
average Joe get help for company "policies"?

We need a service where companies and their customer service, their policies,
and their practices are openly shamed, feedback gathered, and voices heard
without the need for Twitter (and being famous enough on it to get world-wide
attention). An issue board of sorts that companies can go to and see what
their users are complaining about, and it needs to be an independent service.

Corporations have no one person who can answer, enforce, amend, or question
their protocol, so this whole thing becomes an attention contest on Twitter or
HN or wherever. It is an uphill battle until that "one person", maybe a CEO
or VP, gets informed about it by their PR team. They don't have infinite
bandwidth to listen to hundreds of complaints, and many important ones go
amiss, even if the leaders of the corporations have aligned intentions.

I can't remember how many times I've seen major issues about a product or
service raised through Twitter and posted on HN. This needs to change, since
we are only hearing from the top 0.01% of people who took the chance to speak
up on Twitter.

~~~
dvdhnt
> We need a service where companies and their customer service, their policies
> and their practices are openly shamed, feedback gathered and voices heard
> without the need for twitter (and being famous enough on it to get world-
> wide attention).

I mean, the point of government agencies and watchdogs is to do just this.
However, most of them have been bought off or had their teeth pulled by greedy
capitalists.

We don't need a "service" - you're just putting more privilege behind a
paywall. We need to hold legislators and regulators accountable to do their
damn job.

~~~
spectramax
So we do need a service, because clearly the path you're suggesting isn't
working, no?

~~~
dvdhnt
A public service? Sure.

A paid service? No.

~~~
spectramax
Nowhere did I mention that we need a paid or free service.

The argument at hand is that we need something other than Twitter to blast out
a message that most likely won't be heard. A service (again, using the most
general definition) that collects feedback from users and surfaces it to
large corporations, which can then look up the most pressing issues.

Today, that's through feedback surveys, support forums, BBB/Consumer Reports
and worse, Twitter.

Could you provide a more substantive argument? I like the idea of a
government-run website that collects feedback en masse and shows the
top-voted concern for each corporation. What gets done about it, and the
"enforcement" part, is I think an entirely different problem; that's what laws
are for. I am simply talking about the fragmentation of the ways the public
complains to corporations. A .gov website where the public can vote, truly
funded by the public through taxes, would be amazing.

------
brokentone
Wonderful post, so glad that Mrs. Hansson stepped out for this cause, as
uncomfortable as it was.

I've personally wondered about credit scores and the credit offered for a
long time. The amount of regulation provides wonderful cover for these big
institutions to really do whatever they want and lean back on "algorithms",
or on the claim that they can't override the policy due to regulations, etc.
Meanwhile, it's becoming clearer that the algorithms have encoded biases. In
addition, the whole credit system is not one of those things you can opt out
of (if you ever want a home or car, which, I get, not everyone needs), and the
whole thing is based on previous borrowing, so responsible (cash-using) or
underprivileged (never had the chance to start the credit bootstrap) folks are
extremely disadvantaged.

The interesting twist in this story that I find damning is that Apple actually
overrode the algorithm due to pressure -- this pokes a HUGE hole in this
"cover" these institutions have created to date.

------
crazygringo
There are a lot of comments here pointing out that this was probably triggered
by her presumably having no/little income in comparison to her husband,
despite a higher credit score... and that this is a mistake on _Goldman
Sachs' part_, not taking into account their married status and income sharing.

BUT every credit card I've ever applied for has asked for my _self-reported
income_. Because there's _no official source_ a credit card can use to verify
your income. Unlike debts, your income doesn't get reported to any agency.
It's private.

And since 2013, a "homemaker" can and should put down their _entire household
income_, not their individual income:

> The Credit Card Act of 2009 requires credit card companies to take “the
> ability of the consumer to make the required payments” into account when
> deciding whether to approve an application... A 2013 amendment to the
> federal regulations surrounding the Card Act expanded the definition of
> one’s ability to pay so that people 21 and older _can include any income to
> which they have a “reasonable expectation of access.” This can include
> income from a spouse, partner or other member of your household._ [1]

So I'm guessing perhaps she simply made the mistake of not reporting entire
household income?

[1] [https://www.nerdwallet.com/blog/credit-cards/list-spouses-in...](https://www.nerdwallet.com/blog/credit-cards/list-spouses-income-applying-credit-card/)

~~~
teej
You’ve assumed JHH made a mistake but missed that Goldman is probably using a
third party data source to verify the self-reported income figure.

~~~
crazygringo
No -- that's my point -- there _are no_ third party data sources on income, as
far as I'm aware.

This is why credit cards ask your income in the first place, and why landlords
will often ask for a copy of your tax return from last year as proof, along
with your last two pay stubs. Because there's nowhere else for them to get it
or verify it, certainly not at scale.

~~~
judge2020
Technically, background reports and credit screenings _can_ include income
(when doing them for apartment rentals, it usually comes back with the
applicant's income), but it's almost always off by some amount and can also
come back as a big "UNKNOWN".

------
soneca
Kind of tangential, but wow, great writing. I knew nothing about the issue
before this post, but her text was honest, smart, self-aware, generous,
emotional, inspiring, informative, all that and still very concise (something
that I value a lot). I will bookmark this for future reference to great
writing about uncomfortable situations.

------
koolba
> I had a career and was successful prior to meeting David, and while I am now
> a mother of three children — a “homemaker” is what I am forced to call
> myself on tax returns — I am still a millionaire who contributes greatly to
> my household and pays off credit in full each month.

Could it be as simple as being listed as a “Homemaker” vs direct employment?
Seems plausible to me.

> But AppleCard representatives did not want to hear any of this. I was given
> no explanation. No way to make my case.

I’d be impressed if you could get anyone on the phone at any financial
institution to explain the proprietary inputs to any calculation. Not only
would they not have access to that information, it’d be so off-script to
reveal it that there’s no flow chart of comments you could make to get it.

~~~
ohazi
> I’d be impressed if you could get anyone on the phone at any financial
> institution to explain the proprietary inputs to any calculation. Not only
> would they not have access to that information, it’d be so off-script to
> reveal it that there’s no flow chart of comments you could make to get it.

As blackbox machine learning algorithms are increasingly being used for these
proprietary calculations, people are starting to realize that this sort of
explanation probably does need to be made available.

It's okay to use machine learning to do real-time image segmentation that
can't be done robustly any other way.

But it probably shouldn't be okay to use machine learning as "bias laundering"
for business decisions that legally should be challengeable. You shouldn't be
allowed to hide behind "it's not me, it's the algorithm!" if your algorithm is
causing you to look like you're discriminating against a protected class.

------
mortenjorck
This isn’t really about Apple Card, or Goldman Sachs, or even credit
reporting. It’s broader than sexism.

This is about _”the algorithm.”_ Not this specific algorithm, but the black
boxes that we increasingly entrust with positions of power over our lives, to
the point where the best anyone can do is throw their hands up and say “it’s
the algorithm!”

Algorithms are just business logic and math. They can be very complicated
instances of both, but they’re not magic. Humans are capable of explaining
them and understanding them. But it takes investment to communicate these
things, and until now, there’s been no motivation in the industry to invest in
algorithmic transparency.

Maybe it’s time to create that motivation.

------
wrkronmiller
I believe Apple/Goldman ask for annual income as part of the application.
Perhaps that was the reason, if she had no source of income?

~~~
ThrustVectoring
This shouldn't matter for a married household with merged finances - they
count whatever income the respondent can rely on to pay their bills.

~~~
rpmisms
Except that statistically, men more often have control of finances. This
debacle is regressive, but very likely based on market research.

~~~
TuringNYC
Is it legal to model that type of dynamic into credit decisions? I've never
heard of being able to use that type of social dynamic in a very regulated
financial product.

~~~
dragonwriter
Even if it is not legal to do so explicitly, there are probably a number of
things which correlate with it that it is legal to use, which end up with the
same effect. And indirect (disparate impact) discrimination without intent to
discriminate on a prohibited basis isn't generally illegal in the US except in
employment law.

------
djsumdog
Wait, so she has left the workforce, is a millionaire as far as savings go,
and pays all her debts on time. Of course she's going to have a lower credit
line! She's the worst type of person to loan money to: someone who pays
everything on time. She's going to cost the bank money and gain them very
little.

Not much is said about her husband, but if he carries any balances or has had
a shorter credit history or has any higher risk, aren't the algorithms going
to be weighted to give people like him way more credit? That's who they need
to make money off of.

There are a lot of unknowns here and we're guessing at a lot of information.
Many of these algorithms are also closed so we can't be sure how they're
weighting things. We can guess, reasonably, that they're weighted toward
making as much money for the banks as possible. They probably balance risks
with the ability to pay back on those risks.

Jumping to conclusions like "the algorithms are sexist" is way too
simplistic. It could be that the major weight was gender in this case, but if
that's true, it's probably because the number crunching revealed men in her
husband's demographic were most likely to be unable to pay off all purchases
at once and earn them more money via interest. More likely, it's way way more
complicated than that.

~~~
dcchambers
That's not really how credit works. Yes, the bank wants to make money but they
also want to give money to people they _know_ will pay them back.

------
dwild
I took a look at the husband's Twitter, and I want to correct a big
misconception: the credit score we can look at isn't actually the same one the
bank used, and the two can be quite different. We can't actually access the
credit score they get.

I don't remember the news article, but if really needed, I'll try to find it
again. They tried it with 4 individuals, and one of them scored much lower
than he thought he would, considering what he was saying. They tried to find
out why, which is how they learned these weren't the actual scores the bank
was using, and thus tried to get hold of the actual score. The one from the
bank was nearly 200 points better. I don't remember if they tried with the
other 3.

I don't believe this was actually her issue; the simplest explanation is
that the income figure they got wasn't high enough... which has nothing to do
with her gender.

I agree that credit shouldn't be a black box, and that they should be able to
say: well, your credit score allows you this interest rate, but your income
only allows you this credit amount... I know it's a black box to avoid abuse,
but that's only security by obscurity.

------
jumbopapa
So much outrage for a product where many better alternatives exist. I don't
for a second believe that the Apple Card has sexism built into it.

~~~
atonse
It seems like both though.

The reason it has to do with Apple Card is because it's being marketed as
something that rejects the status quo. Their headline literally says "Created
by Apple, not a bank" – and promises more transparency.

I do agree that this is probably not exclusive to the Apple Card. This does
expose larger issues with our financial and credit system. And there's another
really bad thing I've had an issue with for years: people hiding behind
machines. "Oh, the system doesn't allow me" or "The algorithm said so" - a
VERY troubling trend that removes humans from any kind of equation.

------
gojomo
DHH (& now JHH) haven't made a strong prima facie case that any gender
discrimination is involved.

Credit lines, especially those from differentiated lenders (which Apple &
Goldman Sachs definitely aspire to be) will not be a simple function of
easily-observed factors like income, assets and debts – nor even the credit
reports and simple FICO-style "credit scores" of the oligopoly bureaus.

Those will be inputs, sure, but lots of other behavioral history could be
included – anything Apple & GS can get their hands on, really – and the lender
will be estimating not just "ability to repay" but "expected net lifetime
profitability across all services", as a function of the granting of
particular credit lines. Has DHH spent more with Apple over the past 20 years?
That could do it.

As JHH notes, she is "an extremely private person". That right there is also
sufficient to explain a 5x, 10x, or 100x difference in credit granted. Along
with, say, her partner being publicly known for buying custom supercars and
Italian vacation homes to park them in.†

†
[https://news.ycombinator.com/item?id=1670712](https://news.ycombinator.com/item?id=1670712)

------
dudul
I've been reading the original thread on Twitter, and saw a few other
testimonies of couples experiencing the same thing. However, what I don't
recall seeing (and maybe I missed it because Twitter is hard to follow) is an
example of a _wife_ signing up _before_ her _husband_. All the cases were
about the husband signing up first, and then his wife. I wonder if this could
be part of the explanation. Like the 2nd card opened for the same household is
seen as a little more risky or something.

------
judge2020
The big HN thread from yesterday:
[https://news.ycombinator.com/item?id=21494673](https://news.ycombinator.com/item?id=21494673)

------
ryanmarsh
Imagine having a brand so strong that rich people write think pieces framed in
terms of social justice when they have trouble getting access to your credit
products.

For any other credit card a wealthy couple in their shoes would have written
off the credit card company as idiots and applied elsewhere.

I have no issue with the argument against unauditable credit offerings that
disproportionately affect protected classes. I’m pointing out that but for the
brand being Apple you’d never have heard this story.

~~~
ak217
Or perhaps they are leveraging Apple's known sensitivity to this kind of
publicity to highlight a serious issue with equity of credit access and opaque
algorithms running our lives.

------
repler
If the algorithm is (accidentally?) biased, how do you un-bias it without
collecting the data points you are legally prohibited from collecting?

~~~
staktrace
Step 1: publish the algorithm.

Step 2: listen to feedback.
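One partial answer to the question upthread: you generally cannot *audit* for bias without the protected attribute at audit time, even if the model never uses it as an input. A minimal outcome audit might look like the sketch below; the "four-fifths rule" threshold is a real regulatory heuristic from US employment law, used here only as an illustration, and the applicant data is made up:

```python
# Sketch of a disparate-impact audit on decision outcomes.
# The model stays a black box; the auditor needs only approve/decline
# results plus group labels collected for auditing purposes.

def approval_rate(decisions):
    """Fraction of approvals in a list of 1 (approved) / 0 (declined)."""
    return sum(decisions) / len(decisions)

def disparate_impact_ratio(group_a, group_b):
    """Lower group's approval rate divided by the higher group's."""
    rates = sorted((approval_rate(group_a), approval_rate(group_b)))
    return rates[0] / rates[1]

# Hypothetical audit sample: 1 = approved, 0 = declined.
men   = [1, 1, 1, 0, 1, 1, 1, 1]   # 7/8 approved
women = [1, 0, 1, 0, 1, 0, 0, 1]   # 4/8 approved

ratio = disparate_impact_ratio(men, women)
flagged = ratio < 0.8               # four-fifths rule heuristic
```

Note the asymmetry this exposes: the lender can drop gender from the model, but whoever audits the outcomes still needs it, which is part of why "we never collect that data" does not by itself settle the fairness question.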

------
pkaye
There is already a law that allows you to mention any source of income in your
household on a credit card application. Is Apple in violation of that law?

[https://www.nerdwallet.com/blog/credit-cards/list-spouses-in...](https://www.nerdwallet.com/blog/credit-cards/list-spouses-income-applying-credit-card/)

------
lwb
I'm all for improving transparency in the credit system. Woz had/has the same
situation as DHH apparently. My suspicion is that the algorithm takes into
consideration factors such as "are you the co-founder of a highly successful
software business", or maybe it's something like income or total assets you're
legally responsible for or some such.

That said I do wish Apple and all other companies issuing credit would just
come out and say exactly how the algorithm works. Would make a lot of these
conversations easier and less annoying for everyone.

~~~
kitsune_
What is more likely: an algorithm (or their statistical model) rewarding an
elusive category occupied by 0.0001% of the population, or penalizing the fact
that she is a 'homemaker' and a woman, categories occupied by millions and
millions of people?

~~~
lwb
If they're penalizing all homemakers and women, wouldn't you expect to see
millions more people getting their credit denied/limited?

------
Will_Do
As someone who worked on these models in the consumer credit industry, it is
_possible_ that there isn't any discrimination. The only thing that comes to
mind is recent inquiries, which have a minimal effect on credit score but are
highly predictive of default. If she applied for a few credit cards in the
previous half year and DHH did not, that would explain the difference without
being discriminatory.

Much more likely, in my view, is that the algorithm is looking at something
that is highly correlated both with being female (e.g., homemaker as career)
and with default. This would almost surely fail existing regulatory tests
against discrimination. Since most credit applications ask for household
income, etc., it is doubtful their applications otherwise looked meaningfully
different.

Edit: Checked the application, and you are indeed _required_ to enter
household income, not your individual income, if you share a checking
account.

~~~
dwild
> The only thing that comes to mind is recent inquiries, which have a minimal
> effect on credit score but are highly predictive of default. If she applied
> for a few credit cards in the previous half year and DHH did not, it would
> explain the difference without being discriminatory.

That would have been visible in the credit score they got afterward.

------
gowld
I'm curious to see a collection of reports of cases like this (a large
difference in the credit limits of two people who are married / have the same
financial situation).

I've read about two husband-wife pairs so far. What else has been reported?
What about single women vs. single men who both have similar financial
histories? Is there a "share my salary" spreadsheet but for Apple Card limits?

Also, I doubt the new Apple Card is totally unique, and this isn't financier
Goldman Sachs's first foray into the personal credit business. Has anyone
encountered this issue with older credit cards and loan products?

~~~
bdcravens
In DHH's Twitter thread, there were a few people offering opposing data
points.

------
snowwolf
GDPR has given some good thought to automated decision making, and has
guidance that I think all companies should follow even if they aren't subject
to the GDPR regulations.

“We regularly check our systems for accuracy and bias and feed any changes
back into the design process.

As a model of best practice...

We have signed up to [standard] a set of ethical principles to build trust
with our customers. This is available on our website and on paper.” [1]

[1] [https://ico.org.uk/for-organisations/guide-to-data-protectio...](https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/individual-rights/rights-related-to-automated-decision-making-including-profiling/)

------
dominotw
> “It’s just your credit score.”

I feel like whoever gave her this explanation was undertrained or
unqualified.

She just refuted this silly explanation to prove bias.

------
judge2020
> It’s why I was deeply annoyed to be told by AppleCard representatives, “It’s
> just the algorithm,” and “It’s just your credit score.

Surely AppleCard reps have deeper insight into what caused the $57 credit
limit, no? Either they do use a black-box DLNN, their algorithm literally
doesn't have any non-boolean output or other logging (unlikely), or the
author here is omitting the explicit reason for the denial.

~~~
sincerely
When was the last time you called a company and reached someone who wasn't
just being paid to follow a script?

~~~
judge2020
This line makes me think it went higher up in the chain of command

> when the AppleCard manager told me she was aware of David’s tweets

------
Talyen42
She put "0" in the income field, didn't she?

Doesn't take a complicated algorithm to explain that.

------
franze
Startup Idea: Machine Learning / AI Algorithm Edgecase Monkey Testing as a
Service (AIAEMTAAS)

------
smoser
In the Apple Card Privacy Policy it states that part of the algorithm for
credit worthiness is that it checks your Apple ID for Apple purchases. There
is a good chance that DHH makes all the Apple purchases in his family and thus
he received a higher limit. [https://thetapedrive.com/apple-card-onboarding](https://thetapedrive.com/apple-card-onboarding)

------
Vaslo
Her post was well thought out and her points make sense. His tweet had the
right intention, but when people tried to offer an explanation (not a
justification) as to why this happened, he said they were "mansplaining". I
don't understand why he needed to go that route.

------
hartator
Can it be that because DHH applied first, he got most of the credit line,
like $20k, and when JHH applied she got whatever was left, $1k?

------
OzzyB
> Jamie Heinemeier Hansson

Is it common for a wife to take her husband's surname _and_ middle name, or is
this a branding thing since her husband is commonly known as "DHH"?

~~~
crooked-v
Surnames can be multiple words.

~~~
OzzyB
Oh ok, I thought they had to be hyphenated.

~~~
callahad
There's a classic article worth reading titled "Falsehoods Programmers Believe
About Names": [https://www.kalzumeus.com/2010/06/17/falsehoods-programmers-...](https://www.kalzumeus.com/2010/06/17/falsehoods-programmers-believe-about-names/)

------
nicholashead
I like DHH for the most part, but sometimes on stuff like this he seems so
out of touch. This isn't an Apple-created problem: credit scores, etc., are
deeply flawed. We all know this. Throwing Apple under the bus is silly, but
it's a big target, I guess, and a juicier headline. Nothing to see here. If
you really want change, go after the credit bureaus and the banks backing the
cards themselves.

Claiming an algorithm/process is sexist without specific evidence is also
problematic, along with claiming this is a "justice for all" issue -- how
exactly is anyone, or any corporation, required to loan you money in any
capacity?

Side note: are folks trying to make this into a bigger debate about
"algorithms" and "machine learning" in general? They do realize they're
different things, right? We're not that dumb as a society -- I hope?

~~~
ak217
Please watch his interview ([https://www.cnbc.com/video/2019/11/11/full-interview-with-au...](https://www.cnbc.com/video/2019/11/11/full-interview-with-author-of-viral-tweet-on-apple-card-probe.html)). He explains
specifically why they are calling out Apple in the interview.

Credit access is definitely a "justice for all" issue. Access to credit is a
huge determinant of social success, and there are entire government
departments dedicated to making it more equitable.

In both the interview and the post linked here, there is an explanation of why
opaque algorithms are a problem. So yes, this is about algorithms (if not
specifically machine learning).

Generic non-sequiturs like your last sentence don't really add to the
discussion.

~~~
nicholashead
There is nothing new here, though. Apple isn't the one using "the algorithm",
the banks are. I watched the interview, and it didn't address why they're
targeting Apple other than it's the "Apple card" they're having problems with.
I'm fine with fighting the good fight about transparency on credit factors,
etc - so do that! DHH knows what he's doing poking the new shiny target on the
block that's super recognizable and hot.

Agree to disagree on the credit issue - I don't believe any corporation or
establishment owes anyone any loan/credit line they don't feel like backing.
There's an indeterminate number of lenders out there, though; I guarantee at
least one of them will cut you a deal, but on their terms. If you're talking
about government lines of credit/loans, that's an entirely different matter.

~~~
ak217
Yes, he's poking a shiny target, _because that's the only way things will
change_. Just to repeat what is said in the interview: they are targeting
Apple because they otherwise respect Apple as a company and consider it
responsible for the decisions behind its card.

Here is a well-researched article about equitable access to credit and
algorithmic decision making: [https://thehill.com/blogs/congress-blog/technology/459455-ma...](https://thehill.com/blogs/congress-blog/technology/459455-making-equitable-access-to-credit-a-reality-in-the-age-of)

------
snickerbockers
> I care about digital privacy. It’s why I wanted an AppleCard in the first
> place.

what in the actual fuck?

------
RcouF1uZ4gsC
One thing that seems to be common among the people complaining about sexism
regarding their wives' credit applications is that the men are applying for
and receiving credit cards first, and then the wife.

Why are the men always applying first for the credit cards?

Maybe it is not so much sexism as the order in which they are applying? Then
again, maybe it is subtle sexism on the part of the men, in always having to
be the first one to try something new?

