
A brief history and future of credit scores - antigizmo
https://www.economist.com/international/2019/07/06/a-brief-history-and-future-of-credit-scores
======
neonate
[http://archive.is/NtfoT](http://archive.is/NtfoT)

~~~
ryeights
Thanks, economist.com is terrible

~~~
maccard
Meta comment but why do you say this yet read their content?

~~~
parliament32
Their content is good but their site design is bad. Good thing we have nice
users like GP who'll post the content in a readable form.

------
swsieber
There's commentary about incidentally discriminating based on race (e.g. zip
code as an input can act as a proxy for race).

Would giving out more loans than is rational by excluding inputs like zip
codes be a good thing? Wouldn't that lead to more defaults among those groups
of people zip codes can discriminate against?

~~~
btown
This brings up a bit of a "trolley problem" of an ethical dilemma. Say you're
asked to create a machine learning system, but you know that the data quality
is so poor that you're very likely to overfit in a way that will deny economic
opportunities to underserved communities that currently have them. But if you
don't create that system, you're denying economic opportunities to _other_
underserved communities that currently do NOT have them. Do you take the job?
Moreover, if you'll do _less_ harm than someone else who might be hired, does
that make a difference?

There's no Hippocratic oath for our profession, and in many ways that's
important, because we create systems whose impact may very well outlive us and
out-scale anything that a single medical professional could do. But that also
doesn't mean we should operate in a utilitarian environment without
constraints.

~~~
lifeisstillgood
>>> no Hippocratic oath for our profession

Sadly there is no profession for our profession. I often think the model for
any software profession (if we can create that - something I doubt) is railway
engineer - where the professional signs off on the safety / completeness of
work done on the railway - and that no train can travel without it. It leads
to plenty of uncompetitive practices - but also to ... y'know ... people not
dying in crashes.

How we start that is hard (probably something to do with safety critical
software systems) because we aren't too sure what is _the_ right way to build
software.

And then we have the fun problem of the members of the profession trying to
decide the answer to your trolley problem. Sorry, scratch that. The various
legislatures prescribing the answer and the profession trying to implement the
conflicting results!

------
lifeisstillgood
I often think _the_ most effective means to increase loan repayment rates is
to provide the debtor with an effective, accurate money-management tool - sort
of like Mint but better. I am supposedly a well-educated, intelligent software
engineer, and yet trying to get a single unified view of what I spent is
outrageously challenging, or requires discipline on the level of dieting.

Apart from fraudsters, people who take out a loan want to pay it back, but, as
with dieting, human failings cause the problems.

Just an instant check on what you have spent globally would make a huge
difference in budget management.

~~~
cjsawyer
That’s part of the problem. A budget is a plan, not a running total. You
budget what you can spend then change what you do spend to match. (If only it
were that easy, though. I’m human too)

~~~
lifeisstillgood
but an accurate and up-to-date running total of how you match your budget
helps enormously (even if the budget is the default of "don't spend it all").

------
teekert
I think it is important to realize that this is really a US thing. In my
country people have a mortgage and that is usually it. The rest, we save money
for and buy it when we have the money.

Our mortgages are pretty bad though, and rent combined with housing prices
pretty much adapts to the median income, imo: when rent goes down, housing
prices shoot up to compensate, bringing monthly costs back to roughly the
level where a median family needs two working parents to afford a reasonable
home in the city. Now, rent has nowhere to go but up, and housing prices will
fall again, I predict.

~~~
maccard
It's not really a US only thing - based on your username I'm assuming Dutch? -
a quick Google says that there was 6 billion euro in new consumer credit in
the Netherlands last year, and it's been much more than that even in the past
decade. People in the Netherlands seem just as likely to buy cars on finance
and rack up credit card debt as everyone else.

~~~
teekert
Oh.. must be my bubble then :)

~~~
brighter2morrow
Also, due to mass migration, the distinctions between countries are
disappearing. So traditional Dutch may behave exactly as you say, but if
immigrants from the same source country go to different countries (like the
Netherlands and the US) yet act similarly, they will push these economic
metrics toward each other across their host countries.

------
sbmthakur
[https://outline.com/zq5nfd](https://outline.com/zq5nfd)

------
diminoten
The key here being that non-financial data isn't actually useful in predicting
ability to repay loans. I'm sure it'd be used somehow by modern financial
institutions if it were predictive.

~~~
dvdbloc
I find this interesting because when I applied for a mortgage at Wells Fargo
they asked me for my race in the application process.

~~~
dd36
I get confused when they claim some of these ML or NN algos are black boxes
and so could be breaking the law. If you're not inputting illegal info (like
race, sex, national origin, religion, name, etc., or their correlates), then
it's not making its risk assessment on that basis. All you have to do is look
at the inputs. It isn't unknowable whether or not it's breaking the law.

~~~
emadelwany
Let me try to clear up your confusion: an input may seem innocent (e.g. zip
code), but a zip code is likely to correlate with ethnicity and race in some
regions. So even if the inputs seem legal, a sufficiently sophisticated ML
model can derive illegal results that discriminate against certain
populations.
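The mechanism is easy to demonstrate with a toy sketch (synthetic data, hypothetical zip codes and demographic numbers chosen purely for illustration): a "model" that only ever sees zip codes still produces very different approval rates by group, because group membership correlates with zip code.

```python
import random

random.seed(42)

# Synthetic applicants: two hypothetical zip codes, each skewed ~90%
# toward one demographic group (the assumed correlation).
applicants = []
for _ in range(10_000):
    zip_code = random.choice(["10001", "60601"])
    if zip_code == "10001":
        group = "A" if random.random() < 0.9 else "B"
    else:
        group = "B" if random.random() < 0.9 else "A"
    applicants.append((zip_code, group))

# A "model" that uses only the zip code -- a legal-looking input --
# standing in for a pattern learned from biased historical data.
def approve(zip_code):
    return zip_code == "10001"

# Approval rate by protected group, which the model never saw:
rates = {}
for g in ("A", "B"):
    members = [z for z, grp in applicants if grp == g]
    rates[g] = sum(approve(z) for z in members) / len(members)

print(rates)  # group A is approved far more often than group B
```

Auditing the input list, as the parent suggests, would show nothing illegal; the disparity only appears when you measure outcomes against the protected attribute.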

~~~
mieseratte
Are there any real-world examples of an ML system causing protected-class
discrimination based on non-protected criteria?

~~~
jetrink
A recruiting tool used by Amazon developed a bias against women despite not
being told candidates' genders. It penalized candidates who were graduates of
all-women's colleges and also those who had the word "women" in their resume
(e.g. “women’s chess club captain.”) It had been trained on resumes submitted
to Amazon during the previous ten years, so the tool's bias was likely
reflective of real human bias in Amazon's recruiting process.

1\. [https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G](https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G)

~~~
tareqak
From my reading of that article, I think the recruiting tool was fed resumés
along with a data point saying whether or not the corresponding candidate was
hired. As a result, the tool not only developed a bias against women, but was
effectively evidence that there was bias against women in the original hire /
not-hire decisions.

~~~
tareqak
I _just_ missed the deadline to edit my post, so I am replying to myself.

Looking at the parent comment again, I seem to have just restated it without
adding anything new of my own. I meant to add that my reasoning for why Amazon
pulled development of this tool was not just the tool's bias, but also that
the existence of the tool and its associated training data could open Amazon
up to litigation claiming that their hiring decisions were biased against
women in the ten-year span referred to by the article.

~~~
AnthonyMouse
It's interesting that nobody even bothered to check whether the bias was
illicit. They found something that sounds bad and the immediate response is
"OMG bad PR, pull emergency shutdown."

They just assume that "women's" is coding for female candidates and not
something more specific, like gender-segregated activities that may
legitimately produce lower quality candidates than the equivalent integrated
activities that exposed the student/candidate to a more diverse cohort.
Probably also doesn't help that some of the biggest gender-segregated
institutions are penal in nature, i.e. "reform school for troubled girls" or
"women's correctional facility."

Did anybody even check whether it also penalizes words like "boys" and
"gentlemen's"?

~~~
tareqak
To be fair, it would probably take more time and money irrespective of
litigation to be sure that there was illicit bias than to just delete
everything and call the exercise a failure.

------
zoboomafoo12
Using telco data to create credit scores based on prepaid sim activity in
developing countries: Juvo.com

------
tedmiston
[paywalled]

