
Algorithmic accountability - __Joker
https://techcrunch.com/2017/04/30/algorithmic-accountability/
======
Breefield
TTW I asked Eric Schmidt if he had seen Gattaca:
[https://www.youtube.com/watch?v=KXk8MJGHaEg](https://www.youtube.com/watch?v=KXk8MJGHaEg)

For context, he was talking about letting algorithms run wild on our genome to
reduce risk of health problems and genetic disease, then using CRISPR to
reintroduce these benefits into our genes.

I wish I had asked the question more directly. The response I got was
essentially "we won't do that because society won't let us", when what I
wanted to hear was "one would need to be incredibly mindful about the ethical
dilemmas that arise when we let algorithms tell us what's best for us".

------
jawns
I was especially interested to read this after reading yesterday's NYTimes
story about an opaque, for-profit algorithm that is being used to help judges
sentence people:

[https://www.nytimes.com/2017/05/01/us/politics/sent-to-priso...](https://www.nytimes.com/2017/05/01/us/politics/sent-to-prison-by-a-software-programs-secret-algorithms.html)

HN discussion:
[https://news.ycombinator.com/item?id=14238786](https://news.ycombinator.com/item?id=14238786)

The algorithm discussed in the NYTimes story purports to predict a criminal
defendant's risk of violent reoffending, recidivism, and pretrial risk. It
apparently does so by examining historical data, looking at the outcomes of
other people who match the defendant's profile, and producing a probability
score on that basis.
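To make the aggregate-vs-individual distinction concrete, here is a toy sketch of that kind of profile-matching score. The actual product's method is proprietary and unknown; the features (`age_band`, `prior_felonies`) and the lookup logic here are invented purely for illustration.

```python
# Hypothetical sketch: a "risk score" computed as the observed reoffense
# rate among historical records matching the defendant's profile.
# Features and data are invented; the real algorithm is not public.
from dataclasses import dataclass

@dataclass(frozen=True)
class Record:
    age_band: str        # e.g. "18-25" (invented feature)
    prior_felonies: int  # invented feature
    reoffended: bool     # historical outcome

def risk_score(history, age_band, prior_felonies):
    """Fraction of matching historical profiles that reoffended."""
    matches = [r for r in history
               if r.age_band == age_band and r.prior_felonies == prior_felonies]
    if not matches:
        return None  # no data for this profile
    return sum(r.reoffended for r in matches) / len(matches)

history = [
    Record("18-25", 2, True),
    Record("18-25", 2, False),
    Record("18-25", 2, True),
    Record("26-35", 0, False),
]
print(risk_score(history, "18-25", 2))  # 0.666..., an aggregate rate, not a fact about this defendant
```

Note that the number produced is just a base rate for a group; nothing in the computation refers to the individual beyond their group membership, which is exactly the problem with treating it as individual risk.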

As I mentioned in my comment on that post, it is entirely inappropriate to use
an algorithm that, at best, predicts only _aggregate risk_ to impute
individual risk, especially when that probability is being used to determine
something as serious as a prison sentence.

And as this TechCrunch piece points out, it's not just this specialized
software that has potentially undesirable consequences, but the everyday
algorithms used by Google, etc.

One thing that would help is transparency. But we are very quickly getting to
a point where lack of transparency is not merely a business decision, but a
problem inherent in the technology we're using. A machine-learning model might
not be easily interpretable. Its own creators might not be able to fully
understand why it's making the decisions it's making.

We may soon get to an uncomfortable place where certain technologies are
effective, but nevertheless must be avoided because the cost of making them
sufficiently tunable to address ethical and regulatory requirements becomes
prohibitive.

------
amelius
Big companies should not be allowed to have our data. They should only provide
the means (i.e., hardware) to run algorithms on the data. Those algorithms
should be created by academia and not-for-profit organizations. This model
worked well in the early days of the internet, so why not return to it, if
necessary by changing the law?

~~~
danso
How would a company like Facebook work without having our data? Or Google, for
that matter? "work" as in actually function? Their services are only possible
through the storage and analysis of data.

~~~
amelius
Well, academia or government organizations will probably come up with some
kind of federated protocol for social media, just like they did with email in
the old days. This protocol can then be run by everybody who owns a server (or
cluster of servers, depending e.g. on whether the algorithm allows a peer-to-
peer configuration).

In this scheme, a company like Facebook would not exist. There could be
multiple providers (like hotmail/gmail for email), but it would be easy to
switch to a different provider. Or you could start your own (or government
agencies could do that if they feared abuse of their citizens' data).
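The email analogy can be sketched in a few lines: in a federated scheme, an address names both a user and a home server, so any provider can deliver to any other, and switching providers just means a new address. This is a hypothetical illustration of the general idea, not any specific proposed protocol.

```python
# Minimal sketch of email-style federation (hypothetical, for illustration):
# an address names a user AND a home server, so independent providers
# can interoperate and users can switch providers freely.

def parse_address(addr):
    """Split 'user@server' into its two parts."""
    user, server = addr.rsplit("@", 1)
    return user, server

class Server:
    def __init__(self, domain, network):
        self.domain = domain
        self.inboxes = {}
        network[domain] = self  # register so peer servers can find us

    def deliver(self, to_addr, message, network):
        user, domain = parse_address(to_addr)
        if domain == self.domain:
            # Local delivery to one of our own users.
            self.inboxes.setdefault(user, []).append(message)
        else:
            # Hand off to the recipient's home server, as SMTP relays do.
            network[domain].deliver(to_addr, message, network)

network = {}
a = Server("uni.example", network)
b = Server("coop.example", network)
a.deliver("carol@coop.example", "hello across providers", network)
print(b.inboxes["carol"])  # ['hello across providers']
```

The point of the sketch is that no single operator owns the network: each server only needs to know how to find the recipient's home server, which is what made email (and could make social media) provider-agnostic.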

~~~
bluesign
Federated protocols work in theory, but apart from email they didn't gain
traction and were all replaced by proprietary protocols. Email survived
because it was ancient and served as the main identity provider for the web.

Of course I wish a federated protocol could catch on, but it seems hopeless
given the current market situation.

