
An FDA for Algorithms? - denzil_correa
http://nautil.us/issue/66/clockwork/we-need-an-fda-for-algorithms
======
jstanley
Maybe a good idea, maybe not, but targeting "algorithms" is certainly too
broad. Everything that does anything is an algorithm. Will we have to get
government authorisation before we can quicksort?

This should be restricted to anything making decisions based on personal
information, if it is to be implemented at all.

Also:

> What digital ecosystem do you personally live in? Apple, Google, Microsoft?

> [...] There’s a big difference between the way that Google deals with their
> photos and the way that Apple deals with their photos. Apple photos are your
> photos. You keep them. When they do facial and image recognition on them,
> they use meta-level features of your data. The way that they’re
> communicating is in the most private way that they possibly can while still
> collecting the data they need.

You don't _need_ to live in some company's digital ecosystem. I don't. I own
my own photos, and Apple are not getting _any_ facial recognition features out
of them.

For someone who is clearly a strong proponent of privacy, it's disappointing
that she doesn't even bring up the prospect of opting out of these companies.

~~~
batty_alex
> You don't need to live in some company's digital ecosystem. I don't. I own
> my own photos, and Apple are not getting any facial recognition features out
> of them.

Workers, patients, and customers don't get this choice. I work on hospital
systems that use "AI" and we wouldn't survive a proper code audit for what we
claim saves lives. We get audited by the FDA, but they aren't combing through
our code - just the documentation that we write.

Sure, we describe the algorithms put together by our researchers, but even the
rank-and-file developers mostly just consume the algorithms that decide what
to tell the doctors taking care of the patients.

For now, at least, doctors and nurses mostly use our software as a
retrospective tool. I know we're not selling it that way, and I worry about
the day folks start using this system in place of their years of experience.

The statistics we put out say our software is x% better than the doctor at
making these decisions. Is that true? Who knows, but an audit to make sure
we're not selling snake oil might save lives.

> Everything that does anything is an algorithm. Will we have to get
> government authorisation before we can quicksort?

This just sounds like a strawman argument. If you say, 'our proprietary
algorithm will save thousands of lives from malpractice,' you better be ready
to back that up with an audit. It had better not be a bridge that falls apart
the first time an eighteen-wheeler crosses it.

~~~
riskneutral
A code-level government audit of all life-critical software sounds
impractical. There are already millions of lines of code in production on
FDA-approved devices (e.g. pacemakers?), so I am sure that standards and
certification processes around this already exist.

------
easytiger
> _It’s never been quite clear, she says, whether the phrase—which is
> frequently the entire output of a student’s first computer program—is
> supposed to be attributed to the program, awakening for the first time, or
> to the programmer, announcing their triumphant first creation._

It seemed perfectly clear to me. "hello world" is the program talking to the
user of the computer on behalf of the service provider who wrote the
application. The second lesson is then usually to request input from that
user.

> _We need to make algorithms transparent, regulated, and forgiving of the
> flawed creatures that converse with them._

Do we? Do we really? Pray tell why...

> _you could just put any old colored liquid in a glass bottle and sell it as
> medicine and make an absolute fortune. And then not worry about whether or
> not it’s poisonous. We stopped that from happening because, well, for
> starters it’s kind of morally repugnant. But also, it harms people. We’re in
> that position right now with data and algorithms._

No, we aren't. "Algorithms" is being abstracted here to mean, presumably,
large-scale, high-cost data-collection infrastructure, logistics, and
business. That's not algorithms; it's something else.

> _You can harvest any data that you want, on anybody._

No, you cannot.

> _You can infer any data that you like, and you can use it to manipulate
> them in any way that you choose._

No, you cannot.

> _you can roll out an algorithm that genuinely makes massive differences to
> people’s lives, both good and bad, without any checks and balances._

Presumably you mean sell optional, usually free, services that people might
want to use in return for access to their data. In reality there are only a
dozen or fewer companies with reach on the scale where you might even start
considering this a problem. Of course, not using their services, or applying
the various legislatively controlled options on those platforms, protects you
fairly well. Also not "algorithms".

Perhaps there is some idle-minded conflation with the risk associated with
data breaches?

> _A regulatory body that can protect the intellectual property of algorithms,
> but at the same time ensure that the benefits to society outweigh the
> harms._

Insidious at best. The "algorithm" behind academics keeping themselves
employed by intervening in something that doesn't need intervention is
certainly something I wish we could "regulate".

------
OnlineCourage
> You can harvest any data that you want, on anybody

Fear mongering. There are already laws and standards in place covering
certain kinds of data, such as GDPR, HIPAA, and PCI DSS, and HIPAA violations
can include prison time. Please be specific about what data, and what
processing of that data, causes harm and why, rather than saying, "computers
are scary."

~~~
simion314
But companies have tried as hard as possible to work around the GDPR, using
stupid tactics like popups where the option to refuse data collection is
hidden on a hard-to-reach settings page.

------
dunslandsboo
From the article:

"I went to go and give a talk in Berlin about this paper we’d published about
our work, and they completely tore me apart. They were asking questions like,
“Hang on a second, you’re creating this algorithm that has the potential to be
used to suppress peaceful demonstrations in the future. ... I’m kind of
ashamed to say that it just hadn’t occurred to me at that point in time. Ever
since, I have really thought a lot about the point that they made."

Not enough apparently

~~~
simion314
>Not enough apparently

Can you expand?

------
yason
I thought this was about algorithms and data structures which made me curious.

But the article is about how intentionally or accidentally sloppy data
analysis or processing should be regulated if it can have a large impact on
people or lead them to erroneous conclusions. The same could probably be
suggested for amateur statistics used to back up claims, any claims.

------
xg15
She has good points. However, when practically implementing this, I could see
the "any measure that becomes a target ceases to be a good measure" effect
biting back: I imagine many companies would suddenly become very eager to
insist that their algorithms are in fact not algorithms at all but "simply
computer code" or such - and a lot of hair-splitting will ensue about what
exactly is or isn't an algorithm...

(I'm not arguing this shouldn't be done, only that it would be messy)

~~~
mbel
Yeah, the only way I can imagine it working is forcing everyone to open source
their code. It doesn't sound very realistic.

~~~
carlmr
Well, you could technically demand that companies show their source to this
hypothetical organization. AFAIK the FAA will take a look at the software
that runs on planes, at how well you can trace requirements, and at whether
those requirements are tested. Still, Boeing and Airbus don't have their code
on GitHub.

~~~
mbel
Alright, I hadn't thought about that. Yes, certification is definitely
another option.

~~~
carlmr
Don't get me wrong. I think it's possible, but it would be so costly that you
would need to close nearly every software business in the US.

Safety-critical software costs about 100x as much money/time to develop as
non-safety-critical software. "Algorithms" are everywhere, so certification
costs would make any software shop as costly as automotive and aerospace.
That's just completely misguided. Countries without such restrictions would
take over the global software market quickly.

------
Communitivity
To evaluate whether this will work in practice I looked to the actual FDA and
the Patent Office. The FDA: budget issues, understaffed, susceptible to
political attacks. The Patent Office: budget issues, complete lack of ability
to make fair and balanced decisions on software patents (e.g., the linked
list itself was granted a patent:
[https://patents.google.com/patent/US7028023B2/en](https://patents.google.com/patent/US7028023B2/en)).
I also am concerned with the broadness of algorithms. Better might be if we
restricted it to algorithms in applications which could affect human safety or
financial transactions. Still, given the past examples, I'm doubtful this
would work well in practice.

------
morekozhambu
Wouldn't that put unnecessary regulation and bureaucracy into the path of
innovation?

Copyrighting/patenting algorithms and evergreening come to mind. Though
anyone with a PC can come up with a kickass algo, that's not the same as
coming up with a blockbuster drug.

------
toolslive
Anecdote: I once suggested a change to the Java standard library to turn an
O(n) algo into an O(1) algo. Now, more than 10 years later, it's still O(n)
(I just checked the std lib's source code). Nobody cares.

~~~
tpxl
This sounds interesting, can you expand a bit?

~~~
toolslive
[https://www.zgrepcode.com/java/oracle/jdk-8u181/java/lang/lo...](https://www.zgrepcode.com/java/oracle/jdk-8u181/java/lang/long.java#L-1445)

can be done with a De Bruijn sequence, as explained here:
[https://incubaid.wordpress.com/2012/01/24/number-of-trailing-zeroes/](https://incubaid.wordpress.com/2012/01/24/number-of-trailing-zeroes/)
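For readers unfamiliar with the trick: a minimal sketch of the constant-time
count-trailing-zeros technique (the class name `Ntz` and the structure here
are mine, not the patch the parent commenter proposed; `0x077CB531` is the
standard 32-bit De Bruijn constant from the linked write-up). Isolating the
lowest set bit with `v & -v` leaves a power of two, multiplying by the De
Bruijn constant shifts the constant left by that power, and the top 5 bits
then index a precomputed table:

```java
public class Ntz {
    // Standard De Bruijn constant for 32-bit words.
    private static final int DEBRUIJN = 0x077CB531;
    private static final int[] TABLE = new int[32];

    static {
        // Each 5-bit window of the De Bruijn sequence is unique,
        // so every shift maps to a distinct table slot.
        for (int i = 0; i < 32; i++) {
            TABLE[(DEBRUIJN << i) >>> 27] = i;
        }
    }

    // Number of trailing zeros in O(1): isolate the lowest set bit,
    // multiply by the De Bruijn constant, use the top 5 bits as an index.
    static int numberOfTrailingZeros(int v) {
        if (v == 0) return 32;
        return TABLE[((v & -v) * DEBRUIJN) >>> 27];
    }

    public static void main(String[] args) {
        System.out.println(numberOfTrailingZeros(0b101000)); // prints 3
    }
}
```

A handful of shifts and one multiply, with no loop over the bits; whether
that beats the JIT-intrinsified `Long.numberOfTrailingZeros` in practice is a
separate question.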

------
amadeuspagel
The track record of the actual FDA is probably relevant here:
[http://www.fdareview.org/](http://www.fdareview.org/)

~~~
zimpenfish
Although take that site with a pinch of salt, because it's from the
Independent Institute, which has a _strong_ pro-free-market stance.

[https://en.wikipedia.org/wiki/Independent_Institute#Position...](https://en.wikipedia.org/wiki/Independent_Institute#Positions_on_political_and_policy_issues)

------
jack_quack
I run an algorithm in my brain that tells me this is a bad idea

~~~
jack_quack
Actually maybe it's a good idea, my algorithm is probably pretty broken. It
should probably not be allowed on the market. Employers beware!

------
Dowwie
Look to the financial industry if you wish to see how this idea would develop

------
Beefin
Isn't this what NIST does?

------
maxerickson
How about an SEC instead.

~~~
emiliobumachar
Could you expand on why SEC-like regulation would be better than FDA-like
regulation?

~~~
maxerickson
The SEC does things like banning behaviors and requiring transparency, both of
which would help keep a lid on "algorithms" without really stifling
innovation, vs the FDA model where it decides if things are safe before
allowing them to be used.

I guess food regulation is like that compared to drug regulation.

------
aaaaaaaaaab
Ban bubble sort! Call your representative today!

------
booleandilemma
We need an FDA for magazine articles to prevent ones like this from being
published.

------
pmiller2
This has been submitted 8 times in the last 2 weeks. Here is the only other
submission that garnered any comments:
[https://news.ycombinator.com/item?id=18381917](https://news.ycombinator.com/item?id=18381917)

~~~
simion314
Intresting, I did not seen it, I am wondering why it did not stick on the
first page, the book was interesting and learned some new things.

