
A former Facebook engineer on algorithmic ranking for Facebook Trending - smoyer
https://www.linkedin.com/pulse/algorithm-transparency-paying-attention-man-behind-curtain-koren?trk=eml-b2_content_ecosystem_digest-hero-14-null&midToken=AQE5WL03u_Wx3w&fromEmail=fromEmail&ut=28uObBJ6PtfTg1
======
ucaetano
> _To Facebook’s credit, they’ve chosen to be radically transparent about
> this._

Wait, they chose to be radically transparent AFTER they were denounced?

Kinda like Theranos, which is being really honest about the quality of its
tests, AFTER they've been shown to be worthless?

~~~
maxerickson
I actually don't have trouble believing that they had failed to carefully
consider the (broader social) implications of what they were doing.

I find that more believable than the idea that they built an elaborate
system for manipulating user opinions.

If anything, I think they are biasing the feeds in ways that garner clicks,
like discussed here:

[https://twitter.com/zeynep/status/733397126026297344](https://twitter.com/zeynep/status/733397126026297344)

~~~
ucaetano
I agree, I don't see this episode as a systemic, top-down, news manipulation
initiative, but more as a deep lack of self-awareness and reflection on the
network's impact on what people see. And that hasn't been addressed yet.

------
jasode
_> What really made this article blow up was the revelation that humans were
in the loop._

To me, this post by Jonathan Koren doesn't really get to the heart of the
outrage by some observers. Yes, humans ultimately need to be in the loop for
content filtering, but the fact that they seemingly curate based on
_political_ agendas is the real controversy. Koren addressed the human part
but ignored the political.

If we just isolate the claim that "humans must oversee algorithms", I think
it's easy to convince people of it. User-generated stories and determining
their relevance look to be a real-world variation of The Halting Problem.
Therefore, you can't write a function f(x) where "x" is topics and "f(x)"
returns "good story" or "bad story". You can't just let the algorithm run
unattended. Hackers could exploit an unknown idiosyncrasy in the content
algorithm, and all of a sudden everybody's feed has "child pornography is
awesome" stories in it. It doesn't matter if you take the latest cutting-edge
text analysis algorithms from whitepapers and apply teraflops of TensorFlow to
them; humans will still have to be in the loop to keep unexpected behavior
from happening.

With that necessity of humans conceded, you then consider whether the curators
have abused their power through a pattern of bias. The other "guidelines" blog
post[1]
that he linked to doesn't actually address the _real_ transparency that people
would love to see: The list of (conservative) stories that curators sent to
the bottom of the heap.

Unfortunately, making the list of "suppressed stories" transparent can also
have the opposite effect of making them prominent... aka... The Streisand
Effect.

[1]Justin Osofsky, VP Global Operations: _" Facebook does not allow or advise
our reviewers to discriminate against sources of any political origin, period.
Here are the guidelines we use."_

An outsider's skeptical analysis would be: publicizing "internal guidelines"
is not the same as publicizing the actual list of suppressed stories so people
could judge for themselves whether the "no politics" guidelines were followed.

~~~
caseysoftware
> Koren addressed the human part but ignored the political.

Yes, but most people who are only sort of paying attention didn't catch the
political part. So now when someone says "Facebook has an agenda against X!"
the response - vocalized or not - becomes "no, they talked about that
already."

From a PR perspective, it's not a bad move.

Towards actually addressing the issue, not so much... especially since these
published guidelines don't include (or at least I can't find) the blacklist
numerous insiders have claimed exists, specifically around Twitter and
Facebook itself being in Trending.

------
spoiledtechie
What a crappy article. My outrage does come from the political bias, not the
algorithm. I have since ceased using Facebook.

~~~
jjuel
Do you rely on the trending sidebar on Facebook that much? If you were to stop
using everything due to a "political bias", you would probably have to move to
an off-grid cabin deep in the woods.

~~~
smoyer
Hmm ... [http://editions-hache.com/essais/pdf/kaczynski2.pdf](http://editions-hache.com/essais/pdf/kaczynski2.pdf)

------
badmadrad
It's nice to get a perspective from the inside, but the issue isn't human
intervention, it's rampant human bias.

~~~
Cacti
The issue is people get their news from Facebook...

------
smoyer
A note to the moderators ... The title I included above is what LinkedIn sent
in this morning's news feed. Since I think it's more descriptive than the
page's title (Algorithm Transparency), I used the feed title.

------
mkay581
_" if you want have computers do everything, for technical reasons, resource
limitations, and product positioning, you may want humans to oversee the
algorithms"_

whaaa?? That's exactly why you would NOT want humans to interfere.

 _"...because computers are incredibly stupid"_

LOL and humans aren't? Who is this guy?

~~~
maxerickson
The outputs of the system will reflect the choices that humans made about the
inputs and processing. It makes a whole lot of sense to have humans taking a
look at the outputs and making sure they are reasonable.

You obviously don't need humans reviewing the outputs of simple mechanisms,
but as your mechanisms get more complex and opaque, the pretense that they are
objective becomes ridiculous.
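One common pattern for keeping humans in that loop (a hypothetical sketch, not a description of Facebook's system; the threshold and names are made up): auto-publish only the outputs the model is confident about, and route everything else to a human reviewer.

```python
# Hypothetical sketch: confidence-gated human review of model outputs.
# `score` is the model's confidence (0.0 to 1.0) that a story is acceptable.

REVIEW_THRESHOLD = 0.9  # assumed cutoff; a real system would tune this

def route(story: str, score: float) -> str:
    """Decide whether a scored story is published or sent to a reviewer."""
    if score >= REVIEW_THRESHOLD:
        return "publish"
    return "human review"

print(route("routine local news", 0.95))   # confident: publish automatically
print(route("ambiguous edge case", 0.40))  # uncertain: escalate to a human
```

The point of the gate is exactly the one above: as the mechanism gets more opaque, its low-confidence outputs are where the "objective algorithm" pretense breaks down, so that is where human judgment is applied.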

~~~
elthran
This. You only have to take a look at what the internet did to Microsoft's
learning chatbot to see why humans are a good idea.

~~~
Kristine1975
OTOH I only have to look at Microsoft's chatbot to see why humans are a bad
idea, too ;-)

------
fintler
What does everyone think about LinkedIn's publishing platform?

------
shmerl
I can't access the article:

 _> Not a member? Join now_

