
YouTube is trying to reward “quality” content - okket
https://www.bloomberg.com/news/articles/2019-04-11/to-answer-critics-youtube-tries-a-new-metric-responsibility
======
keiferski
Frankly, I hope YouTube (and Google in general) dramatically overdo it with
the censorship, "curation", and deplatforming of "irresponsible" content,
purely so that it forces an independent, non-corporate model to be developed.

Having one company be the sole gatekeeper for the overwhelming majority of the
web's video content is _not_ a solution.

~~~
enitihas
I don't think any new platform is going to be able to compete with YouTube at
all with the current regulations. How will any new platform comply with new EU
copyright laws, or remove unlawful videos as quickly as YouTube? (e.g. the
unfortunate New Zealand terrorist attack)

~~~
Mirioron
They won't. I suspect that that kind of regulatory capture is the point of
these laws in the first place.

~~~
enitihas
I don't think so. Most people (and a lot of people on HN) really believe that
online platforms should be able to suppress such unlawful videos. And it's not
a completely unreasonable position.

~~~
stcredzero
 _Most people (and a lot of people on HN) really believe that online platforms
should be able to suppress such unlawful videos._

"Such unlawful videos." So basically, the government should be able to declare
content "unlawful" and it's okay so long as public sentiment goes along with
it?

Basically, you've simply thrown out Free Speech. I can understand why someone
would find the video of the shooting shocking. Personally, I've chosen not to
view it. However, if Free Speech isn't an absolute right, then it's simply
broken.

 _Habeas corpus,_ the Presumption of Innocence, and due process -- what if the
government could just suspend those things when it felt like it? Those human
rights would simply be broken.

The difference between a just society and an unjust one is that the just
society extends the same rights and protections to all its citizens, no
infringements, no exceptions. A society with Free Speech extends the right of
speech and the right to read/hear that speech to everyone without infringement
as well.

The interesting thing about the Weimar Republic is that almost all of the
laws covering human rights had an "out" for the government, allowing them to
suspend the right, "for the common good" in emergency situations. As a result,
the subsequent regime didn't have to change many laws to operate as a
totalitarian regime.

~~~
bonaldi
The Christchurch shooter had no more right of access to a privately-owned
live-streaming platform with an audience in the millions than he had right of
access to a broadcast TV network. Saying so (which I suspect is what OP
intended) is not "throwing out Free Speech".

You can tease apart the right to say something from the right to have it
signal-boosted along the way, and if you want to save Free Speech you're going
to have to, in case we throw the baby out with the bathwater here.

~~~
stcredzero
_no more right of access to a privately-owned live-streaming platform with an
audience in the millions than he had right of access to a broadcast TV
network_

If the government now has the right to say, "piece of content X -- not
allowed" arbitrarily, with no process and no rational standard, then Free
Speech is simply broken. An event of that magnitude is now history. A free
society should not be rewriting history. People are still free to not-view and
not-read. That right can't practically be infringed. What can be infringed is
the right to view and the right to read.

 _You can tease apart the right to say something with the right to have it
signal-boosted along the way_

As the internet is currently formulated, "signal boost" is equivalent to
discovery. Being able to manipulate discovery for 90% of the population is
tantamount to the power of censorship.

 _if you want to save Free Speech you're going to have to in case we throw
the baby out with the bathwater here_

In a fair system which truly supports Free Speech, in the end, signal boosting
a bad message becomes a good thing. If it's really bad and full of nonsense,
then there will be a reaction and backlash which will bury the bad message.
What we had in the years leading up to 2019 was a small group of activists
supported by a cultural minority (academia+media+tech) who signal-boosted
messages, including those of bad actors who hijacked activist messages. Then,
when the backlash came against the bad actors, there was an artificial
campaign of suppression, which encouraged more backlash against the
suppression.

Now, we are in a vicious cycle of censorship/suppression supported by
government and corporate power versus the backlash to that suppression. If we
want it to end, then we can let Free Speech do what it's supposed to, so the
backlash can go to where it belongs: Against bad actors hijacking good
messages and bad actors with really bad messages.

~~~
ABCLAW
>If the government now has the right to say, "piece of content X -- not
allowed" arbitrarily and with no process and rational standard, then Free
Speech is simply broken.

The key part of this is the word arbitrary. You can't take 'anything down for
no reason', but you certainly can take things down. Even the most robust
versions of free speech protections have carve-outs.

The government can prevent you from branding your email product 'Gmail', they
can censure you for shouting fire in a crowded theater, they can go after you
if you issue a tweet which lies about how many cars your car company is going
to build in the next quarter, etc.

~~~
stcredzero
 _The key part of this is the word arbitrary. You can't take 'anything down
for no reason'_

 _The government can prevent you from branding your email product 'Gmail',
they can censure you for shouting fire in a crowded theater, they can go after
you if you issue a tweet which lies about how many cars your car company is
going to build in the next quarter, etc._

All of those things have due process and objective standards attached to them.
None of them is arbitrary; they all follow established law. That's a far cry
from simply saying, "Okay, these messages are 'bad.' We're now de-platforming
them."

------
skilled
The Trending page has been listing the same people over and over and over
again for the last 3 years. It's shameful, pathetic, and makes the platform
feel extremely biased.

Seriously, though, this is unacceptable and overly controlled. Not to mention
that those same people have spent the last 3 years posting borderline
brainwashing content without _any_ connection to a real audience.

It's pretty sad to see YouTube turn into something like this.

~~~
jvagner
The Trending page makes me feel really old, and pretty "get off my lawn" w/r/t
what's popular, or being promoted, right now.

A lot of the channels I've followed have gone fallow. No new content.

That said, I think the arguments around this topic are pretty disingenuous.

1) There's nasty content being uploaded to Youtube -- YouTube isn't making the
nasty content.

2) YouTube promotes some content, people are pissed - it's all about the
moneti$ation.

3) YouTube creates a system to stop promoting some kinds of content, people
are pissed - censorship, slippery slope to de-platforming, etc.

There's no real alternative to YouTube (Vimeo isn't it, and the blockchain
YouTube isn't it) for:

a) creators,

b) creators who want to make money.

And looking at what Facebook, Twitter and YouTube have to deal with in the
social sphere, who'd want to inherit those problems? This is why we're still
going to end up with aggregated media empires, because they'll accept the
overhead of running the network, and it will simply get even more shallow in
terms of who supplies "content".

~~~
mdorazio
> b) creators who want to make money.

The only platform I've seen so far for this one to compete with YT outside of
porn is Patreon, and the videos that get monetized there without also being
posted and/or monetized on YT tend to be... less than wholesome in a lot of
cases.

~~~
kkarakk
Patreon only really works for people with an audience so enchanted with the
creator that they would follow them anywhere.

------
legohead
These companies focus so much on algorithms, when the best solution is simply
to use actual humans.

I've done a fair amount of work on trying to detect and deal with spammers,
fraudulent activity and such. _Nothing_ beats a human eye.

My most successful approach has been to filter out the super obvious stuff,
then send the rest to a queue that is kept an eye on by humans. A small staff
can easily fight back against bullshit. Yes, YouTube is orders of magnitude
above what I've dealt with, but I'm sure their smart engineers can filter out
the majority of stuff.

It's painfully obvious that YouTube lacks human eyes on things. That, or they
lack direction and empowerment of their employees.

~~~
canada_dry
> best solution is to simply use actual humans

Gee, how many humans might it take to review the >400 hours of video uploaded
every minute? (source:
[https://www.tubefilter.com/2015/07/26/youtube-400-hours-
cont...](https://www.tubefilter.com/2015/07/26/youtube-400-hours-content-
every-minute/))

~~~
Hussell
400 hours/minute is 24,000 minutes of video per minute, or 241,920,000 minutes
of video per week. Assuming a human can review 8 hours of video per day, 5
days a week (too high, but not by much), and that YouTube engineers can create
an algorithm that automatically marks 95% of video (probably low, 95% correct
labelling is easy to get; 99% is usually necessary to outdo humans), then you
would need 5,040 humans.

Automatically labelling 97.5% would halve that. If employees can only review 6
hours of video per work-day, then it would increase by a third. With both,
you'd need 3,360 humans.
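The arithmetic above can be checked in a few lines. This is just a sketch using the figures quoted in this thread (400 hours uploaded per minute, a 95-97.5% automatic filter, 6-8 review-hours per day); none of these are real YouTube numbers:

```python
# Back-of-the-envelope check of the headcount arithmetic above. All inputs
# come from the parent comments; none of these are real YouTube figures.

UPLOAD_MIN_PER_WEEK = 400 * 60 * 60 * 24 * 7  # 400 h of video per real-time minute

def reviewers_needed(auto_filter_rate, review_hours_per_day, days_per_week=5):
    """Humans needed to watch whatever the automatic filter doesn't catch."""
    leftover_min = UPLOAD_MIN_PER_WEEK * (1 - auto_filter_rate)
    capacity_min = review_hours_per_day * 60 * days_per_week  # per person, per week
    return leftover_min / capacity_min

print(round(reviewers_needed(0.95, 8)))   # 5,040 as above
print(round(reviewers_needed(0.975, 8)))  # half that: 2,520
print(round(reviewers_needed(0.975, 6)))  # 3,360
```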

~~~
mdorazio
And multiply that by, let's say, $50K/year fully burdened cost per employee
(assuming locating them in low-cost areas) and we've now added a minimum of
$125 million per year in overhead for a service that already probably doesn't
make much money.

~~~
sinatra
> we've now added a minimum of $125 million per year in overhead

Which is <1% of the revenue that YouTube generated last year? Still a
meaningful number, but not unmanageable.

~~~
mdorazio
At what profit margin, though? I've never seen actual YouTube numbers broken
out in a financial statement.

------
krisrm
I like the intention, and I hope to see more of this... but it's kind of hard
to imagine that the end goal here is "Responsibility", when the business goal
is "how many ads can we sell?"

~~~
Liquix
Valid concerns. What does Google gain from putting genuine effort into
stopping the spread of these kinds of videos? Perhaps more "responsible"
content and an eventual turn-around of the platform... But that's a big gamble
- may take years, may never happen.

In the meantime, they're bleeding man hours, money, ad impressions, perhaps
advertisers targeting the audiences which are being culled.

Now examine the question: What does Google have to gain by _appearing_ to put
a genuine effort into this while not actually addressing the problem?

They gain a whole lot more than in the first scenario - new users hopeful of
an increase in quality, support from privacy and attention-span advocates,
insight into how they can influence, suppress, or promote certain themes going
forward, more metrics to become more entrenched in the data harvesting machine
learning game...

------
jatsign
I wonder if they could look at groups of users who upvote "bad" videos (as
marked by their own internal reviewers), see what else they upvote, cluster
those videos, and demote them in their algorithm.

IOW, use the users who like "bad" videos to help find other "bad" videos via
an algorithm.
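A toy sketch of what that could look like. The function name, data shapes, and threshold are all invented for illustration, and as the replies point out, any real version would need fraud detection and user-trust scoring layered on top:

```python
from collections import defaultdict

def guilt_by_association(likes, flagged_bad, min_bad_likes=3):
    """likes: {user: set of video ids they upvoted}.
    flagged_bad: video ids marked "bad" by internal reviewers.
    Returns a per-video penalty to subtract from its recommendation score."""
    # Users who upvoted several reviewer-flagged videos become a weak signal.
    suspect_users = {u for u, vids in likes.items()
                     if len(vids & flagged_bad) >= min_bad_likes}
    penalty = defaultdict(int)
    for u in suspect_users:
        for v in likes[u] - flagged_bad:
            penalty[v] += 1  # each suspect upvote nudges the video down
    return dict(penalty)
```

For example, a user who upvoted three reviewer-flagged videos would add a penalty point to everything else they upvoted, while users below the threshold contribute nothing.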

~~~
twanvl
That sounds easy to abuse. Just create some users, have them like obvious bad
videos, then have them like videos of <insert competitor>.

~~~
ses1984
Literally any algorithm that simple is easy to abuse, you need to integrate
any algorithm with fraud detection and user trustworthiness assessment.

~~~
luckylion
Either Twitter figured that they don't care enough, or it's not that easy.
Mass-flagging is still a thing they haven't solved, and what are you going to
do? Kick out your users because they're trying to manipulate your powerful
algorithm to do their bidding? There are easier ways to make your audience
find a new platform.

~~~
ses1984
It's not that easy, the cost of doing it right is astronomical and most users
don't care.

------
bitL
Quality content takes time to create; many of the past changes in YouTube
algorithms explicitly pushed for churning out low-quality content frequently
and rewarded it, while seriously disadvantaging irregular quality content
(that moved elsewhere in the meantime). You get what you reward.

~~~
degenerate
This can be seen easily if you go to YouTube and search "fortnite": every
single top video is just over 6 minutes long, full of vapid nothingness, with
perhaps 20-30 seconds of content actually related to the title that baited you
into watching. The majority of what can be found through searching/filtering
on YouTube is also hot steamy garbage, or large corporate channels. The
recommendation engine sometimes throws in some gems, but rarely. The only
video-length filters are "short (<4 mins)" and "long (>20 mins)". Why on
earth? It's like they randomly tried those values in 2008 and never looked at
them again.

~~~
rchaud
Kids watch that stuff and don't yet know how to sift through the trash to get
to the good videos. They're like us in 2007 when we'd all get Rick Roll'd by
blindly clicking on YT links placed on forums.

By setting their KPIs around "view time", YT created a product which caters
almost exclusively to kids and conspiracy nuts, aka the only 2 demographics
that have seemingly unlimited time to watch cartoons or people talking into a
camera.

------
bitxbit
Please just give us the ability to blacklist channels. That’d increase
usability by 10x for me.

~~~
eropple
AFAIK you can do that? Hit the menu on a video recommendation, "Not
Interested" -> "Tell Us Why" -> "I'm not interested in this channel".

IMO, though, blacklisting has been mostly proven to be insufficient. If you
block, say, noted chud Sargon of Akkad, that doesn't mean that another half-
dozen "Rationalist" "Thinkers" Who Just Happen To Be Chuds won't be
recommended to you as well.

~~~
zwkrt
If you are getting suggested these things it's because people with similar
viewing habits as you watched them. I get suggested riposte videos to such
content without ever seeing the original "bad" video. I'm assuming this is
based on my own habits on the site. I would prefer getting neither, but here
we are.

To claim that blacklisting would be ineffective because there is always more
trash is disingenuous. If everyone blocked (e.g.) PewDiePie, surely they would
get less YouTube talking-head content.

It's like saying that we shouldn't/can't avoid candy because people are always
making new candies.

~~~
eropple
It's well-documented at this point that YouTube's recommendation algorithms
enjoy feeding fascist chuddery to people who like things like video games or
history, yeah. But I didn't say you shouldn't do it. I do it, because one
fewer garbage channel is one more chance for something less garbage. But it
certainly seems like it's bailing against the tide.

------
noego
Honestly, it's pretty ridiculous that YouTube is being pressured to police the
content on its platform. What worries me far more than extremist videos on
YouTube, is the idea of a mega-corporation exerting outsized influence over
the opinions we are exposed to. Is our democracy really so fragile that it
cannot survive an open marketplace of ideas?

~~~
CharlesW
> _Honestly, it 's pretty ridiculous that YouTube is being pressured to police
> the content on its platform._

It's not "ridiculous", it's table stakes. If you ever run an open community,
you'll find that bad actors are inevitable.

~~~
noego
What are these bad actors doing that requires top-down censorship and
marginalization?

~~~
fzeroracer
If you have to ask that question, then you've never been a moderator or run a
platform before.

There is a lot of content that is invisible to you, the user, because it ends
up being filtered out by moderators: overt racism, calls to violence, outright
trolling, stalking and more. Even on a small private platform I moderated,
there was a user who was stalked by someone who would repeatedly re-register
accounts solely to harass them or dig up whatever personal details they could
find.

Eventually you have to establish a certain level of moderation for your
platform or those bad actors can and will chase off all of the other users for
one reason or another. This gets worse as a platform scales and the level of
malicious content your platform is exposed to grows exponentially.

~~~
noego
None of the problems you've raised are related to YouTube's proposed changes
in the linked articles. They already have rules against most of the things you
mentioned. The proposed changes go far beyond them.

[https://www.youtube.com/yt/about/policies/#community-
guideli...](https://www.youtube.com/yt/about/policies/#community-guidelines)

> _Starting in 2012, YouTube rebuilt its service and business model around
> “watch time,’’ a measure of how much time users spent viewing footage. A
> spokeswoman said the change was made to reduce deceptive “clickbait” clips.
> Critics inside and outside the company said the focus on “watch time”
> rewarded outlandish and offensive videos._

YouTube's engine is currently ranking content based on the amount of time
other users are spending actively watching that content. Unwanted videos are
already de-prioritized by the current algorithm. The proposed changes are
explicitly intended to de-prioritize videos that people are actively watching.

It remains to be seen how they will measure "quality". If they find a
bias-free way to measure it, I'm all for it. Most likely, though, it will be
driven by top-down notions of "outlandishness" and "offensiveness", as opposed
to bottom-up user engagement.

~~~
fzeroracer
You asked what kind of bad actors an open platform might have to deal with and
I gave you an example with a very small platform.

Now scale that up to match YouTube and it becomes a nightmare. YouTube is
being pressured to take action by its users because what's happening is
exactly what I was talking about: bad actors are taking advantage of the
platform. That's where Elsagate came from, as well as the more recent
revelation of pedophiles using the platform to groom kids.

There is no unbiased way of solving this problem, because you have to
establish certain things as being bad for your platform, which means you are
going to be biased against them. YouTube's reaction, in turn, is an attempt to
solve these issues at scale when they clearly didn't consider how the platform
would scale in the first place.

~~~
noego
The "bad actors" that you're referring to on YouTube aren't harassing anyone
or inciting violence or breaking the YouTube community guidelines in any way.
They are simply espousing opinions that you dislike and disagree with.

Sure, YouTube is allowed to do anything they want. But suppressing an open
marketplace of ideas isn't in society's best interests. If unpopular opinions
were suppressed in the past, the movements for women's rights and gay rights
and civil rights would have faced a massive setback. And let's not even get
into the question of whether it's in society's best interests for a handful of
corporations to arbitrarily decide how to censor the marketplace of ideas.

~~~
fzeroracer
I'd like you to take a minute and go visit sites like Voat or Gab. Just hang
around there for a bit if you haven't. Those are sites that are exactly what
you want: a pure and open marketplace of ideas without any sort of censorship
or rules. This is to prove a point.

Which is that an open marketplace of ideas has no value in itself, because the
marketplace can be very easily taken over by bad actors if you don't exert
some control over your userbase.

~~~
noego
Clearly YouTube has been extremely valuable even before they felt the public
pressure to penalize "low quality content".

Your comment about an "open marketplace of ideas having no value in itself" is
very curious. The very idea of freedom of speech being a good thing is
predicated on the idea that an open marketplace of ideas is a good thing. If
it isn't, you may as well lobby the government to ban any and all speech which
you consider corrosive.

" _When men have realized that time has upset many fighting faiths, they may
come to believe even more than they believe the very foundations of their own
conduct that the ultimate good desired is better reached by free trade in
ideas--that the best test of truth is the power of the thought to get itself
accepted in the competition of the market, and that truth is the only ground
upon which their wishes can be carried out. That, at any rate, is the theory
of our Constitution. It is an experiment, as all life is an experiment. "_

― Oliver Wendell Holmes, Jr.

~~~
fzeroracer
Did you actually visit Gab or Voat? I'm asking you this because if you
haven't, then you can't actually argue in favor of the 'marketplace of ideas'
very well. Either that or you're not willing to argue in the defense of such
sites. Or you're possibly being disingenuous, considering we've switched from
corporate curation/censorship of their content towards government censorship,
which are two very different topics.

As I mentioned, places with zero censorship and zero moderation can be, and
are, very quickly overtaken by malicious actors. Either way, if you're willing
to defend Gab and Voat on their merits as an 'open marketplace of ideas' after
having gone there, then we can continue our argument.

~~~
noego
The concept of "marketplace of ideas" goes far beyond Gab and Voat. Two bad
apples and cherry picked examples do not make for a counter argument.

------
malloreon
whatever gets them to stop recommending me BEN SHAPIRO [EVISCERATES /
DEMOLISHES / LEAVES SPEECHLESS] LIBERAL [PROFESSOR / SMART PERSON / SPEAKER]
OVER [SOMETHING BEN IS CLEARLY WRONG / TROLLING ABOUT] videos.

~~~
rchaud
The majority of my YT view history is music. 90% of my recommended videos are
similar genres of music, but there is nearly always some political/culture
war/conspiracy video sitting there as well. The other day it was "The REAL
truth about the Vietnam War". Super relevant in 2019.

Fortunately, I can block channels from my Recommended, but there are hundreds
of similar channels out there, so killing one just means it'll be replaced by
the next nutjob channel in the queue.

------
bobbygoodlatte
I think when critics accuse YouTube of "stepping on the scale" here, they
under-consider how warped that scale is already. The existing YouTube
algorithm isn't neutral or fair, and is laughably easy to game. It's already
biased—it's just biased towards extremes.

YouTube waded into this moral quagmire a long time ago. Being biased towards
"engagement" is just as troubling as having a more human-relatable bias.

Now they're starting to take responsibility for the recommendations this
algorithm spits out. I think that's a good thing.

------
a012
How about a metric that does NOT automatically take down videos on request, no
questions asked?

------
driverdan
There are so many simple rules that would improve their recommended content.
Examples:

1. Lower rank for clickbait titles

2. Lower rank for listicle / compilation / clip videos

3. Delist listicle / compilation videos with stolen content (most of them)

4. Lower rank for clickbait thumbnails

5. Lower rank for YT drama videos

6. Increased rank for 100% original videos
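Taken together, rules like these amount to a scoring pass over each candidate video. A toy sketch follows; the rule names, weights, and boolean flags are all invented here, and actually detecting, say, a clickbait title is the hard part being hand-waved:

```python
# Hypothetical rule weights; "delist" is modelled as negative infinity.
RULES = [
    ("clickbait_title",     -2.0),          # 1: lower rank
    ("listicle_or_clips",   -2.0),          # 2
    ("stolen_content",      -float("inf")), # 3: delist entirely
    ("clickbait_thumbnail", -1.0),          # 4
    ("yt_drama",            -1.0),          # 5
    ("fully_original",      +3.0),          # 6: boost
]

def adjusted_rank_score(base_score, flags):
    """flags: set of rule names that a (hypothetical) classifier applied."""
    return base_score + sum(delta for name, delta in RULES if name in flags)
```

So a fully original video with a clickbait title would still net out slightly ahead of a neutral one, while anything flagged as stolen drops out of ranking entirely.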

~~~
dredmorbius
Channel blocking, with recommendation weighting.

If I get recommendations from a channel producing shit, I should be able to
block the entire channel.

Sufficient good-faith blocks dock that channel's recommendation weighting.

The fact this isn't available even when logged in removes all incentive I have
to log in to YT.

(Which is already near nil.)

------
rhizome
This seems like a re-implementation of television ratings and survey (Nielsen)
methods. In this way, it's interesting how strategies on the commercial
internet gravitate toward business models with a century of momentum (and
student loans) behind them. Business chips away at the revolution.

------
prolepunk
> The video service generates most of its revenue through advertising and the
> business works best when as many people as possible are spending as much
> time as possible on YouTube. Hence executives’ obsession with engagement
> stats. Adding murkier metrics to the mix could crimp ad revenue growth.

This is the core point of it. I've been critical of YouTube for some time, but
if they are willing to forgo some revenue growth in order to deal with the
current problems, I applaud them.

> The YouTube channel Red Ice TV broadcasts political videos with a “pro-
> European perspective,” which critics label as promoting white supremacy.

Bloomberg, you can call them Nazis. This is not a racist uncle that everyone
needs to deal with, who can sometimes be reasoned with. These people are
literally Nazis.

> YouTube declined to share details on how it uses metrics to rank and
> recommend videos. In a January blog post, the company said it was hiring
> human reviewers who would train its software based on guidelines that
> Google’s search business has used for years.

It's going to be tricky to get this to work, and I'm sure google will make
many mistakes by automating decision making to a degree where it will prove to
be meaningless, but finally there's going to be some human moderation. Thank
you.

~~~
freeone3000
I think a key thing overlooked is "human reviewers who would train its
software...", which usually means farming the work out as human intelligence
tasks on Mechanical Turk, retrieving the results, and feeding the data into
the machine learning algorithm as input. This is not the same as having a
human look at videos.
------
kadendogthing
Well the reality is that you don't get quality content generally when you
allow people to publish whatever they want. Everyone has an opinion and
viewpoint. That doesn't make them qualified opinions or viewpoints, and
definitely doesn't make them worth considering. People have significantly
mistaken and waaay overestimated the benefits of mass media in the hands of
your everyday person.

It's becoming pretty clear the cost is not worth any _perceived_ benefits
(because I don't think you can really point to any benefits such models have
given us).

And no, moderating and editorializing content is not dystopian. Whenever
someone has that talking point it's really a case of someone read 1984 (and
sometimes didn't actually read it) and decided to apply it to everything with
absolutely no discretion and absolutely no respect for what the book actually
said.

------
coldtea
> _Creating the right metric for success could help marginalize videos that
> are inappropriate, or popular among small but active communities with
> extreme views._

So, deplatform what's not mainstream, accepted by the establishment, favorable
to Google...

~~~
rwj
I don't think that deplatform is the right idea here. Presumably, the videos
will still be present and watchable. However, the algorithms used to sort and
present videos must be seen as playing an editorial role, and that brings a
different set of questions.

~~~
maceurt
That is the equivalent of deplatforming in a lot of ways, especially on
youtube where millions of videos are uploaded daily. If your videos are not
suggested, and do not show up high in trending or search results, then they
practically don't exist except from word of mouth and outside promotion.

~~~
krapp
Then by definition most videos on Youtube are already "deplatformed," since
the space for recommendations and the front page is limited. Why is this
suddenly a problem?

~~~
maceurt
False. The trending and suggested pages are filled based on Google's
algorithm, which determines which videos the user will like/watch the most.
That is the only fair way to determine trending and suggested videos.

~~~
krapp
How is that fair? Google's algorithm is biased towards what brings revenue for
Google. Content that isn't easily monetizable is suppressed, and content
creators are punished for not conforming their content to what the algorithm
wants.

~~~
maceurt
I never said Google should rank videos based on monetization value, only based
on watch time and how well they predict a user will like a video. That is the
only fair way, imo. YouTube mostly ranks based on watch time and machine
learning to determine how well a user likes a video; monetization seems to
play only a small part.

------
AlexB138
What a dystopian nightmare the internet is turning into. Anything that isn't
in line with "right thought" isn't Responsible and is therefore hidden away?
How did we end up here? How did Silicon Valley go from a place focused on
ideas and building the future to generating systematic censorship and endless
puritanical moral panic? We badly need some new blood guiding tech
development.

~~~
krapp
Why is it a dystopian nightmare for Youtube to curate its content according to
"quality", but it wasn't a dystopian nightmare for Youtube to curate its
content to maximize engagement for ad revenue?

~~~
beaconstudios
both are pretty bad options. Better to let users curate and allow popularity
to dictate visibility, surely?

~~~
krapp
User curation and popularity are, arguably, just as much forms of censorship
as any other, although a "trusted user" curation system similar to Steam might
be interesting. The only truly unbiased system would be random.

~~~
beaconstudios
that only really holds if you consider "not promoting" to be a form of
censorship - at the end of the day, as long as they're not actively hiding
content that YouTube finds to not be "quality" (while still being reasonable
content), it can still be found by someone looking for the subject. I don't
see an issue with that - there will always be limited space for "promoted"
content.

------
monkeybrainsrus
At this point, they should just be regulated by the various world governments.
Though I don't like a world where Google can override a given country's laws,
or where Google tries to solve the issue for the union of all the world's
governments' laws. I also don't like the possibilities of hidden metrics
driving things that have such huge social implications.

~~~
Liquix
Exactly. If things keep going at this rate Google and Facebook will be
_shaping and distorting reality_ for a large portion of the population.

If 95% of your internet traffic goes through a search box or algorithm... And
you spend 6 hours a day on the internet... And you're not savvy to these
issues (as our readers are)... That's almost 50% of waking life spent
consuming manufactured data which your brain (on some level) interprets as
"real" without question. Terrifying.

------
rc_kas
In theory this sounds a little better. Obviously there is some distance to
cover in implementing it in reality.

------
exabrial
On a side note, creators on YouTube ought to [at least informally] unionize.
There's no reason Adam Neely should be getting copyright "strikes" for
explaining music to the masses, or Cody's Lab should be forced to take down
videos on mining with explosives [on his family's ranch]. YouTube needs a
hotline for problem-solving for its most profitable content providers, and
until the creators collectively hit the pocketbook, they'll keep facing these
disruptive yet easily solvable problems.

~~~
criley2
MCNs are basically the private version of this.

Content creators join MCNs ("Multi-Channel Networks") with the idea that the
MCN will handle some of the more administrative and legal tasks while the
content creators focus on content.

MCNs try to help with copyright strikes and often do have a direct line to
YouTube due to their size and importance.

~~~
busterarm
Except MCNs sit as middlemen between you and getting paid. The Defy Media
debacle taught everyone a hard lesson.

For those unaware, Defy Media went bust and had taken out substantial loans to
operate. A large number of big creators are still unpaid while Defy Media's
creditors carve up the company's assets.

All those creators are going to get left unpaid.

~~~
criley2
Fair enough, but to bring this case up without mentioning mandatory union dues
and the costs of unionization feels disingenuous.

Unions very much are a middleman between labor and management, and the
collective bargaining agreement that this potential Creators Union would sign
would hamstring a creator's freedom in this space pretty significantly as
well.

Is what it is, both choices have positives and negatives from my perspective.

~~~
busterarm
Yeah, but at least unions don't take your entire paycheck up front before
giving you your cut.

------
bryanrasmussen
So wait, they're trying to be Vimeo now?

------
sebastianconcpt
How is designed responsibility not censorship?

------
JohnFen
Yeah, we'll see how well that goes. I can't help but suspect that YouTube's
definition of "quality" and mine are not quite the same.

~~~
commandlinefan
YouTube's (and Google's) management has been very openly left-leaning, so it's
not a stretch to imagine that "quality" as defined by their editors will also
be left-leaning or at least not right-leaning... so, yeah.

~~~
JohnFen
From my perspective "quality" is unrelated to political bent.

