
YouTube seeks 'responsibility' - pseudolus
https://www.latimes.com/business/story/2019-12-27/susan-wojcicki-youtube-responsible
======
pmoriarty
Of much greater concern is that the vast majority of videos on the internet
are on youtube.

It's a fantastic service, and I am awestruck by its potential, but letting one
corporation that's only interested in its own profit control all this
information is a mistake. Its curation snafus and conflicts of interest are
just symptoms of this.

~~~
doh
> Of much greater concern is that the vast majority of videos on the internet
> are on youtube

Nope. YouTube currently hosts around 6B videos (and adds around 1.2B a year).
Facebook is currently around 3.5B, Instagram around the same, Twitter around
2.5B, TikTok close to 1B, etc. The general internet is around 26B videos and
counting.

What makes Youtube important is the fact that most users think of it as a
search engine. You can always find it on YouTube, while it's "impossible" to
search for videos on Twitter or Facebook.

~~~
LunaSea
Facebook contains mostly copies of content hosted somewhere else.

Instagram / TikTok content is pure garbage unless you're into the teen /
faux-celebrity type videos.

Twitter has some interesting content but it's really hit or miss.

~~~
doh
> Facebook contains mostly copies of content hosted somewhere else.

Not true. YouTube hosts only around 3% unique videos; everything else is
copies, either from somewhere else or from each other. The same applies to
most platforms, except TikTok and Instagram. Surprisingly, those are the two
most innovative platforms when it comes to new content.

I'm not saying you have to like it, just that their content is unique.

42.98% of content on Twitter is reactionary (gifs/memes) and is under 5s. No
other platform has numbers like that. For instance, on TikTok only 0.59% is
under 5s, and 4.81% on Instagram.

~~~
DaveFr
Where are these stats coming from? I'm curious to find more.

~~~
doh
Look at what I do for a living [0]. Long story short, we see data on ~100
platforms and all of their content.

[0] [https://pex.com/web-analytics.html](https://pex.com/web-analytics.html)

~~~
LunaSea
This isn't proof and looks more like an advertisement.

~~~
doh
Not sure what kind of proof you would like. We built our whole company around
this kind of data.

------
jedieaston
It seems apparent that the video host and the video recommendation/curation
systems just need to be two entirely separate parts of the puzzle. The host
can be transparent, allowing anything that isn't unlawful to be posted, but
with no way to discover new content without already learning about it from
another source. (This should probably be decentralized, but IDK if the current
efforts are good enough for video at YouTube scale.)

Then, you can rely entirely on other social media outlets or an indexing
service (preferably one that can index video from multiple services, like TiVo
used to do and Apple is trying to do right now, or how torrent trackers work)
to give you access to it. If you think a given video curation system isn't up
to your standards, you could just switch to another one or build one yourself.

~~~
zxcmx
The conflict is that curation / recommendation is profitable because it
"closes the loop" on the skinner box, you can push towards content that will
keep people watching or that makes more money.

But Google doesn't want to be responsible for the "recommendations" its own
algorithms generate.

So they need to figure out a way to make responsibility for the actual content
somebody else's problem while still being able to "optimise for engagement",
a.k.a. directing the click stream.

~~~
chii
tldr; google wants to reap the rewards, but externalize the costs.

------
xg15
I think Silicon Valley - or the internet-using part of society at large -
urgently needs to discuss what "neutral platform" should mean in the first
place.

While freedom of speech and freedom of enterprise are important, we expect
clear responsibilities and basic safety rules in any other part of society.
Yet the "neutral platform" concept seems like a tool designed to do away with
responsibility altogether:

E.g., in case of disturbing videos available to minors, YouTube is not
responsible because they are a neutral platform, users are not responsible
because they are minors and uploaders, while responsible in theory, can easily
stay anonymous and unreachable for anyone. It seems pretty clear to me that a
system like this is easy to exploit.

------
ENOTTY
Revoking section 230 of the Communications Decency Act (CDA) seems like an
incredibly blunt hammer to apply to a problem that I’m not convinced anyone
fully understands yet. (Indeed the more cynical part of my brain believes
legislators are seeking to punish YouTube/Google rather than identifying a
harm and mitigating it.)

Surely there’s a scale by which liability can be applied. Suppose this
principle could be achieved: The more a company intervenes into a neutral
platform, the more that intervention is subject to regulation or liability.

When applied to the YouTube situation, it could be that YouTube faces no
responsibility in policing user uploads, but it does face some responsibility
in policing the output of the recommendation algorithm.

------
thepangolino
I love that this submission coincides with that other one:
"YouTube’s Rabbit Hole of Radicalization [pdf]"
[https://news.ycombinator.com/item?id=21902416](https://news.ycombinator.com/item?id=21902416)

------
tus88
Sounds almost as fun as "safety". The things these PR people come up with!

------
neonate
[http://archive.md/vG6CL](http://archive.md/vG6CL)

------
deogeo
The article does its best to avoid the phrases "free speech" and "censorship".
They get only a single, off-hand mention, as part of someone else's position.

~~~
shadowgovt
Nobody has any free speech rights related to Google-hosted content. If Google
pulled the plug on YouTube tomorrow and vended 0% of the videos it hosts,
there'd be no recourse for content creators. Nothing about the US
interpretation of free speech implies an obligation for YouTube to host any
content or no content.

The only reason a question of "censorship" matters is because the safe harbor
legal protections American content providers enjoy include a certain (poorly-
defined) objectivity regarding their hosting decisions.

~~~
darawk
> Nobody has any free speech rights related to Google-hosted content. If
> Google pulled the plug on YouTube tomorrow and vended 0% of the videos it
> hosts, there'd be no recourse for content creators. Nothing about the US
> interpretation of free speech implies an obligation for YouTube to host any
> content or no content.

People always say this when this topic is brought up. But nobody is talking
about the constitutional _right_ to free speech. We all understand that Google
is _allowed_ to do whatever it wants on its own platforms.

Free speech, however, is not only a right. It is an ideal. It is something
some people value independent of the constitution and law.

~~~
shadowgovt
Free speech is valuable for its utility; it helps hold power accountable and
can open up avenues for seeking truth of the world.

But it can also hinder those efforts, and speech in that direction is no ideal
to be upheld.

~~~
darawk
So, you're for free speech as long as it's the kind of speech you like?

~~~
shadowgovt
As long as it's the kind of speech that isn't harmful to people. Whether I
like it is immaterial.

The right to free speech doesn't protect the right to yell fire in a crowded
theater. Even the nations with the most liberal interpretations of the meaning
of free speech have put constraints on it.

------
throwGuardian
YouTube would never have gotten where it is if this newfound puritanical
quest had bitten it earlier. Looking back at this moment in its history, the
YouTube of the future will regret stymying what was a thriving platform, both
economically and in terms of freedom of speech.

If Google wants to turn YouTube into a Hulu/Netflix competitor, it should
concentrate on its Google Play video brand, and simply drive YouTube searches
there for certain mainstream-studio content.

