
Nancy Pelosi and Fakebook Dirty Tricks - mitchbob
https://www.nytimes.com/2019/05/26/opinion/nancy-pelosi-facebook-video.html
======
friggeri
I'm very uncomfortable with the current trend of trying to get big tech to
self police. They'll never get it right because of moving goal posts and
moreover I don't want a private company to have that power in shaping social
discourse.

If we as a society deem that type of content to be improper and that it should
not be allowed on such platforms, then let's make that explicit from a legal
perspective and regulate them.

~~~
daenz
>that type of content

Therein lies the problem. The video was slowed down and the audio slightly
distorted. I don't see a set of metrics that can capture that "type" of
alteration that doesn't also capture tons of content. Unless, of course, you
go down the route of saying "it's illegal to post any altered video or image
of any politician."

~~~
sgt101
What would be the problem of making it illegal to alter video or images of
politicians?

~~~
cr1895
It plainly violates the 1st amendment.

~~~
AlexTWithBeard
This, and also a video sometimes has to be edited to make it watchable:
removing pauses, trimming long sentences a bit...

------
Nition
I don't fully understand why people think Facebook should be policing content,
yet no-one thinks the phone company should delete SMS messages that are
spreading misinformation, or the postal service should destroy false mail etc.

Is it because Facebook makes it much easier to share something with a wide
audience? Is it because "the algorithm" means Facebook has more responsibility
than if it were a simple chronological feed?

~~~
khuey
tl;dr: yes, it's "the algorithm"

Facebook (and Youtube et al) have chosen not to be content-agnostic
distribution platforms (unlike the postal service and SMS). They perform a
variety of editorial functions to curate content and provide recommendations
with the objective of maximizing engagement and hence advertising impressions
that they can sell. This is popularly known as "the algorithm".

Given that these platforms have chosen to selectively display content for
profit, many people believe that they should have some sort of responsibility
for the editorial decisions they make.
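The ranking described above can be illustrated with a toy sketch. All the names, fields, and weights here are hypothetical; real feed-ranking systems are vastly more complex, but the core idea is the same: order the feed by predicted engagement rather than by time.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    predicted_clicks: float   # a model's guess at engagement (made up here)
    is_sponsored: bool = False

def rank_feed(posts):
    """Order a feed by predicted engagement, with sponsored posts boosted.

    A toy stand-in for what commenters call "the algorithm": the platform
    chooses an ordering to maximize engagement and ad impressions instead
    of showing posts chronologically.
    """
    def score(p):
        boost = 2.0 if p.is_sponsored else 1.0   # arbitrary illustrative boost
        return boost * p.predicted_clicks
    return sorted(posts, key=score, reverse=True)

feed = rank_feed([
    Post("friend", "vacation photos", predicted_clicks=0.3),
    Post("page", "outrage-bait headline", predicted_clicks=0.9),
    Post("advertiser", "buy this", predicted_clicks=0.5, is_sponsored=True),
])
print([p.author for p in feed])  # engagement ordering, not chronological
```

Even in this trivial form, the feed surfaces whatever scores highest, which is the editorial choice the comment is pointing at.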

~~~
Macross8299
Facebook isn't an editor, though, it's a curator. You use them interchangeably,
but I think there's a crucial difference. An editor modifies a work to inject
their own standards into it. A curator just includes or excludes it as-is.

~~~
sgt101
Facebook does modify content at the level of the feed: first, it includes
sponsored posts; second, it promotes posts to people, leading them to promote
those posts to others.

------
ar-jan
Interesting how they're sidestepping the actually rather disfluent speech from
Pelosi that prompted questions. Compare
[https://twitter.com/robbystarbuck/status/1131242531503534080](https://twitter.com/robbystarbuck/status/1131242531503534080)
with the C-SPAN video: [https://www.c-span.org/video/?460990-1/speaker-pelosi-
presid...](https://www.c-span.org/video/?460990-1/speaker-pelosi-president-
trump-took-pass-infrastructure-bill) (e.g. from around 3:00): not edited.

~~~
ar-jan
My my Hacker News! How dare I link to an actual C-SPAN video, huh? Downvote
away! Ignorance is bliss!

~~~
dang
Please don't break the site guidelines by posting like this.

[https://news.ycombinator.com/newsguidelines.html](https://news.ycombinator.com/newsguidelines.html)

------
rlt
A bit ironic coming from Kara, who jumped on social media to condemn the
Covington Catholic students before the whole story was out, but at least she
admitted her mistake
[https://twitter.com/karaswisher/status/1087443815269584897](https://twitter.com/karaswisher/status/1087443815269584897)

~~~
Simulacra
The tech media is just as much about gotcha stories as the rest of the media.
Worse, in fact, because the tech media is absolutely dependent on its access
to the tech industry, and must always be careful to curb its reporting so as
not to lose that access.

------
manfredo
I don't think the idea that companies should host all content except for
illegal content is a better approach. Porn is legal, as per the first
amendment. Does that mean every social media platform is obligated to allow
it? Furthermore it would eliminate the possibility of themed platforms. Things
like pixiv and DeviantArt would have to allow content totally unrelated to
art.

The whole discussion is moot, at least without a significant reinterpretation
of the First Amendment. Remember, the First Amendment also
protects against compelled speech. The government cannot mandate people or
companies to make speech it does not want to. There's some talk about whether
making these social media platforms utilities could allow the government to
compel speech, but that seems far fetched at the moment.

~~~
ianai
Actually, it’s entirely possible to have different legislation for different
sized organizations. The tax bracket for individuals is an example.

I’m more of the opinion that Facebook should be punished as an example. Don’t
grow so big that your unilateral action could have devastating effects on
society. Don’t do things with people’s data that remotely rise to the 1984
totalitarian level. See how fast we’d get publicly owned holding companies of
regional Wells Fargo/BoA-type banks, Facebooks, credit reporting agencies, etc.
It’s in line with the concept of having different forms of government at the
local, regional, and federal levels.

~~~
manfredo
> Actually, it’s entirely possible to have different legislation for different
> sized organizations. The tax bracket for individuals is an example.

Still illegal as per the First Amendment. The government cannot compel speech,
regardless of company size.

> I’m more of the opinion that Facebook should be punished as an example.
> Don’t grow so big that your unilateral action could have devastating effects
> on society. Don’t do things with people’s data that remotely rise to the
> 1984 totalitarian level. See how fast we’d get publicly owned holding
> companies of regional Wells Fargo/BoA-type banks, Facebooks, credit
> reporting agencies, etc. It’s in line with the concept of having different
> forms of government at the local, regional, and federal levels.

I'm not sure how any of this has to do with content moderation.

~~~
ianai
Facebook has a huge audience. They make it easy for one entity to target a
huge number of people with customized content per person. And yet people have
some weird affinity for believing anything they see on it. If you instead had
to work with several hundred Facebooks across the world to target everybody,
it becomes more federated. The local Facebooks would have more knowledge,
incentive, and resources to police their content for abuse.

~~~
ianai
I can imagine a few ways this may be legally executed. One is to use the 1800s
anti-trust laws to forcibly break up Facebook. Create the smaller entities I
described in the process and say “this is the standard for a social media
company.” Another would be to endow or reinforce citizens private data as some
form of personal property. Give them the legal means to sue and demand payment
for any storage of their PI. Make the costs of offenses steep. I am definitely
not a lawyer, though, and not as up to date as I should be for this sort of
spitballing.

Edit: I think they could also attach a temporal component and a per-instance
cost to storing someone’s “PI”. That value could then be a basis for taxing
the corporation, creating an incentive to “forget”.

Edit 2: make a person’s likeness and social media posts part of this “PI”.
This would give Pelosi a legal right to the doctored content.

~~~
BaronSamedi
I agree that breaking up these companies is the best option. I can understand
the calls for regulation but that is only treating the symptom and will have
negative side-effects. The size and position of companies like Facebook leads
to a concentration of power which is almost never desirable.

------
_cs2017_
What should be the rule about what videos are banned? We don't want to ban
parodies I suppose? What if the author of the Pelosi video claims it's a
parody?

It would be great if we could allow parodies and jokes as long as a reasonable
person can tell that this is indeed a joke. But that won't work since people
disagree about what's "reasonable".

I guess we can require that parodies and jokes are tagged as such, so people
don't confuse them with real stuff. Is that too overbearing?

Are there any better solutions that have been proposed?

~~~
freewilly1040
It’s not as though you can use the magic p word and then what you did is
parody. What’s parody and what’s not is indeed a difficult line to define, but
it’s not a new problem or a problem that we lack prior art for.

~~~
_cs2017_
> it’s not a new problem or a problem that we lack prior art for.

The prior art I'm aware of comes from the copyright, trademark, and defamation
laws, but the definitions developed there are so vague that it can take a few
months in court (and thousands of dollars) to decide whether something is a
parody / satire. Here we need a rule that can be applied within hours, at a very
low cost. So I'm not sure how we can adapt existing legal definitions to
banning videos on the internet.

Could you describe in a bit more detail what exactly you propose?

------
kneel
Facebook didn't doctor the video, out of all the content they could be blasted
on this is pretty trivial.

This is part of a larger trend in MSM to pry eyeballs away from social media
and back to traditional one way media.

~~~
Traster
Facebook knew the video was doctored; it was a mainstream news story for days.
They chose not to content ID it, but instead to continue spreading it. That
seems deliberate.

~~~
makomk
It was also a mainstream news story for days that the White House spread a
doctored video which was sped up to make it look like a CNN reporter attacked
a White House intern. Thing is, it was trivial to verify that wasn't at all
true: if you actually compare the speed with the original frame by frame it's
not sped up anywhere. The artifacts some reporters claimed were from it being
sped up are actually frame rate upconversion from the exact, specific 15 FPS
GIF its original poster claimed as his source.[1]

On the other hand, there was a super-viral video about this on Twitter that
was really obviously doctored. It "proved" that parts had been sped up by
overlaying the video on a 7-frame-behind version of the original. Unless you
paused the video, this desync was only noticeable in faster-moving sections,
creating the illusion that those sections were sped up. Not one news reporter
spotted this. People like Captain Disillusion happily spread it as fact.

If we based our decisions about what's doctored and should be blocked from
spreading and what isn't on the media reporting, it'd mean treating truth as
lies and lies as truth.

[1] Seriously. You can replicate the incredibly simple frame rate conversion
(just blending the frames on either side) in a few lines of Python and get
every single one of those artifacts out:
[https://www.makomk.com/2018/11/16/recreating-that-white-
hous...](https://www.makomk.com/2018/11/16/recreating-that-white-house-video-
in-14-lines-of-python/) From what I can tell this is the default setting in
Sony Vegas, which is what the guy used.
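The linked post has the author's own 14-line version; without reproducing it, a minimal frame-blending upconverter of the kind described might look like the following. This is an illustrative NumPy sketch, not the original code: in-between frames are linear blends of their two neighbours, which is exactly the resampling that produces ghosting artifacts.

```python
import numpy as np

def upconvert(frames, src_fps, dst_fps):
    """Naive frame-rate upconversion by blending adjacent source frames.

    Each output frame at time t is a weighted average of the two source
    frames straddling t; the blending is what creates the "ghost" frames.
    """
    frames = [np.asarray(f, dtype=float) for f in frames]
    # integer arithmetic avoids float rounding of the output frame count
    n_out = (len(frames) - 1) * dst_fps // src_fps + 1
    out = []
    for i in range(n_out):
        t = i * src_fps / dst_fps        # position in source-frame units
        lo = int(t)
        hi = min(lo + 1, len(frames) - 1)
        w = t - lo                       # fractional weight of the later frame
        out.append((1 - w) * frames[lo] + w * frames[hi])
    return out

# Three source frames at 15 fps become five at 30 fps; every second
# output frame is a 50/50 blend of its neighbours.
src = [np.full((2, 2), v) for v in (0.0, 100.0, 200.0)]
dst = upconvert(src, src_fps=15, dst_fps=30)
```

Run on real footage, those blended frames look like motion blur or "speed-up" artifacts, which is the confusion the comment describes.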

~~~
mgoetzke
How he made it appear does not matter though. The reporter still did not
attack anyone.

------
hirundo
A partial distributed solution would be for trusted sites to vouch for a hash
of the video file, and the browser to be configured to flag or filter
untrusted content.

E.g. NYT, the DNC, Pelosi's own site, etc., would flag a hash of a file as
valid. Those sites would be aggregated into a web of trust. Users subscribe to
nodes of that web, e.g. I trust reporter A, celebrity B, newspaper C, advocacy
group D ... and their trust webs out to a depth of X.

Is there anything like this in the works? Kind of a DNS service for
propagating networks of trust ratings?

~~~
ng12
Corporate censorship built in to my browser? I'll pass.

~~~
TimJRobinson
You'd decide who you trust.

------
gopher2
So, what's the normative suggestion here? All content on user-generated
websites should be moderated for truth and anything edited, editorialized,
fictional, or not strictly representing reality should be removed?

Or is the idea that only when there is enough public pressure to take down a
particularly popular and deceptive video ... then it should be removed?

It seems sort of ridiculous that we should engage in across-the-board
censorship of anything mis-representing reality in any way. OTOH removing
things only as they become popular enough to go viral and meet some agreed
upon definition of "fake" enough seems like it would establish a permanent,
weird shouting match with no clear rules.

Or is the idea that we should ban and penalize Facebook specifically? Because
they're Facebook. I don't really get that, but okay. We could do that. I don't
think it would solve the user-generated content problem or the fake news
problem.

We could disallow any website with posts, images, and comments that isn't
filtered through some sort of sacred guardian of truth/editorial board. That
sounds like a pretty locked-down version of the internet.

As some other commenters mentioned, the fact that the President/right-wing
seem into this sort of approach and that this video became popular in the
first place is a separate, sad issue.

I'd love for the NYTimes to spell out what they're advocating FOR as the
solution. If it's just "delete facebook" and read our comments section
instead, well-played I guess.

~~~
PixyMisa
Farhad Manjoo asked that on Twitter. The response was "Hate speech comes
down," which was remarkably unhelpful.

------
bArray
Firstly, if you watch both videos side by side, the effect isn't that great
[1]. I thought she was drunk in the original.

Secondly, do you really want to make Facebook the great filter of political
truth? Is this a role you want to give to a private company with its own
agenda? Of course, for somebody working at The New York Times this kind of
power is a dream scenario, where truth is simply whatever they write.

Criticism of the opinion article itself:

> Facebook decided to keep up a video deliberately and

> maliciously doctored to make it appear as if Speaker Nancy

> Pelosi was drunk or perhaps crazy.

Interesting wording: did Facebook decide to keep the video up, or did it
simply not decide to take it down? An assumption of malice is a dangerous
place to start.

> The social media giant deemed the video a hoax and demoted

> its distribution, but the half-measure clearly didn’t

> work.

This is the heart of their opinion: only wiping all traces of problematic
material from the internet is an acceptable solution; anything else is a
half-measure. What if this video was a piece of art, a political statement, a
meme? Is there anyone we can trust to make the decision without asserting
their own bias?

> “We think it’s important for people to make their own

> informed choice for what to believe,” [..] This is

> ridiculous.

When did a company passing the responsibility of interpretation to the viewers
become controversial? Why is it ridiculous to think of content consumers as
having the ability to decide what is real and what is not?

> Would a broadcast network air this? Never. Would a

> newspaper publish it? Not without serious repercussions.

> Would a marketing campaign like this ever pass muster?

> False advertising.

Newspapers and serious repercussions? You mean a (comparatively) small
monetary fine at most? The biggest consequence of bad journalism I've ever
seen was the phone hacking scandal for News of the World, where Rupert Murdoch
was basically told to reorganize his assets and make the name "News of the
World" disappear [2].

[1]
[https://www.youtube.com/watch?v=sDOo5nDJwgA](https://www.youtube.com/watch?v=sDOo5nDJwgA)

[2]
[https://en.wikipedia.org/wiki/News_International_phone_hacki...](https://en.wikipedia.org/wiki/News_International_phone_hacking_scandal)

------
Macross8299
>so are New York Times articles, because classy journalism looks good on the
platform

Interesting that a journalist thinks that Facebook is optimized for what
"looks good" rather than what drives eyeballs. Facebook is just giving the
people what they want. (To say nothing of how self-congratulatory that
sentence is)

~~~
Despegar
There's nothing self-congratulatory about that sentence at all. Facebook's
algorithm used to promote a certain kind of clickbait until they changed it in
favor of real news (which killed a bunch of media startups that were
optimizing for Newsfeed traffic). Facebook has specifically courted media
organizations with Instant Articles or video for Facebook Watch.

------
warp_factor
That article is pushing the idea that social networks should be better at
policing speech on their platforms.

As said in other comments pushing for more self policing is the worst thing
that could happen for democracy and free-speech. The last thing I want is for
a small clique of silicon valley tech execs to decide what I should or should
not see on my feed.

I see two solutions on this:

- Censorship coming from an elected government body

- No censorship, and we let people decide what is news and what is fake news.

But having Zuck and his team manipulate newsfeed to push a political or social
agenda is a terrible idea. This article is a shame.

~~~
bfdm
How are people supposed to make that decision if they are not armed with the
information needed? If they only see the fake version, how will they know to
even question it?

Truth matters. News distributors are expected to fact-check and should be held
accountable when they spread falsehoods.

Removing provably false manipulated media is not itself a manipulation. It is
a correction.

~~~
ng12
Agreed, except Facebook is not a news distributor.

~~~
anonymousab
They are a de-facto distributor, even if it wasn't their intention to be one.

------
StanislavPetrov
Thousands and thousands of videos and pictures are edited and altered to mock
famous people and politicians every day; why is this suddenly an issue? It's
astounding to me that anyone would support having overseers like Facebook (and
their government partners at the Atlantic Council and elsewhere) deciding what
is okay for people to post and view.

------
neilv
I think part of the problem here is that Facebook doesn't want to be an
impartial common carrier, so they have to take responsibility for "content".

Then you have one of the most influential venues (which has taken over much of
the use cases of the original open Web and Internet) having to answer to
politicians about what speech it should censor.

~~~
raxxorrax
You forget that the most influential venues all have business ties to Facebook
for visibility reasons. That is a huge problem, since Facebook might indeed be
interested in keeping these venues influential and in suppressing people's
ability to meet on other platforms. Hard regulation would fortify its dominant
position.

I remember, some years ago, papers writing articles about how they would keep
being critical of Facebook despite their cooperation. That was sad to read.

------
ddffre
I don't see any problem with posting some humor-related content.

------
merpnderp
“No other media could get away with spreading anything like this...”

Meanwhile several major media sites are claiming the video Trump shared was
this same doctored video, when it was merely a montage of regular video clips.

We can’t even begin to have a conversation about censoring private companies
without it immediately being used as a political weapon.

------
DyslexicAtheist
I wonder if it would have taken FB that long to demote the video if it had
depicted Sandberg or Zuckerberg slinging racial slurs.

~~~
tomek_zemla
That is an interesting question and could be empirically determined with an
experiment.

~~~
Gibbon1
That's my suggestion.

------
hsnewman
Propaganda is not new, especially to fascist-run governments. I, for one, am
very concerned about our republic.

~~~
nailer
Please don't use Hacker News for political or ideological battle. It destroys
intellectual curiosity, which the site exists for.

------
m0zg
Established "old media" news sources routinely spread misinformation as well,
either by deliberately making things up or by omitting facts that don't fit
the narrative of their owners. News outlets haven't been about "news" for
several decades now; it's all about, to quote Chomsky, "manufacturing
consent". It's just that now anyone with a webcam and a video editor can do
the same thing on social media. Some, in fact, get more viewers than
"traditional" media, too.

------
Tsubasachan
America right now is the best show on television. Emmys for Nancy and Donald.

------
bayesian_horse
We have to kill Facebook. They simply know too much! [irony/joke]

------
Barrin92
The most baffling quote from the article, citing Zuckerberg

>“I’m Jewish, and there’s a set of people who deny that the Holocaust
happened. I find that deeply offensive,” Mr. Zuckerberg said. “But at the end
of the day, I don’t believe that our platform should take that down because I
think there are things that different people get wrong. I don’t think that
they’re intentionally getting it wrong.”

Ah yes, the very common good-faith Nazi. You seriously have to wonder whether
Mark Zuckerberg would have such nonchalant views about speech and
misinformation if he didn't have a tail of private security and a fenced-in
mansion.

What's the Muslim store owner in Myanmar going to do when the mob comes
knocking on the door after some lie about him got the local village riled up?
The degree to which these people live in an alternate reality isn't even funny
any more. Nothing has consequences for Zuckerberg et al and it has distorted
their worldview.

~~~
gfodor
I'm assuming people railing on Zuck here don't like Zuck, so it's odd to me
they would prefer he become the world's arbiter of permitted speech.

~~~
Barrin92
I don't prefer he become the arbiter of speech. I'd prefer the government
would regulate his business and hold him accountable for the damage that lies
and misinformation cause on his platform. Given that he recently called for
more regulation, surely he'll agree.

~~~
chias
> I don't prefer he become the arbiter of speech. I'd prefer the government
> would [...] hold him accountable for the damage that lies and misinformation
> cause on his platform

If I'm understanding you correctly, you believe that Zuckerberg should be held
liable for untruthful content hosted on his platform, but you also believe
that Zuckerberg should not be allowed to choose what content is hosted on his
platform.

How do you harmonize those two fundamentally contradictory viewpoints? Or do
you feel that the only "correct" way forward is to excise Facebook in its
entirety?

~~~
Barrin92
I think the platform should be held liable if it causes damage to individuals
by spreading false information about them. For example, we could start by
handing out fines to platforms that do not remove damaging or false
information in a timely manner.

I don't think the platform needs to go entirely. I just think it needs an
incentive, a financial one for starters, to adhere to standards set by
legislators.

~~~
chias
In your ideal case, let's say that Facebook identifies some content that it
believes is spreading false information about somebody. Should Facebook be
_allowed_ to remove it?

If no: Facebook is being fined for not taking an action that it is not allowed
to take, which I can't imagine you are advocating for.

If yes: Facebook is now the arbiter of what information gets to be seen (on
Facebook), which you didn't want.

A third option, of course, would be to let the government (all governments?)
decide when something should be removed from Facebook, but frankly I think
this is the worst option of the three.

I suspect that you meant liability in cases when something is "clearly"
untrue, so my comments likely come off as pedantry. In cases like this video
it's very easy to make that determination. But when it isn't so clear cut,
there will _have_ to be judgement calls, and if we're talking actual liability
then someone has to be empowered make them. Who gets to make these judgement
calls? Either Facebook, the US Government, any government including the ones
that you don't like, or maybe some supposedly non-biased third party. None of
these options sound at all good to me. I don't know what the answer is, I just
want to point out that whatever it is, if it even exists, it isn't as simple
as that.

~~~
gfodor
We all know that when people talk about coercion of removal of online content,
they're talking about content _they_ deem worthy of removal.

You can avoid cognitive dissonance if you assume you have the "right"
perspective on how to bucket content into two groups, acceptable and
unacceptable, and that the bucketing made by regulators will be identical to
yours, since it's clearly the only objectively correct one. Of course, this is
so far from reality if such a thing were to be implemented that it's not worth
taking seriously.

------
IlegCowcat
Facebook invades privacy at minimum. It threatens our democracy.

------
frequentnapper
Why can't Nancy Pelosi sue facebook for $100 billion for allowing a doctored
video about her to spread? They can't use the Napster defense of "we're only a
platform."

~~~
azangru
I am confused. Facebook is a person-to-person(s) interaction. If someone wants
to show their followers a doctored video, why should Facebook prevent it?

(The other day, someone on Reddit reminded me that in Russia it’s illegal to
repost the Putin gay clown meme [1]. And of course there is the famous Chinese
ban on Winnie-the-Pooh images. Is this in any way worse than removing the
doctored video of Nancy Pelosi would have been?)

[1]
[https://www.washingtonpost.com/news/worldviews/wp/2017/04/05...](https://www.washingtonpost.com/news/worldviews/wp/2017/04/05/its-
now-illegal-in-russia-to-share-an-image-of-putin-as-a-gay-
clown/?utm_term=.e0c202192e35)

~~~
frequentnapper
There's a difference between satire and outright slander when viewers can't
tell that something is doctored.

~~~
Macross8299
Whether an average Facebook user can ascertain the authenticity of something
is a very poor barometer of its truth or legality.

The average FB user probably couldn't tell you if the Panama Papers were
authentic or not, so should those be banned from dissemination as "fake news"?
Plenty of people would like to make those go away as slander, I'm sure.

~~~
sockpuppet999
Please. Don't fall into the trap of underestimating others' intelligence. Even
that typical Facebook user from another state has valid reasons, and their
opinions or values are no better than mine (or yours). I'm not trying to argue
one bit, okay? But how exactly do you know the Panama Papers are legitimate?
Do you have an inside source, or did you hear about the issue the same way I
did? I agree with you that it fits perfectly into my own world view, and I
caution you and myself not to take stuff like that as a golden source of true
info. No matter how much I'd like to think it's all true, in this day and age
one must be more careful about accepting sources of info as gospel truth,
because there are folks out there who spend their lives, fortunes, and
passions just to mislead or misinform allegedly informed people like me (and
you).

------
tarcyanm
The correct way to counter disinformation is with information. This does
assume a sufficiently aware populace able to process its own confirmation bias
(which is actually a learnable skill). Sometimes it takes incrementally more
information and that's okay.

Widespread availability of high quality media capture and technologies such as
facial manipulation will inevitably lead to a future with many, many
unverifiable snippets that go viral. This video is unique in that verification
is possible, but the future will likely hold far more unverifiable content
than otherwise. It is healthier to expect that humans will gain the ability to
weigh disinformation against further information than to expect that benign
dictators will censor their way to perfect knowledge.

