
Burning Digital Books and the Fight over Online Ideology - theNJR
https://www.nicholasjrobinson.com/blog/culture-2-0/burning-digital-books-and-the-fight-over-online-ideology
======
Causality1
The quest for profit has resulted in companies giving us the ability to curate
everything we see even when we think we're getting a random selection of the
latest news. Every time we go online we remake the world in our own image,
only reflecting ourselves. I'm tired of the constant stream of news stories
designed to make me feel smugly correct or spark righteous fury. I'm tired of
being told I'm right every moment of every day.

~~~
criddell
Maybe eventually news outlets will have enough information about you that they
will figure out how to serve you better.

I mostly use Apple News for reading news and after a few months of marking
stories as good or bad, it's getting better and better at showing me stuff I
actually want to see.

~~~
devoply
Yes, the anti-vax stories are also at the top of my lists. I know the elite would rather
have me seeing stories about how Iran is evil and needs to be liberated, but I
prefer these stories. /s Maybe if everyone just shut up about everything we
could all get some peace. Those who know don't say and those who say don't
know.

~~~
lixtra
If you stop reading then you’ll live in a world where everyone has shut up.

------
IAmEveryone
Does anyone believe we have gained more with this "democratisation" of
publishing than we are losing to conspiracy theories, hoaxes, and populism?

Because until ca. ten to twenty years ago, the "establishment" acted as de
facto gatekeepers for the public discourse. And while that does strike me as a
bad idea in principle, I really couldn't argue that they didn't do a far
better job than today's cacophony.

Sort of like presidential nominees: smoke-filled backrooms certainly are not
perfect venues for decision-making. But they nominated Eisenhower and
Roosevelt, while we can't find anyone better than you-know-who?

~~~
creaghpatr
Which conspiracy theories, hoaxes, and populist ideas would you ban from being
accessed by US civilians if you had the moral or legal power?

~~~
IAmEveryone
I wouldn't ban anything. What we need is a bit of trust in some institutions.
There is a rather obvious difference in quality going from, say, _The New
England Journal of Medicine_ to [http://thespians-against-government-mind-
control.hoax](http://thespians-against-government-mind-control.hoax).

It is impossible for you to independently verify even a tiny fraction of what
you read/hear on a daily basis. Therefore, you need proxies that can be
trusted. These proxies can be evaluated by their performance over time, and
there should be many, and they should compete for your trust.

This is exactly how everybody is already operating, almost all the time: when
your spouse asks to borrow your car, you will probably give it to them.
Because they have a track record of giving it back, and a long-term interest
that outweighs any momentary impulse to just sell it and go on vacation. If a
stranger on the street asks the same, you will be more sceptical.

~~~
spaginal
That’s a utopian view though. Institutions that were trustworthy became
untrustworthy over time because of their own current actions. Who is in charge
of establishing truthiness?

Quis custodiet ipsos custodes? Who watches the watchmen?

The only logical and rational way to disseminate information in a free society
is you put it all out there and let people make up their own damn minds.

I know it doesn’t appeal to the control freak side that is scarily manifesting
itself everywhere lately, but if our schools were more focused on education
instead of politics, logic instead of propaganda, we wouldn’t need to control
narratives, we could just entrust people to make their own decisions.

------
cortesoft
> But this is where it gets tricky; Every major scientific theory ends up being
> wrong over time.

This is just not true. Many scientific theories end up being wrong, but not
all. Many are refined and added to; not all end up simply being wrong.

~~~
Jun8
We should keep posting this excellent rebuttal from Isaac Asimov to stamp out
this misunderstanding:
[https://chem.tufts.edu/answersinscience/relativityofwrong.htm](https://chem.tufts.edu/answersinscience/relativityofwrong.htm)

~~~
theNJR
That was a great read!

~~~
lixtra
It’s kind of confusing if you adjust your article during a discussion without
marking it.

~~~
theNJR
Any suggestions for how to best do that? I like the idea of living documents
(to a degree).

~~~
lixtra
You can place an “edit” in front of your change and even link it to the
comment that caused the change. That has the added benefit of giving credit.

Of course not in front of each typo.

Here: > Every major scientific theory ends up being wrong over time (edit:
sort of).

I agree that your text, already worth reading, gains through interaction.

~~~
theNJR
Changed the "sort of" link to this very thread. Let's get meta!

------
scandox
> sets a scary president for alls voices that are other

I can't read past this kind of error. It's a typo too far for an article that
is asking me to take it seriously.

Proofreading is "proof" that the author has taken care of both their language
and their ideas.

~~~
theNJR
Fixed. Thank you for holding me to a high standard, truly.

~~~
scandox
Oh good on you. I'll carry on from that point then.

~~~
scandox
Ok so while we're here:

- preveriable -> proverbial

- rediculuded -> ridiculed

- bares -> bears

- By unleashing a never-ending stream of fiction, masked a truth -> By
unleashing a never-ending stream of fiction, masked as truth

~~~
gambler
We have AI that supposedly can "compose" articles, but there are no open-
source grammar-aware spellcheckers. I love modern technology.

And yes, this _is_ one of the projects on my 1000-item todo list.

~~~
theNJR
Have you ever tried Grammarly? I see them advertise on Hulu; not sure if it's
any good.

~~~
gambler
They send all your texts to their servers for analysis. I don't find that
acceptable from a privacy standpoint, even if I'm just writing publicly visible
comments.

------
FutureSpec
This reads like someone's college freshman essay. It's just a recap of
existing knowledge, with no concrete call to action.

~~~
4RealFreedom
I enjoyed the article. I do not believe every article needs to include a call
to action. Describing the situation can lead others to come up with ideas for
solutions.

------
IAmEveryone
I don't quite understand how the proposed "diversity of platforms" is supposed
to help?

Ending up with a "liberal facebook" and a separate "conservative facebook"
seems like the worst possible outcome. It will just reinforce tribalism, and
destroy any remaining semblance of a commonly trusted source of truth. There
will be two sets of economic indicators every quarter, and both will be wrong.
At that point, political discussions will just be fan-fic.

The root of the problem seems to be a large fraction of people opposed to
either the existence, or at least the possibility to adjudicate, truth: "Some
people say the earth is round(ish). Others say it's flat. Who am I to judge?"

~~~
AnthonyMouse
> Ending up with a "liberal facebook" and a separate "conservative facebook"
> seems like the worst possible outcome.

Indeed. Which is why two is still the wrong number, and the right number has
three or more _digits_.

------
jlkuester7
It is interesting to consider how federated social networks (e.g.
[https://joinmastodon.org](https://joinmastodon.org)) fit in here. They seem
uniquely positioned to be able to thread the needle of broad inter-
connectivity combined with more localized control.

------
AzzieElbab
Personally, I view the given example of the anti-vax movement as a side effect
of insufficient research into the rise of autism in kids. Muting these people
is immoral and will backfire badly.

------
jerf
I am increasingly of the opinion that there's a case to be made that there
simply is no positive way to jam more than a few dozen thousand people onto a
single site, into a single community, with the hyperconnectivity implied by
modern technology, without a problem. At some point, the community as a whole
is going to have to take some stances on some things, and society can't afford
to have those stances be writ so large.

The government may very well come along someday and break up Facebook, but I
bet it would be broken up into three or four pieces or something. I'd say it
should be _shattered_, though. A new BabyBook shouldn't have more than
50,000-100,000 users in it. And that's still a single community starting out
at what may very well be already the maximum size.

Of all the places, it's the hardest sell here, because this _is_ where
Facebook's ideology comes from and it's hard for a lot of HN denizens to see
much light of day between truth and Facebook's politics, but abstractly,
there's no compelling reason to believe that everything labeled as bad and
wrong by Facebook actually _is_ wrong. The odds that all "conspiracy theories"
are false approach zero, honestly. The availability heuristic will bring the
most obviously false ones to mind like "flat earth" and "moon landing hoax",
but the HN gestalt believes plenty of conspiracy theories like companies
conspiring to hold prices down or blocking research that shows negative side
effects for pharmaceuticals, or honestly I could go on for quite a while here;
there are many other things of a similar level of believability that are
either getting blocked by Facebook now, or where the frontier of Facebook
censorship is about to reach at its current pace, that simply don't flatter
Facebook's political positions and choices.

I think it ought to be OK for Facebook to make those decisions, because I'm
serious about communities having to make calls about what it's going to
accept. There's no way around it, it's as inevitable with community growth as
gravity pulling a large body into a spherical shape. What's wrong with that is
that we have small numbers of managers at Facebook making these decisions for
the entire Internet, inevitably inflaming everything in the process. It should
be a more distributed process.

As for those who hope that there's some way to leverage Facebook's
concentrated power to simply eliminate all the bad ideas, even if we stipulate
that Facebook's arbitration of truth is in fact totally accurate and totally
unbiased, a rather astonishing and frankly unbelievable accomplishment for an
advertising company, you need to give that idea up now. History tends to show
this sort of suppression just energizes those movements. Rather than
deplatforming them by trying to deny them access to the Facebook of today, you
need to deplatform them by destroying the entire platform that has that reach
in the first place. You can't get rid of them, if for no other reason than
there's a baseline of literal mental illness that isn't going away any time
soon. You can't prevent them from speaking out. But you can make it so that
they're in a corner of the Internet, because _everyone's_ in some corner of
the Internet somewhere. We kinda had that in the 200x's and late 1990s. It
mostly worked.

~~~
theNJR
> there simply is no positive way to jam more than a few dozen thousand people
onto a single site

Agreed. The trick is, how do you leverage, or get around, network effects
without destroying the community? Network effects create the problem, but are
built in to the medium itself.

> But you can make it so that they're in a corner of the Internet, because
everyone's in some corner of the Internet somewhere. We kinda had that in the
200x's and late 1990s. It mostly worked.

Youth is always sunny, but those days certainly seemed better. There was a
meta-community of people just wanting to grow the internet, so everyone at
least had that shared goal in common. I get that feeling here on HN for the
most part.

~~~
zrm
> The trick is, how do you leverage, or get around, network effects without
> destroying the community?

Federation. Give the network effect to everybody even though each community is
independently operated.
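A toy sketch of that idea (illustrative names only, not any real protocol; ActivityPub is the actual standard Mastodon uses): each server is operated and moderated independently, but posts are pushed to followers' home servers, so users share one global network effect.

```python
class Server:
    """One independently operated community with its own rules."""

    def __init__(self, name):
        self.name = name
        self.followers = {}  # local author -> [(follower's home server, follower)]
        self.timelines = {}  # local user -> list of (author, text) received

    def follow(self, local_user, author_server, author):
        # A user here subscribes to an author hosted anywhere else.
        author_server.followers.setdefault(author, []).append((self, local_user))

    def post(self, author, text):
        # Deliver the post to every follower's home server, so reach is
        # global even though each server moderates independently.
        for home, user in self.followers.get(author, []):
            home.timelines.setdefault(user, []).append((author, text))

alpha = Server("alpha.example")
beta = Server("beta.example")
beta.follow("bob", alpha, "alice")       # bob@beta follows alice@alpha
alpha.post("alice", "hello, fediverse")  # lands in bob's timeline on beta
```

Each server can refuse to federate with (or filter posts from) servers whose rules it rejects, which is exactly the "localized control" half of the trade-off.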

------
jasode
_> It’s promoting diversity in its truest sense – a broad perspective on
shared challenges. [...] To let three online platforms decide what’s real and
who gets to speak, will almost certainly result in nothing good, or new._

Too many essays about "free speech" focus on stating principles that _feel
good to read and agree with_. However, the more difficult analysis is how to
_implement_ a true free speech platform.

That's the puzzle nobody talks about: how to make an uncensorable free speech
platform that's _financially sustainable_.

If one does Ctrl+F in Robinson's essay to search for _"advertisers"_ or
_"sponsors"_, those terms are not mentioned once. Youtube/Facebook/Twitter are
funded by advertisers. Youtube isn't the arbiter of diverse topics, it's the
advertisers. And indirectly, it's the mainstream viewers that arbitrate what's
viewable because they threaten to boycott the advertisers. (Previous comment
about this.[1])

If big advertisers like Procter & Gamble, Coca-Cola, etc. want to stop paying
for ads because the platforms host controversial topics, how does this utopian
vision of free speech get funded? That never seems to be discussed.

Some alternative funding possibilities:

- pass a law that requires advertisers to pay for unsavory content (e.g.
Nazi, alt-right, or beheading content). However, this type of radical law
seems impossible to pass.

- create a new communications platform that's funded by a government (e.g.
the USA government?) that allows anything except child pornography. This just
shifts the problem: whatever government runs the site will eventually start
censoring particular topics it doesn't like. Also, if the website does not
censor, it will devolve into a cesspool and mainstream audiences won't bother
to log into it. This is an example of "freedom of speech" eating its own tail
because the big audiences voluntarily shun it.

- an uncensorable blockchain publishing platform. I see no realistic projects
that non-techies want to use. Mastodon instances are run (and funded) by
volunteers and, by their nature, won't be the true free speech platform people
are looking for.

Let's discuss the challenge of _realistic implementations_ instead of
repeating the same complaints about popular websites closing off topics. They
are beholden to advertisers and therefore making editorial decisions to
maintain a viable business. Complaining about content filtering decisions
driven by financial self-preservation isn't going to solve the problem.

[1]
[https://news.ycombinator.com/item?id=18372005](https://news.ycombinator.com/item?id=18372005)

~~~
pjc50
Is your uncensorable platform going to publish all of (a) child porn (b) calls
for genocide (c) US state secrets (d) Iranian propaganda? If you _are_ going
to ban the CP, who is going to do that, how will they be paid, and how will
you handle their PTSD?

~~~
AnthonyMouse
> If you _are_ going to ban the CP, who is going to do that, how will they be
> paid, and how will you handle their PTSD?

We have always had a solution to this. The party responsible for identifying
the Really Bad Stuff is the government, the party who goes to jail is the user
who posted it, and the extent of the platform's responsibility should be to
remove the content in response to a court order (which the uploader would have
the right to argue their case against).

This makes removing content expensive -- it requires litigation. This is on
purpose. It then satisfies the concern with CP, because that is serious enough
for the government to expend the resources to get the court order. But it
serves as a bottleneck to casual censorship, and it removes the responsibility
for making legality determinations from the intermediaries who are totally
unqualified to be making them.

~~~
pjc50
You might want to check what "strict liability" means for the relevant
offenses. Certainly in the UK a hoster who didn't immediately choose to take
down CP would not get a court order; they'd be raided and prosecuted.

~~~
AnthonyMouse
Which is exactly the problem. Bad laws make it so we can't have nice things.

~~~
pjc50
Well, they also make it so you can't have some really horrible things, and a
bit less likely that children will have horrible things done to them.

~~~
AnthonyMouse
> Well, they also make it so you can't have some really horrible things

The really horrible things can and do happen regardless of whether you make
platforms liable for user content.

> and a bit less likely that children will have horrible things done to them.

There is no real evidence that is true and several reasons to expect that it
isn't, in much the same ways that SESTA made life more difficult and dangerous
for sex workers.

