
Why do you host offensive content? - benburwell
https://www.nearlyfreespeech.net/about/faq#TheLongGame
======
falcolas
Honest, but leading, question in return: Offensive to whom? 90% of the
population? Wouldn't that block out the Church of Satan's distributed
materials (that is, coloring books)?

To 80% of the population? Wouldn't that be the Book of Mormon?

To 60%? The Quran?

I think this part of their answer is a great response: "Of course, the
simplest reason is that it's not up to us to decide what the rest of the world
should or shouldn't see. Bad news, it's not up to you either. Worse news, it's
still true even when we agree. Which is probably most of the time."

~~~
stcredzero
Check out their MFFAM policy. Search for "What is the 'MFFAM' policy?"

~~~
smsm42
The policy is a bit of a misnomer though - as I understand it, it's them
funding the fight - by donating the money that would otherwise be their profit
- not the "morons" funding it. But I think that's a very acceptable way to
handle the problem.

------
waynecochran
If it is offensive because it stems from "bad ideas" then it is imperative
that it be posted in the "public market of ideas" so it can be exposed and
refuted.

If it is not posted then folks will not know how to refute it when it crops up
again -- and it will.

Free Speech for bad ideas is as important as free speech for good ideas.

Don't be offended. That's your choice.

P.S. This doesn't mean that everything offensive must be posted -- there is
stuff that should be illegal to post because of the harm it can cause.

Edit: downvote within 15 seconds of posting ... you are speedy in your
thoughtlessness.

~~~
UncleMeat
We've known how to refute genocidal racism for ages. The ideas aren't new and
the reasons why they are garbage aren't new. Yet these ideas still exist and
still spread. Why has the market of ideas not destroyed these ideas? If the
market cannot destroy even the most terrible ideas that humans have ever
devised, what does that really say about the market?

~~~
dlivingston
And who is to be the censor? Would you be so complacent if I were the censor,
dictating what you can and cannot read, see, and hear?

~~~
root_axis
The censor is the owner of the platform broadcasting the content. If I didn't
like your censorship practices then I wouldn't use your platform.

~~~
dlivingston
Just to be clear, are you advocating for censorship on a platform (i.e.
Facebook) level, or a governmental level as well?

~~~
root_axis
I am not "advocating for censorship", I am "advocating" for the freedom to
determine what content is hosted on a computer you own.

A computer owned by the government is a different story since government
property is paid for by the tax-payer, so it should not be able to act
unilaterally in matters of removing content.

~~~
waterhouse
And, out of curiosity, what would you say if someone made an argument that
company X, while not owned by the government, is effectively a monopoly _due
in significant part to government intervention_? Mechanisms for this might be:
patent grants, regulations that create significant barriers to entry for new
competitors, lucrative contracts with a federal department, inconsistent
enforcement of existing laws while the incumbent has protective connections
with the enforcers, tariffs...

~~~
root_axis
Your hypothetical is too vague for me to answer. If you have a specific
example I'd be happy to elucidate further.

I'll add that I don't regard patents, regulation, or "inconsistent enforcement
of existing laws" as a reason why a company or individual should forfeit the
freedom to determine what they host on a computer they own.

------
gowld
The difference between something like NearlyFreeSpeech and something like
YouTube, which is often lost in the noise, is that NearlyFreeSpeech is a
hosting provider, while YouTube also has an active editorial system and
recommendation engine that _promotes_ "offensive" content. Just as NFS is
justified in hosting content and requiring uploaders to back it up with
their real identity, so that those uploaders can be judged, people are
justified in judging YouTube management for the content that YouTube
management uploads (the Watch Next side-bar, and the comment streams it
attaches to videos).

~~~
stcredzero
_people are justified in judging YouTube management for the content that
YouTube management uploads_

You mean "promotes." People are free to judge. However, by granting
discovery/virality selectively, rather than going by pure interest and
numbers, YouTube is exercising editorial judgement. This makes them into a
publisher, not a platform. People are free to urge YouTube to become a
publisher. YouTube is free to comply or not and take the rewards and
consequences of its actions.

~~~
root_axis
> _This makes them into a publisher, not a platform._

Says who? "Publisher" and "Platform" are not mutually exclusive identities.

------
wheelerwj
Nearlyfreespeech.net is a great DNS and hosting solution for simple sites and
applications. They got a nice bump when GoDaddy got mixed up in SOPA back in
2012.

I've used them before and they offer a great service for a good price and I
support their general philosophy in regards to privacy and free speech.

~~~
Chirael
I went to bookmark this site as a hosting company supporting a ton of
languages, only to see that I’d already bookmarked it back in 2006. So I guess
they’ve been around for a while :)

------
root_axis
> _censorship is always bad_

This isn't true. A few examples.

If I post your naked photos online and they are censored, that isn't bad.

If I post your address online next to a photo of your house and it is
censored, that isn't bad.

If I post the source code of your personal project online and it is censored,
that isn't bad.

If I post the contents of your diary online and it is censored, that isn't
bad.

If I post the contents of a heated argument between you and your spouse online
and it is censored, that isn't bad.

If I post a photoshopped picture of your kid online and it is censored, that
isn't bad.

Not everything deserves to see the light of day and actually, we _do_ get to
make that decision. This idea that "free speech" means everyone has to agree
to let everything appear on the internet is false. "Free speech" also means "I
have the freedom not to support someone else's speech".

~~~
stcredzero
_This isn't true. A few examples._

If you have to resort to extreme examples, then it shows the weakness of your
position. Those examples are mostly illegal. It's not "censorship" in the
context of Free Speech if one is counteracting illegal activity. The serious
societal problems come in when there is censorship on an ideological basis.

 _This idea that "free speech" means everyone has to agree to let everything
appear on the internet is false._

If Free Speech applies to the Internet, then it means precisely that everyone
has to agree to let everything appear on the Internet. In 2019, saying that
people can have Free Speech, just not on the Internet, is like saying people
can have Free Speech, just not with mechanized printing. In 2019, publishing
has to include the Internet, and suppressing publishing on an ideological
basis is suppressing the principle of Free Speech.

~~~
root_axis
> _If you have to resort to extreme examples, then it shows the weakness of
> your position_

What is extreme about them? An example of an extreme would be "child porn" or
"your credit card number" or "the password to your email". The examples I
listed are pretty mundane and actually quite common. Either way, you haven't
actually explained why any of the examples I listed would be "bad".

> _Those examples are mostly illegal_

None of those examples are illegal except the naked photos, and not even that
in all states (illegal in most, and even then not "naked photos" in and of
themselves, but only naked photos in the context of "revenge porn"). But even
if they were, so what? If it's illegal, does that mean it's not censorship to
remove it?

> _If Free Speech applies to the Internet, then it means precisely that
> everyone has to agree to let everything appear on the Internet_

This is obviously wrong. Based on that logic you should never be able to
delete a comment from your personal blog because you're censoring the critics.

~~~
stcredzero
_What is extreme about them?_

Many of them are illegal.

 _An example of an extreme would be "child porn" or "your credit card number"
or "the password to your email"._

Also illegal.

 _None of those examples are illegal_

Most of those examples would constitute evidence of illegal activity.

 _If it's illegal does that mean it's not censorship to remove it?_

If it's illegal, then it's no longer protected by the principle of Free
Speech. The issue isn't censorship. It's the principle of Free Speech.

 _Based on that logic you should never be able to delete a comment from your
personal blog because you're censoring the critics._

No. Ethically, that would be wrong. So too would not taking down a dox or an
illegally obtained naked photo. Neither taking down a dox nor taking down an
illegally obtained naked photo would be an example of the suppression of Free
Speech.

Basically, you're engaging in the dishonest conceit of equating Free Speech
and censorship. They are not the same thing. At issue is the subset of
censorship which abrogates Free Speech.

~~~
root_axis
> _Many of them are illegal._

No they aren't. Point me to a law stating that any of those are illegal
besides the "revenge porn" example that I outlined.

> _If it's illegal, then it's no longer protected by the principle of Free
> Speech_

So are you saying that anything the government deems illegal doesn't fall
under the "principle of Free Speech"?

> _No. Ethically, that would be wrong_

Sorry, I don't understand what you mean here. Are you saying it would be
ethically wrong to delete comments from your personal blog? I'm not being
snarky, just not sure if you're referring to something else when you say
"ethically wrong".

~~~
stcredzero
_> Many of them are illegal._

 _No they aren't. Point me to a law stating that any of those are illegal
besides the "revenge porn" example that I outlined._

That's dishonest quoting. The last quote should include my assertion that many
of those would be evidence of illegal activity.

 _If I post the source code of your personal project online_

If that was done without permission, if that information was obtained
illegally, then that would be illegal. This would indeed be the case with my
personal project under its current copyright, licensing terms, and repository
disposition. That such a circumstance could be illegal is part of the intended
shock value of your extreme example. If you didn't intend such an illegal
scenario, it's still reprehensible and extreme. (see below)

 _If I post the contents of your diary online_

Again, if that was done without permission, if that information was obtained
illegally, then that would be illegal. Again, if you didn't intend such an
illegal scenario, it's still reprehensible and extreme. (see below)

 _If I post the contents of a heated argument between you and your spouse
online_

If the contents were recorded from a private conversation in a home here in
California, then that information was obtained illegally, and posting it would
be illegal. Again, if you didn't intend such an illegal scenario, it's still
reprehensible and extreme. (see below)

The following two aren't illegal, but they're just morally reprehensible in a
way which even transcends ideology. As such, the following examples are
certainly "extreme."

Doxxing: _If I post your address online next to a photo of your house_

 _If I post a photoshopped picture of your kid online_

But in any case, your position isn't defensible, because you're dishonestly or
mistakenly conflating censorship and the principle of Free Speech. They are
not equivalent. Again, the issue is the subset of censorship which interferes
with Free Speech.

To be fair, there could be scenarios crafted for all of your examples of
concerning activity which would make them Free Speech. If the information had
a purpose to further the transparency of organizations, public figures, or
shed light on legal matters, those would be covered by Free Speech. On the
other hand, if the purpose is purely to hurt or humiliate someone for views,
then this is reprehensible, and it doesn't fit the purpose of Free Speech.

~~~
root_axis
> _That's dishonest quoting. The last quote should include my assertion that
> many of those would be evidence of illegal activity._

It's not dishonest, it's literally your exact quote, copied and pasted in
whole, verbatim. Your "evidence of illegal activity" comment is written later
in the post, and I didn't respond to it because it's completely irrelevant.
You stated those things are illegal. I explained that they're not and asked
you to point to some evidence that they are. The fact that you are now saying
they're "evidence of illegal activity" is just you moving the goalposts.

 _Anything_ can be considered "evidence of illegal activity" _IF_ it
implicates one in a crime. Using your reasoning, a bloody knife is illegal
because it's "evidence of illegal activity". Well... no, it could just as
easily be evidence that I cut myself making dinner, so there is no reason for
you to bring that up except to shoehorn my examples into the category of
illegal activity even though they aren't actually illegal.

> _if that was done without permission... if that information was obtained
> illegally... If the contents were recorded from a private conversation in a
> home here in California..._

So only IF you qualify everything I wrote with examples I didn't use which are
_actually_ illegal. You've got some balls to lecture others on dishonest
argumentation when you can't even honestly tackle the argument as I wrote it.
Unless you decide to post some links to laws showing that those examples are
illegal, I am just going to move on from this part of the discussion because
you're objectively wrong here. They're NOT illegal.

> _The following two aren't illegal, but they're just morally reprehensible
> in a way which even transcends ideology_

So are you suggesting that things you deem as morally reprehensible should be
exempt from your "principle of Free Speech?"

You also did not answer my question about whether or not it is "ethically
wrong" to delete comments from your personal blog.

~~~
stcredzero
_Using your reasoning, a bloody knife is illegal because it's "evidence of
illegal activity"._

 _...You've got some balls to lecture others on dishonest argumentation when
you can't even honestly tackle the argument as I wrote it._

One can cite or represent a bloody knife for shock value. In such cases, the
implication is often clear. I suspect you're misleading with the shocking
implication, thereby having it both ways.

Perhaps I misread your intention. However, your scenarios don't really make
sense if one applies the "not a big deal" interpretations. Sure, there are
situations where it's not a big deal to post a diary entry. In that case, why
would anyone care and why would there be any censorship which would be
considered "good?" It doesn't make sense.

 _So are you suggesting that things you deem as morally reprehensible should
be exempt from your "principle of Free Speech?"_

There is indeed a problem with speech devoid of principle, meant only to hurt
someone. This is understood by the law. The purpose of Free Speech is to let
people express grievances or objections with regard to principles. Morally
reprehensible speech should be allowed, but it's not the purpose of Free
Speech. Some subset of morally reprehensible speech would even be illegal, and
therefore it wouldn't be protected as Free Speech.

Again, it's you who brought up the nebulous extreme examples in the first
place, with the purpose of having it both ways to justify censorship.

~~~
root_axis
> _One can cite or represent a bloody knife for shock value. In such cases,
> the implication is often clear. I suspect you're misleading with the
> shocking implication, thereby having it both ways._

I am not making any implications. My point here is very clear: your statement
that the scenarios I described are illegal is false. It's as simple as that.
Either way, it doesn't matter to the point I'm making.

> _your scenarios don't really make sense if one applies the "not a big deal"
> interpretations_

Whether you regard the scenarios as "extreme" is a subjective characterization
that has no impact on the argument. I am arguing that censorship isn't
inherently bad or wrong. I don't agree that the examples are extreme or
illegal, but even if they are, they are still examples of content that could
inadvertently end up on the internet; the point is the same no matter how it
got there.

Even if someone broke into the house of a politician, assaulted them at
gunpoint, and then stole sensitive political documents before posting them on
the internet, the argument is the same. The voluntary censorship of those
materials by platform owners would not necessarily be bad or wrong. In that
extreme and illegal example, platforms might have a legal obligation to censor
the materials as well, but that doesn't change whether or not doing so is bad
or wrong.

> _There is indeed a problem with speech devoid of principle, meant only to
> hurt someone. This is understood by the law. The purpose of Free Speech is
> to let people express grievances or objections with regard to principles.
> Morally reprehensible speech should be allowed, but it's not the purpose of
> Free Speech. Some subset of morally reprehensible speech would even be
> illegal, and therefore it wouldn't be protected as Free Speech._

Descriptions like "devoid of principle and meant only to hurt someone" are
subjective determinations of the sort that sit at the heart of every free
speech debate. As you already stated, in the context of "what is understood by
the law" the point is moot, since "illegal speech" is not "free speech" by
definition, unless you're arguing that the law is wrong or misapplied. Either
way, speech that is "devoid of principle and meant only to hurt someone" is
also legally protected speech in most cases.

> _Again, it's you who brought up the nebulous extreme examples in the first
> place, with the purpose of having it both ways to justify censorship._

I did not "justify censorship", I refuted the statement "censorship is always
bad". To characterize that as justifying censorship is dishonest.

------
lliamander
I tend to look at it from the perspective of the listener. Do I, as a
individual adult, have the freedom and responsibility to make up my own mind
about other people's ideas or not?

I believe I do, and so I think there should be a platform for free speech. I
believe this even though I largely do not wish to consume much of what people
might term offensive.

I appreciate platforms which curate and moderate content as a form of customer
service. What I don't appreciate is entities (governments or corporations)
taking a moralistic stance as if it is their duty to stamp out bad ideas from
existence.

------
ecoled_ame
The user takes total responsibility for the address he or she types in the
navigation bar. They navigate there from an infinite distance.

------
jchw
My favorite bit here is not the content policy, it’s the MFFAM bit, which is
clever and amusing.

[https://www.nearlyfreespeech.net/about/faq#BecauseFuckNazisT...](https://www.nearlyfreespeech.net/about/faq#BecauseFuckNazisThatsWhy)

~~~
stcredzero
The SPLC has done good work in the past, but they've published outright
falsehoods recently. They've even had to retract, apologize, and pay a legal
settlement. Much of their activity of late seems to be about political
expedience and tribe, less about principle.

------
duxup
Do they have a policy where they deal with threats of violence / organizing
something like that?

I couldn't find any, but I'm also not familiar with the site.

There's "offensive" as in things I don't like, or even hateful statements ...
but to me threats of violence etc. fall into another area.

~~~
dmitrygr
At least in the United States, threats of violence are also protected speech.
"Fighting words" are not, but that is generally taken to mean words intending
to cause immediate direct and specific harm to a specific person. A generic
sort of "I wish $person was dead" is protected.

This sort of thing came up a long time ago:

[https://en.wikipedia.org/wiki/National_Socialist_Party_of_Am...](https://en.wikipedia.org/wiki/National_Socialist_Party_of_America_v._Village_of_Skokie)

~~~
duxup
I'm thinking of the specific type.

------
parliament32
Reddit could take a page from their book.

------
jasonvorhe
If you allow everything that isn't illegal, you're actively supporting Nazis,
anti-Semitism, racism, and anti-LGBTQI content, which makes you an accessory.

It's censorship if a government suppresses information; if a company decides
not to do business with you, it's contract law.

This isn't that difficult to grasp.

~~~
AQuantized
I think I would disagree with your definition of censorship. If almost all
entities in a given sphere, be it web hosting or media, decided they would
absolutely not publish or host anything related to a particular topic or form
of content, it seems accurate to term that censorship. The fact that it isn't
done by the government doesn't mean that particular content isn't functionally
suppressed.

------
gwbas1c
Over the last few years I've been questioning what it means to have freedom of
speech. This quote in particular strikes me:

> Finally, censorship is always bad, for a variety of well understood reasons
> that we don't need to repeat here. But in the case of some types of content,
> it has special dangers. When you censor a web site based on the extreme or
> dangerous views of its creator(s), you haven't stopped those people from
> thinking that way.

Why? The problem is that I've seen someone who was very close to me repost
propaganda on Facebook that looks like it's following the Nazi propaganda
playbook. Stuff against immigrants, against religious minorities, etc. Just
take some classic Nazi propaganda, swap out "Jew", and that's what this person
reposts.

(Or used to, as this person recently complained that Facebook is blocking
their posts.)

Anyway, I don't think that this person really thinks this way; instead I think
this person's thinking is manipulated to push a political agenda.

------
ketzo
I see this argument so, so frequently -- that repugnant views _need_ to be
given a platform, that heinous and disgusting content _must_ be allowed a
space, so that everyone can see it and fight against it!

Since when the fuck is that how the internet works?

If Stormfront hosts a site on NFS.net, who do you think visits that site?
Bright young progressives valiantly carrying a banner of social justice?

No. Fucking _neo-Nazis_ visit the Stormfront website, because, and this is
important, _it's a platform for fucking neo-Nazis_.

Christchurch. Charlottesville. Numerous terrorists have indicated very clearly
that they were radicalized online. Why the fuck is it somehow your
responsibility to provide these people a platform to spread their poison?

To end my rant, here's that ridiculous quote that always gets tossed around in
these discussions:

> "Sunlight is said to be the best of disinfectants; electric light the most
> efficient policeman."

Actually, as it happens, the best disinfectant is a harsh chemical, and the
most efficient policeman is a _fucking policeman._ The internet is not a place
of light and exposure. It is a place where disgusting ideologies can hide,
quietly attract followers, and conspire to _murder people._

Consider how powerful that sunlight was the next time a right-wing terrorist
screams about blood and soil while brandishing an AR.

I'm sorry the tone of this is so angry. NFS is a great service -- I just can't
stand this pitiful justification for aiding radicalization and eventual
violence. There _is_ a line you can draw. It is up to you to draw it.

~~~
dlivingston
Where do you, personally, define the line though? And how do you ensure that
line won't be changed to fight against you once your enemies are in power?

~~~
ketzo
I could give you my answer, but it doesn't really matter (and anyway, I think
it's kinda clear from my earlier comment) because I don't run a platform that
could host any offensive content. NFS.net does -- and so I believe it's their
responsibility to draw a line. It's a very, very hard problem, and there
simply isn't one right answer, but that doesn't mean we shouldn't give one at
all.

You can't ensure that! At all! Which sucks! But that doesn't mean we should
wring our hands and worry about "what if?" -- in fact, it only makes it MORE
important that we resoundingly reject ideologies that would seek to abuse a
line for nefarious, censorship-y purposes.

Maybe we make that part of our line...

------
Nursie
> "Finally, censorship is always bad, for a variety of well understood reasons
> that we don't need to repeat here. But in the case of some types of content,
> it has special dangers. When you censor a web site based on the extreme or
> dangerous views of its creator(s), you haven't stopped those people from
> thinking that way. You haven't made them go away. You certainly haven't
> stopped the people who hold those views from doing whatever else they do
> when they're not posting on the Internet. What you've actually done is given
> yourself a false sense of accomplishment by closing your eyes, clapping your
> hands over your ears, and yelling "Lalala! I can't hear you!" at the top of
> your voice. Pretending a problem doesn't exist is not only not a solution,
> it makes real solutions harder to reach."

I no longer believe this, when cesspits of alt-right, racist assholes use such
grandiose ideals to spread their hatred, which then bubbles out into the real
world.

The idea that good ideas will win, and that common sense and rationality will
carry the day, is not really supported by what we see around the net. Instead
the greater internet fuckwad theorem holds more true, and the spread of vile,
violent ideologies is enabled.

Freedom of speech is a protection from government, but I think those providing
speech platforms, such as hosting companies, should probably take more
responsibility for what they propagate.

~~~
Frondo
I used to be much more of a free-speech absolutist. It wasn't until I started
reading about how genocides start that this began to change for me.

There is a pretty direct line drawn between speech and violence against out-
groups; anyone who says otherwise would do well to read about the Rwandan
genocide and the 200 days of public radio broadcasting demonizing the Tutsis
prior to the genocide itself.

After all, if there wasn't power in speech, none of this would matter; there
would be no restrictions on speech anywhere if it didn't threaten someone.

Even among free-speech absolutists, there's often agreement that direct
incitements to violence should be off-limits, and why? Because speech moves
people to act.

I haven't often seen the position that incitements to violence should also be
protected speech, though I'm sure those people are there -- the question to
them for me would be, what are you trying to advance or protect against with
that position?

The question I'd also ask is: if you want to say that speech such as calls to
genocide should also be protected, how is that advancing society, especially
for the targets of that? The marketplace of ideas doesn't seem to do a good
job protecting them, so...what's the solution there?

~~~
rthrowayay
Controlling speech in order to prevent violence does not work when the people
carrying out the violence control the institutions. In your Rwandan example,
the Rwandan government could not be trusted to censor its own propaganda. The
situations where this works are much rarer and smaller, because it is about
preventing violence that doesn't have major backing.

