In general, I don't think it's a good idea for the government to decide what I can post on social media or not. There are already laws against defamation and "Volksverhetzung" (sorry, I don't know the exact translation; roughly "incitement to hatred", basically it means I can't say "kill all the jews/muslims, they are filthy swines"), so keep the internet free from more regulation on speech. The internet is one of the last places where people have some sort of freedom of opinion/listening to different opinions, and our lawmakers make it more and more unfree.
The minister responsible at some point made the completely idiotic remark that that is the right way to go because Facebook and the like are making a lot of money, so they also should be responsible for these costs. It seems like you can be a minister in Germany and never have heard of this concept called "taxes" that other countries use to finance an independent judicial system.
The actual effects of this law in terms of possibly useful messages lost are minuscule compared to the real culprits of government-enforced speech restrictions: namely copyright (and extras like the DMCA) & libel laws. Those actually matter.
It is a decision made to limit your constitutional freedoms due to legal obligations towards the public. That is very much the definition of a criminal case.
> People don't end up losing jobs, going behind bars, gaining criminal records.
Parking in the wrong spot also doesn't lead to any of that, but it's still a criminal case.
> They simply cannot spread hate-speech.
So, Titanic was spreading hate speech then?
> The actual effects of this law in terms of possibly useful messages lost are minuscule compared to the real culprits of government-enforced speech restrictions: namely copyright (and extras like the DMCA) & libel laws. Those actually matter.
Which is relevant to whether this is a good law how?
It actually isn't. Parking tickets and the like are administrative violations, not criminal ones. Admittedly most people are unaware of this distinction; unfortunately that lack of awareness is exploited by, for example, politicians and pundits who describe all undocumented immigrants as criminals, when in fact many of them are in administrative violation and might be deportable but are not guilty of criminal acts or subject to criminal penalties.
Against who? I have a feeling I'm going to roll my eyes at the answer, though.
Which is, as I pointed out elsewhere, a subcategory of criminal law in the broad sense (as opposed to civil law). Yes, Ordnungswidrigkeiten are not Straftaten, but that is not what this discussion is about. Ordnungswidrigkeitenverfahren have the exact same constitutional limitations as Strafverfahren.
> Against who? I have a feeling I'm going to roll my eyes at the answer, though.
Maybe ask someone who claimed that the NetzDG only limits hate speech?
Just because your Facebook post is deleted does not mean your constitutional right to free speech is limited.
Nobody has articulated why this restriction is harmful, nor why the restricted speech is protected. By contrast, the harms of the messages limited thusly are rather more clear.
Furthermore, you're trying to draw some really broad analogy, and find that facebook is the government and thus that constitutional protections apply to enforcement by facebook. I mean... that's murky at best - and note that even if this were direct government censorship it's not even clear this would be a bad idea or whether the hypothetical governmental corollary would be constitutionally prohibited (which is particularly unclear since nobody knows exactly what that might be).
Facebook is exercising discretion in which content to publish, to whom, and when. They also happen to make a lot of money that way. So regardless of the original author's rights, it's not at all clear that facebook should be immune from prosecution for publishing prohibited content. In a sense, this new law is still letting them off unreasonably lightly.
I think we as societies still need to figure out what the heck the age of data science means for free speech. I don't think being all fundamentalist about that is going to help, in any case.
The very first comment in this thread:
"A couple of comedians and satire magazines have been hit"
Yes, there are limits to the freedom of speech (and technically, there is not even freedom of speech in the German constitution, but rather freedom to express one's opinions). But satire is generally very firmly within the bounds of protected speech. The constitutional court has had to deal with such cases many times, some of them involving the very satirical magazine referred to above, and it has found over and over that satire almost always falls within the bounds where the state may not interfere with the freedom to speak. The constitutional court has explained at length the importance of satire and controversial art for political discourse.
So, no, the harm of the messages limited is not clear at all.
> Furthermore, you're trying to draw some really broad analogy, and find that facebook is the government and thus that constitutional protections apply to enforcement by facebook.
No, nothing like that. Facebook is not bound by the constitution. The constitution only binds the state. Facebook may delete satire as much as they like. But the state may not incentivize Facebook to delete satire, because the state is bound by the constitution, and the state is forbidden from restricting satirical speech (for example), and by extension from making other parties restrict satirical speech. It's not Facebook's actions that go against the constitution, it's the state's actions that incentivize Facebook that go against the constitution.
> and note that even if this were direct government censureship it's not even clear this would be a bad idea or whether the hypothetical governmental corolloray would be constitutionally prohibited (which is particularly unclear since nobody knows exactly what that might be).
Well, except that it is. The constitutional court has dealt with that. Restricting satirical speech is an almost total no-go for the German state, the constitutional court will not have that.
> Facebook is exercising discretion in which content to publish, to whom, and when.
That's indeed a good point, though on the other hand, they probably will say that they try to act as agents finding useful information for their respective users rather than as publishers selecting what they consider interesting. That's also a lie, of course, but still with some truth to it. Really, what they are doing doesn't fit directly into any traditional role.
> So regardless of the original authors rights;
But you cannot disregard the constitutional rights of the authors because you want to achieve some other goal, that's the point of constitutional rights.
> I think we as societies still need to figure out what the heck the age of data science means for free speech.
Well, yeah, I guess society still needs to figure it out, but the answer really isn't that difficult:
Communication needs to be freed of surveillance/advertisement needs to be separated from communication.
Probably not really possible, but this fusion of communication technology with advertisement/marketing/manipulation technology is a huge problem and ultimately at the heart of all of this. The need to get people to pay attention to their advertisers corrupts all other actions of Facebook, Twitter, and the whole bunch.
In effect, the state is saying: we'll exempt you from the normal responsibilities you have when you say something, as long as you're just forwarding a message. The change is now that there's a caveat: you must now (sometimes) make reasonable efforts to avoid forwarding illegal content.
Frankly: as long as facebook (e.g.) is more than a dumb pipe this isn't merely a question of getting the laws right; it's intrinsic. You can't exercise editorial control (which they do, even if it's sometimes algorithmic) without being responsible for what you decide to publish.
So we get to choose: either platforms need to really be dumb pipes (i.e. facebook's entire model is essentially completely illegal), or they need to take responsibility for what they say, even if it's forwarded content.
I think we'd have much more constructive speech online if platforms were forced to make such a choice: to the extent that they filter, they're responsible for what they let through.
I am not. I am talking about incentives.
> In effect, the state is saying: we'll exempt you from the normal responsibilities you have when you say something, as long as you're just forwarding a message. The change is now that there's a caveat: you must now (sometimes) make reasonable efforts to avoid forwarding illegal content.
As far as the constitution is concerned, there are no "normal responsibilities". The normal responsibility is for the state to not interfere with speech, everything beyond that is subject to the boundaries set by the constitution, not the other way around.
> Frankly: as long as facebook (e.g.) is more than a dumb pipe this isn't merely a question of getting the laws right; it's intrinsic. You can't exercise editorial control (which they do, even if it's sometimes algorithmic) without being responsible for what you decide to publish.
Either it is intrinsic or you have to get the laws right. Laws are what defines responsibilities, so I would say it's pretty obviously not intrinsic.
> So we get to choose: either platforms need to really be dumb pipes (i.e. facebook's entire model is essentially completely illegal), or they need to take responsibility for what they say, even if it's forwarded content.
... or they get to exercise editorial control of sorts without being held responsible.
I mean, there might be arguments in favor of that approach, but just claiming that it's so isn't one. Also, I think it's really unhelpful to insist on a binary distinction between "publisher" and "transport service". A newspaper was a publisher. The postal service was a transport service. Facebook really is just neither, and there is no reason why everything has to fit into one of these two categories. A car is not a new type of horse or a new type of factory. It's just a car.
> I think we'd have much more constructive speech online if platforms were forced to make such a choice: to the extent that they filter, they're responsible for what they let through.
I'm not so sure, I suspect that just leads to two extremes with nothing in between: Completely unmoderated exchanges where trolls reign and strictly regulated ones where nothing remotely controversial can be said.
Rather I would think that deletion/blocking should be required only when ordered by a judge (or possibly a sort of low-level judge for preliminary decisions on such matters), where you have legal options to defend your right to speak, and where the judge has no incentive to decide one way or the other. So, there could be a penalty for when Facebook doesn't follow the order, but not for when they don't recognize something illegal as illegal.
As to the intrinsic responsibilities of editorial control: Sure: legal obligations are what you define them to be. But escaping legal consequences doesn't mean you're not conceptually responsible; it simply means the law isn't perfect. And of course there are lots of ways you can choose to distribute responsibilities. So perhaps I should have said: I don't see a way for the considerable influence that media platforms have to be held accountable except if they are also responsible. The obvious alternatives are that that power is completely unchecked, which is harmful, or that somebody else dictates what they can do - which I kind of doubt will work in the first place, and even where it does, that means you better have a lot of faith in those controllers. Simpler and more reasonable for the media platforms to actually be responsible for what they say, and not maintain the fiction that they have no influence or effect on the messages they "transmit".
As to requiring a judge to get involved before censorship: I think that's actually worse than complete immunity. Involving a judge makes it appear as if there is control, and as if media platforms are held to some kind of unbiased or at least fair standard. But the whole point of a modern media platform is that you don't need manual control; and indeed you can never keep up with manual control. Algorithms will win against judges every single time, simply because no society can have that many judges (not to mention even judges aren't a panacea: fairness isn't trivial). Judicial interventions will necessarily cover only such tiny fractions as to be meaningless - and even if punishment and enforcement were scaled to absurd levels (with their own costs), well, media platforms are international behemoths. A judge may well rule some post to be illegal, but many posters will be beyond the judge's reach.
If media platforms get to exercise editorial control of sorts without being held responsible, then they will influence the debate in ways that undermine the very reason for freedom of speech to exist in the first place. I kind of expect that outcome is what will actually occur, and the consequences in the long run are hard to predict, but I'm guessing: dire.
The incentives for a media platform are all wrong. They want to push messages that encourage using the platform; and as such they're going to be enablers for needless tribalism, short, poorly-thought-out posts, for anybody who is willing to pay to influence regardless of the merits, for trying to shield users from being confronted with evidence that they're wrong, and more.
If you will: I think people have started to accept that the idea that metadata is somehow something completely different from data is false. Metadata is data, and similarly, the meta-control a media platform has over messaging is really huge. If anything: the fact that they don't actively edit the contents of the messages only grants them more power, because it makes the control they do have opaque and hard to judge the impact of. I don't mean to imply they're intentionally Machiavellian or whatever here: it's just that indirect, hard-to-observe control is really hard to value.
So, to go off on a tangent....
I sometimes wonder what the historical figures involved in the evolution of modern freedom of speech would think if they saw what actually happened. For one, we've grown to utterly fetishize the concept to the extent that rational thought has little to do with it. But simultaneously, it's like we're walking around with blinders on: we don't have freedom of speech, and never will be able to get it absolutely, and the idea that that would work, or was even the idea some long time past, is... well, not exactly founded in anything.
And then, of course, there's the fact that obviously times change, technology changes, society changes, and it's pretty crazy to think that concepts like freedom of speech are somehow impervious to all of that. What worked hundreds of years ago may actually need some tuning. Yet that idea isn't exactly well received in my experience, at least not in the US: thou shalt not question The Founders.
Ironically, there have been some pretty major changes to how we use freedom of speech, and in ways that make me feel like it's gotten worse. As if we're playing telephone/Chinese whispers, and people are going through the motions even if the motivation to do so would seem obviously lacking to those at the beginning of the chain.
There have been obvious changes such as copyright evolution; but also other changes, e.g. lobbying was for a time seen as fraud, not protected speech.
So on the one hand we cry about the inviolability of free speech; yet on the other we've changed what that means repeatedly, and the world has changed too. I don't think that's a healthy mix.
Social media (and more broadly the data-science era) are still young. So far, the way they have affected public debate is probably still relatively random. There may have been damage, but it's probably rarely been malicious. So whether you support the status quo - social media free from control and any obligations to the public - or a more constrained media somewhat depends on whether you think that's going to stay that way.
I'm not betting on it.
But what is "conceptual responsibility"? The thing is, responsibility is not really an aspect of reality, it is almost completely a social convention. Reality only shows you causal connections, and causal connections certainly are an important factor influencing how we distribute responsibility, but they do not determine responsibility in any simple way. Power plant operators are among the causes of people dying of electric shock. Does that mean that we consider them responsible? Generally no--because we as a society have decided that people running power plants is an overall benefit to society and we can prevent accidents in other ways and a whole bunch of other reasons.
So, yes, you can have opinions about how responsibilities should be distributed differently, but I don't see any sort of "inherent responsibilities".
> Simpler and more reasonable for the media platforms to actually be responsible for what they say, and not maintain the fiction that they have no influence or effect on the messages they "transmit".
Well, first of all, I would object to the idea that because you have influence, you should necessarily be responsible. Telcos also have a massive influence in principle. But, I would argue for good reasons, we have decided that they are not responsible.
Now, that doesn't mean that I think Facebook should not be held responsible for anything. But the problem that I see is that this kind of regulation does not hold Facebook responsible for what Facebook is saying/doing. It instead holds Facebook responsible for what other people are saying, with an incentive to prevent speech that would be perfectly legal (and ethically somewhere between fine and really important), while actually missing the goal (or is it even the goal?) of holding Facebook accountable for the power they are exercising.
> But the whole point of a modern media platform is that you don't need manual control; and indeed you can never keep up with manual control. Algorithms will win against judges every single time, simply because no society can have that many judges
Well, but what does that mean then? It's not obvious how judges could keep up, so let's forget about human rights?
> If anything: the fact that they don't actively edit the contents of the messages only grants them more power, because it makes the control they do have opaque and hard to judge the impact of. I don't mean to imply they're intentionally Machiavellian or whatever here: it's just that indirect, hard-to-observe control is really hard to value.
Yes, I agree. And that is precisely why I think this kind of regulation is dangerous. It causes collateral damage, and it doesn't even address the actual problem. Suppose this actually made hate speech go away. Does that change anything significant about the opaque and hard to judge power of these organizations? Or doesn't it rather just help hide it even better?
If you will, we need to restrict underhanded manipulative speech of Facebook, not hate speech of its users. Or at least, if we restrict hate speech of its users, we shouldn't do it by incentivising Facebook to exercise more underhanded manipulative power.
As to the human rights: there's no point in spending a lot of money deceiving ourselves; better not to pretend and face up to the fact there's a problem, than to delegate it to a legal system that will simply make matters worse. I'm not objecting in principle to a judicial solution; I just don't see how that has a hope in hell of doing more good than harm.
I agree this kind of law is likely to cause collateral damage, but I don't believe the collateral damage is going to be worse than the alternative. It's clearly not ideal, sure! But the point really should be: what's the alternative?
I don't think there are any simple, politically acceptable alternatives. I mean, you could try to turn them into common carriers or the like, or break up the companies, but that's all fairly extreme - try getting that past the lobbying.
The German Constitution is pretty long so it'd be really helpful if you could perhaps cite specific articles to provide some context for your assertions.
It is not, it is an administrative offence.
Also, yes, "but with the government" is exactly what makes it a criminal case!?
The private company is a publisher which is being held responsible for what it chooses to publish. Individual citizens have a right to hold and express their views, but not an automatic right to publication. If their views are so odious as to invite criminal sanction, it's entirely reasonable for them to incur extra costs/difficulty of publication.
Now this is where someone usually pipes up with 'but my freedoms!' and that's all very well, but if the freedom they wish to exercise involves proposing limits on the freedom of others (perhaps even by ending their lives) then that person is proposing a zero-sum view of freedom for others and cannot complain about having a zero-sum calculus imposed upon their own political expression.
Popper's paradox of tolerance may be explicable in terms of the boundary between positive and zero-sum games.
> Individual citizens have a right to hold and express their views, but not an automatic right to publication.
Well, yes, that is what that means. They have a right to not be hindered by the state in publishing their opinion (unless hate speech, essentially).
So, what's your point then? No judicial system possibly could, but Facebook can? Or Facebook can't either, so let's just ignore the constitution?
> More than that, it would be a textcase of privatising profits and leaving the costs to pick up for everyone else, while facebook siphons the profits to tax havens.
Wut? Taxing Facebook more in order to finance an independent judicial system that takes care of the legal cases involving Facebook is a textbook case of privatising profits and externalising costs? I'm not sure I follow!?
Also, I'm not sure it's really exactly correct to say that Facebook is the source of these costs, given that it's citizens who say illegal things. If people meet in a pub to commit Volksverhetzung, is the pub the source of the law enforcement costs?
> Wut? Taxing Facebook more in order to finance an independent judicial system that takes care of the legal cases involving Facebook is a textbook case of privatising profits and externalising costs? I'm not sure I follow!?
Perhaps I missed where you mentioned raising taxes for Facebook as a solution. That's an interesting idea. I see two possible problems with that:
1) Facebook sells ads. Other companies sell ads. How do we decide who pays the higher tax and who not, and make it fair? For example, Google (sans Youtube and the irrelevant G+) is not a social network, but can still spread disinfo, hate speech etc. Should they pay the tax too?
2) Even if we had the money for such a judicial system, there is still the question of effectiveness - the damage spreads very fast in this case, can we even design such a thing that provides good value for money while staying reasonably objective? And of course Facebook has the same problem now, as other private media (newspapers, TV stations) had before them - and the cheap solution seems to be to err on the side of caution. Are you sure we can do better?
I think it is correct, Facebook (and twitter, ...) extracts value from communication between citizens, and it looks like (e.g. from the Economist article mentioned above) it also creates negative externalities that did not exist before.
Given that this is about German law ... yeah!?
> I am sure they did not ignore it when writing this law, and if they did, someone will refer it to the BVerfG, which can then cancel the law.
Which is an unfortunate tendency in recent decades, yes. Normally, the BVerfG should be an emergency brake, not a method to create maximally invasive laws.
> Anyway, private media (as opposed to public media, such as the BBC, ARD etc where the rules are very different) are not bound by laws or constitution to publish everything anyone throws at them. The corporations and individuals are free to do anything that is not forbidden by laws. So Facebook is free to decide what to publish, unless it is forbidden by law.
Yes, but that is missing the wood for the trees (and also potentially not completely true).
Facebook and the like function as a sort of public space, in that people go there to express their opinions to the public, even if the space is privately owned. Now, the state could not just prevent people from speaking in public without a proper legal process as a matter of constitutional freedoms, and that applies no matter who owns the venue. The legal process exists to make sure that only speech that really falls within the bounds of what the legislature (and by extension the constitutional court) have deemed unacceptable is stopped, so as to protect your constitutional rights. If the state tried to just broadly stop any speech that isn't obviously legal, that would be a massive violation of constitutional rights.
Now, private entities in principle are not bound by the constitution, so Facebook is free to delete whatever they want. However, that does not mean that it is therefore legal for the state to incentivise private entities to act in a way that would be unconstitutional if the state did act that way itself, as that would effectively allow the constitution to be circumvented completely.

Imagine we got rid of the Strafgesetzbuch and instead created a law that entitles companies who imprison murderers to receive money from the state. Also, if you are such a company and fail to imprison a murderer, you are hit with a heavy fine. Without the StGB, the imprisonment by a private entity would be legal, and the private entity would not be bound by the constitution, so the constitution doesn't directly prevent them from imprisoning innocent people--so, everything is fine, right?

Well, except it's not. The state cannot hide behind private entities when creating rules that affect constitutional freedoms. In the case of the NetzDG, the state incentivises private entities to limit legal speech, and limiting legal speech is not something the state is constitutionally allowed to do. The fact that Facebook also is allowed to delete stuff as much as it wants is not really relevant here.
But also, even property rights are limited in effectively public spaces. If you open a shop to the general public, for example, you cannot just throw out people without a good reason. I think it's conceivable the BGH might come to a similar conclusion about public online spaces at some point.
> Perhaps I missed where you mentioned raising taxes for Facebook as a solution.
I didn't explicitly mention that, but my point was that if your opinion is that they should pay for the enforcement because they make money by causing the problem, then it's a complete non-sequitur to say that therefore, they should get to make the decisions.
And not only is it a non-sequitur, it's also the complete opposite to how the rule of law normally works in a state--namely, if you want to enforce laws, you have the state do it, and if you need money to pay for it, you tax people, and if you think that a particular class of people or actions is responsible for the enforcement costs, then you try to tax them specifically. The idea that you instead effectively outsource law enforcement to a private entity is a really strange one.
How exactly to tax them correctly? I dunno! What seems pretty obvious to me is that the way taxation of such companies works right now is broken, but I have no clue what the correct approach would be. Though I'm not sure we should have a special tax for social networks; they simply should pay taxes here based on how much money they make here. One purpose of taxes is precisely to pay for externalities that are difficult to account for individually, and that generally applies to all companies. Some cause wear to streets, some "cause" drunken people to have accidents, some "cause" people to commit Volksverhetzung. If we don't tax pubs specifically for drunk driving incidents, I'm not sure we should tax social networks specifically for hate speech incidents.
> Even if we had the money for such a judicial system, there is still the question of effectiveness - the damage spreads very fast in this case, can we even design such a thing that provides good value for money while staying reasonably objective? And of course Facebook has the same problem now, as other private media (newspapers, TV stations) had before them - and the cheap solution seems to be to err on the side of caution. Are you sure we can do better?
That is a very good question that is very hard to answer, and I suspect that the problem really needs a broader and more long-term approach than just criminal law enforcement, in particular including education.
But my primary point is that we shouldn't just ignore the constitution just because it's more convenient at the moment, and erring on the side of caution is very much the opposite of what the constitution requires when it comes to the state acting to limit constitutional freedoms. Or rather, as far as the constitution is concerned, the "side of caution" is the side of freedom, not the side of repression.
The obvious solution then would be to make it public space and not a private one. Make social networks a public utility. Then it can be governed without all the contradictions between the interest of public and private profit.
> If we don't tax pubs specifically for drunk driving incidents
However there is a special tax on alcohol (although I am not sure if this applies to all alcohol in Germany, even beer, in my country it does).
> But my primary point is that we shouldn't just ignore the constitution just because it's more convenient at the moment
Yes but what if the society is already damaged beyond repair when we finally do find a good solution?
Why does the social network have to be publicly owned for law enforcement to be "publicly owned"? Supermarkets are privately owned, but still, if I assault another customer, the case isn't decided by the legal department of the supermarket ... so what's the fundamental problem with applying that approach to social networks?
> However there is a special tax on alcohol (although I am not sure if this applies to all alcohol in Germany, even beer, in my country it does).
Well, sure, though I think the justification for that had little to do with drunk driving, given that those taxes are way older than cars ;-)
> Yes but what if the society is already damaged beyond repair when we finally do find a good solution?
Then we need to change the constitution?
I mean, the point of a constitution is that you don't just willy-nilly cross certain boundaries. You usually can still change it to a reasonable degree, but you need a large majority to do so, which should ensure that you have really thought about what you are doing.
Trampling all over the freedom rights of individuals in a panic over some perceived danger is exactly what a constitution is there to prevent, as history is full of cases where that didn't end well. There is no guarantee that that is always the right approach, but overall it seems to me that what groups of people do in a (moral) panic leads to way worse outcomes than letting some individuals do some not so nice things for a while, and constitutions exist to warn us of that danger, so we should be very careful with ignoring them.
Also, a state disregarding its constitution may be pretty damaging to society as well, as a lot of trust builds on that foundation.
I have to admit I moved the goalposts a bit. My thinking now goes like this: Many posts on Facebook are not of a strictly criminal nature, but are bad enough that they should not be public. And in the public sphere, there are many things that do not usually happen, that are frowned upon but not exactly criminal. Like defecating in public squares. Everyone could do that, but the absolute majority doesn't. And the reason we don't is peer pressure, public coercion, shame etc., not the threat of fines or prosecution.
Now Facebook is a bit like a rundown shopping mall with gaming parlors: it stinks a lot, but lots of people use it and many use it specifically for the smell, so Zuckerberg has no motive to clean it up (unless a law forces him, like now, but then we get this discussion). What if Facebook were instead a public square run by citizens themselves? There would have to be some executive structure of course, but it would answer to the users and not to shareholders and advertisers, and that would change the whole dynamic of this situation, including moral/ethical norms of behavior.
And my experience from helping to run a small but general social network site (few thousand users at peak, no ads, 14 years running now) is that eventually this would work out better, because 'concerned citizens' gain the upper hand over trolls and hatemongers and most of the rest just accepts the prevalent norms and behave like decent people, without the thing getting too restrictive. Of course I know this won't happen with facebook and at that scale it's a completely different game but still, it would be nice.
So, no, I absolutely would not encourage anyone to use Facebook, very much the opposite, but at the same time I think that constitutional protections of freedom of speech still do apply to people who choose to use Facebook when they are using Facebook, and this law seems to me to be in conflict with that freedom.
Every large corporation makes money off people using their services for illegal purposes, that's not exactly a reason to give them powers to decide legal cases against their customers, is it?
Facebook is free to do whatever they want on their platform. There are no legal cases involved.
Are you actually confused about the fact that Facebook is not actually a court, and thus by definition whatever Facebook does is not actually a "criminal case"? My point is not about whether a legal scholar would refer to what Facebook does as a "criminal case". My point is that it quacks like a criminal case and walks like a criminal case, and therefore it is a criminal case in all respects except for exactly one detail: it is handled by Facebook and not by a court. Which is exactly the problem that I am pointing out.
I just don't agree that "it quacks like a criminal case, it walks like a criminal case, therefore it is a criminal case", precisely because it is handled by Facebook and not by a court.
What's Facebook going to do? Fine you? Sentence you to Community Service? Send you to jail?
Maybe the crux of the matter is that I don't believe people have a constitutional right to be able to post on Facebook.
Limit your ability to speak in public.
> Maybe the crux of the matter is that I don't believe people have a constitutional right to be able to post on Facebook.
That is correct. But you have a constitutional right that the state does not prevent you from speaking in public (with relatively narrow exceptions), which includes the right that the state does not pressure/incentivise other people or corporations to limit your ability to speak in public. The fact that Facebook is allowed to prevent you from speaking does not mean that the state is allowed to tell them to stop you from speaking.
> Limit your ability to speak in public.
I disagree. I don't believe Facebook constitutes public speech. I'm open to being persuaded though.
I mean, not that I agree, this law is obviously targeted towards public speech, but I still wonder why you think targeting private speech would make it less of a problem?!
Public speech (the legal concept) does not require that you have an audience of millions, it simply means that you are speaking to an audience of people who are not in some way personal acquaintances.
It could be argued that Facebook should be infrastructure.
That's not how the law in general and constitutions in particular work!?
Constitutions don't generally say something like "citizens may speak freely". Rather, they say "the state may not limit citizens in speaking freely". And that is intentional. The constitution does not grant rights to citizens, it limits the rights of the state. It's not that the state has to somehow allow people to speak freely somewhere somehow. The whole point is that the state may just not interfere with people speaking freely. Never, nowhere, and under no circumstances. That is, except for exceptions specified in the constitution or where the constitution allows for exceptions to be created in normal law.
And apart from the fact that that just is how constitutions work: It's absolutely essential that they do work that way. A freedom the exercise of which is subject to arbitrary limitations is just not a freedom. Freedom means that you can do what you want to do, not what someone else wants you to do. Now, that does not mean that you cannot do what someone else wants you to do. But it's only a freedom if it's your choice.
In the case of freedom of speech, that means it's your choice where and when to speak. Now, as I said, constitutions only bind the state, so Facebook doesn't have to allow you to speak on their platform. But constitutions that contain freedom of speech make it illegal for the state to interfere with where and when people choose to speak, and that includes indirect interference where the state pressures Facebook to not allow someone to speak.
Just as freedom of movement doesn't mean the state has to allow you to move somewhere, freedom of speech doesn't mean the state has to allow you to speak somewhere. Freedom of movement means the state may not interfere with where or when you choose to move, and freedom of speech means the state may not interfere with where, when, or what you choose to say.
The constitution guarantees free speech only insofar it is not limited by "provisions of general laws, in provisions for the protection of young persons, and in the right to personal honour".
And http://www.gesetze-im-internet.de/englisch_stgb/englisch_stg... sets forth some pretty serious limitations.
Nowhere is it mentioned whether any of this happens in private or in public.
What you won't find is engineers of any kind, developers, or anyone with any kind of practical education.
Which means laws very often ignore practical considerations as to how they will be applied. An example often given is equality between men and women. From the 1980s up to 2005, laws were passed in various countries stating that married and unmarried couples are equal before the law. Including, of course, same-sex couples, or "couples" involving more than 2 people. I'm not complaining about that concept, but its implementation is emblematic of just how bad governments are at this.
1) Okay, they're equal. So when are 2 or more people a couple? Not specified, of course (courts seem to have settled on: when you share an address for a given amount of time, you're married. So Good News (tm)! You were legally married if you had a roommate in college).
2) Related: when are people not a couple anymore? Not specified.
3) What if these people separate? Are there any obligations? Not specified.
4) What if people are living together and one, unbeknownst to the other, incurs a great debt? Can that debt be held against the other (like for married couples)? Unspecified.
5) What about shared ownership? What about registering a property/house in both their names? Unspecified.
6) Does entering into a couple relationship (as in living at the same address for a while) give rise to obligations in case of separation? What about kids, adopted or otherwise? Does one party owe the other alimony if they separate in less than amicable circumstances? Unspecified.
7) What about all of tenancy law, family law, contract law, ... all of which applies ("applied") differently to couples and separate individuals? A million questions. All of which are unspecified. What if 3 people are living together and one adopts a kid? Do all 3 have custody now? How about separation?
Congress, however, merely wanted to look good voting a very generic law into existence. There was close to zero consideration of how that affected all existing systems, and even when problems became apparent, no reaction from parliament.
And if it's this bad for things that are directly in the domain of lawyers themselves, you can imagine just how bad it is for things they don't understand.
When a government passes a law prohibiting discrimination in employment on the basis of (say) gender identity, it essentially inherits the previous caselaw from previous employment discrimination acts - so if you know how not to discriminate against women, you have a pretty good idea how not to discriminate against transwomen.
Finally, quite a lot of changes are made piecemeal by caselaw anyway - that's the basis of the Anglo-American common law. So even as adoption law (say) evolves, it does so casewise, not generally by handing down a large edict from on high and breaking people's understanding of what the law is.
At least in the UK, parental responsibility (and, separately, custody) are well-defined, including the cases where there are zero, one, two, or more people holding it. Grandparents and friends raising children has, after all, been a thing for centuries. Ditto handling splitting partnerships, ditto handling unconventional domestic arrangements.
Ditto tenancy law. Ditto contract law. Particularly ditto shared ownership, which can even legally handle weird situations where people bought houses on behalf of other people who died, and...
I'm really not sure what basis your argument is actually on.
This breaks the HN guideline against name-calling in arguments. Please don't do that, regardless of how bad someone's argument is.
This is not to say that the law is fine. Its intentions, however, arguably are. To oppose this viewpoint, you would have to defend a world where victims of hate speech and libel are (erratically, but mostly) left without defense. From a very libertarian viewpoint, that may be an acceptable sacrifice, though.
Source? Because as a German I strongly disagree with that statement.
This is German legal culture: our legal system is first, foremost and mostly codified. And regulation is used more heavily than in the US legal system.
Note that the stated goal of the law is most probably the exact point: making access to effective defense of your rights possible for anyone. The reality consisted of slow law enforcement (which has to act against the people who made the speech in question, who are often anonymous or deny having done it), inaccessible data on the other party (for civil suits) and an opaque mechanism run by the corporations that did not follow the German legal system.
Which it completely and utterly fails at in the obvious way: your right to free speech is limited by a strong incentive for corporations to silence you when in doubt.
> The reality consisted of slow law enforcement (which has to act against people who made the speech in question, often anonymous or denying having done it)
And the solution to insufficient law enforcement is to have some private corporation do the job instead? I don't think that many people disagree that there was a problem, but the solution is terrible and creates lots of problems itself.
This is not about formal legal sanction, but about de-facto legal sanction. By definition, Facebook is only exercising its property rights. But that doesn't change the fact that they do limit perfectly legal speech as a result of a law that forces them to decide whether a given speech act is illegal, thus putting them in the de-facto role of a court, just with completely skewed incentives.
If the same law said that a judge would have to pay heavy fines if they didn't order the deletion of "obviously illegal speech", that would incentivise judges to not respect the constitutional rights of citizens, right? And would therefore probably be unconstitutional, right? Now, this law has effectively the exact same effect, by simply making Facebook the de-facto judge with that incentive, just with the added problem that the people making the decisions have no clue of the law.
Now that I have answered your points, please do me the courtesy of answering the question I posed above instead of deflecting it.
Well, I am not sure I completely agree (it is obviously substantially different from traditional publishers, but at the same time it's substantially different from mere printers or telcos, so I don't think it's an obvious classification), but sure, let's say that is the case.
> You complain that the people making the decisions have no clue of the law, but nothing stops Facebook hiring German staff with the requisite legal knowledge to discharge that obligation, or alternatively declining to accept connections from German IP addresses.
Which doesn't change that they in fact don't, because they have no incentive to, and that this is the behaviour that is to be expected given the incentives created by the law. The law does not create an incentive for legally educated decisions that preserve constitutional rights, the law creates an incentive to remove illegal content. The state can not hide behind private entities by pretending that the incentives it creates have nothing to do with how those private entities act. Facebook is not bound by the constitution, so they have no reason to hire legal experts to make these decisions, they will simply follow the law. It's the responsibility of the state to not incentivise private entities to violate constitutional freedoms, not the other way around.
> Now that I have answered your points, please do me the courtesy of answering the question I posed above instead of deflecting it.
I don't think I have deflected any question, rather your questions were missing the point. I don't agree with a private corporation deciding what for all intents and purposes are criminal cases and limiting constitutional freedoms based on incentives created by a questionable law. None of that implies that anyone has in fact declared Facebook a court or that Volksverhetzung (incitement to hatred) should not be prosecuted--it simply means that Facebook is not the entity that should be incentivised by the state to make decisions that limit constitutional freedoms, decisions that would be obviously unacceptable if the state made them itself.
If you think there still is an unanswered question, please let me know what it is.
You might feel that Facebook is obliged to choose the lowest-cost option on behalf of its shareholders. This isn't the case, legally. Conversely, Germans who want to express controversial views that might be defensible and don't want to just have them auto-deleted can always seek out another publisher or publish themselves on a blog. When I say people don't have an automatic right to publication, I mean they don't have the right to have a particular publisher host their controversial content. Presumably you would agree that Facebook is within its rights when it declines to host pornography, for example, even though I think the community standards it cites are prudish to the point of ridicule (e.g. banning people for photographs of statues depicting nude figures in art museums).
I am not sure I would categorize Facebook as just a publisher, but anyway ...
> You're blaming the state for Facebook's commercial decision to use blanket deletion.
No, I am not. I am blaming the state for creating the incentive for that decision. The fact that Facebook could decide to act differently does not change that the state's actions are what incentivises them to act this way. The fact that Facebook is morally to blame for their behaviour is simply not relevant to the question of whether the state is incentivising a violation of constitutional freedoms. That is precisely what I mean by "the state can not hide behind private entities". The state cannot incentivise a private entity to act in a certain predictable way and then pretend their incentive has nothing to do with that entity's actions just because that entity would have the legal but risky and/or costly option of acting in a way that would not affect constitutional freedoms.
For an analogy I constructed to demonstrate maybe more clearly the problems with that approach, see also the first few paragraphs here:
> You might feel that Facebook is obliged to choose the lowest-cost option on behalf of its shareholders. This isn't the case, legally.
Which is irrelevant, see above. No, they are not obliged to. But they can. And the state has to consider the fact that they can when it creates incentives.
> Conversely, Germans who want to express controversial views that might be defensible and don't want to just have them auto-deleted can always seek out another publisher or publish themselves on a blog.
Which is also irrelevant to the question of constitutional freedoms. For one, the same incentive applies to all "social networks", so potentially people's constitutional freedoms are at risk on all such platforms, but also, and probably more importantly: A constitutional freedom applies globally. It's not that the state has to allow you to speak somewhere, it's that the state may not interfere with you speaking (obviously within the limits established by further laws, as far as the constitution allows for such laws, and also within the limits due to collision with other constitutional rights, such as property rights, but that's not the point here). "We only prevented him from speaking on the market square, but he was free to speak on the corner of his street" is not a sufficient justification for it not being a restriction of a constitutional freedom. It's not up to the state to decide where I can speak, neither directly nor indirectly.
Also, it's not just that that is the established understanding, it's an absolute necessity: The freedom to speak your opinion is completely void if you cannot choose your audience. It is not about you being free to make an utterance, it is about you being free to make yourself heard. Just as freedom of movement is not about being allowed to move a bit, it is about you being free to choose where you go, and in particular to go and live with people you want to live with.
> When I say people don't have an automatic right to publication, I mean they don't have the right to have a particular publisher host their controversial content. Presumably you would agree that Facebook is within its rights when it declines to host pornography, for example, even though I think the community standards' it cites are prudish to the point of ridicule (eg banning people for photographs of statues depicting nude figures in art museums).
Yes, but none of that is relevant to the question of whether the state is allowed to incentivise Facebook one way or the other. Constitutional freedoms restrict how the state may interfere in the lives of people. The fact that Facebook could choose to act the exact same way of its own accord is irrelevant to the question of whether the state may incentivise it to behave this way.
Any private landlord could legally decide to not rent their house to people with glasses. That does not mean it would be constitutionally acceptable if the state created laws that incentivised landlords to not rent to people with glasses. Just because people are allowed to behave a certain way does not make it necessarily constitutional for the state to create incentives to behave that way.
Exactly. There was no need for this new internet censorship law at all, for they could've just started applying existing hate speech laws.
Instead they introduced this new shady law, which essentially turns social network companies into a part of the judicial branch.
The situation now is exactly as you have described it -- the companies fear the German fines and act overzealously.
Perhaps an outsourced, overzealous judicial branch is exactly what the German government wanted? As they're free of the restraints that bind the official judicial branch.
The new law (NetzDG, roughly "Network Enforcement Act") was intended to facilitate applying existing laws against illegal content, by mandating that every site with > 2 million users needs to have a clear procedure for reporting illegal content.
Unfortunately, there are only fines for failing to delete obviously illegal content and none for just deleting everything that is reported and not obviously legal. The law makes provisions for handing over tricky cases to an external agency, but expected the companies to jointly set it up themselves ("regulated self-regulation"), which they didn't do; likely because the people checking reports at such an agency are legally mandated to be competent.
This is not the case, there is no fine for failing to delete obviously illegal content.
The effect is the same: the easiest way to implement the requirements in the law is to default to deletion.
The NetzDG is bad law, and will probably be repealed before the Constitutional Court rules it unconstitutional.
But it was a reaction to social networks disobeying the law. An overreaction, yes, but a reaction.
I wish it were more commonly known in libertarian and net freedom circles that the pendulum swings both ways. After excesses in one direction it will swing far into the other direction.
Maybe keep that in mind when pursuing extreme goals at any cost, instead of finding a middle way that works for everyone.
I think the NetzDG is crap and won't stay for long. But I also think that Facebook and Twitter deserve all this pain.
Really, I don't see many people in "net freedom circles" who object to forcing Facebook to delete illegal posts. The objection is rather to (a) deleting stuff instead of prosecuting the perpetrators, (b) hiding (censoring) stuff that's still there, and (c) making parties other than the judicial system responsible for limiting speech.
But, yeah, sure, I don't care about Facebook or Twitter either.
And I see virtually everyone in the net freedom circles objecting to FB deleting illegal posts. Because our laws are unjust, you see. Only American laws count.
We really need to teach Facebook, Twitter, Uber, AirBNB and all those companies some manners and respect for our society.
Maybe you should entertain the idea that they are? It wouldn't be the first time. And what if every country had the same long arm attitude of pushing their opinions on foreign companies?
As a German, let me tell you that this sounds absolutely ridiculous, arrogant and hubristic. You want your German Extrawurst (special treatment). We're not "one society", not everyone agrees with the tons of laws and regulations we have (mostly on account of special pleading and lobbyism).
German users came to Facebook and Twitter first, not the other way around. These are not German companies, they don't need to operate out of Germany, so why should they have to follow our every law? If anything, you'd be "teaching" them not to bother with Germany, because it's too much of a pain in the ass.
If Facebook and co. don't want to operate in Germany, they are free to withdraw from that market. They clearly target Germany and there is a lot of money to be made here. If they want that - fine but please respect that we play by our rules here.
No, we don't. Who taught you this nonsense? In fact, it is our duty as citizens to fight bad laws, no matter if they were "democratically chosen" or not. In Germany, we don't choose laws democratically, either. We elect representatives who may or may not do a good job at legislation, and it's our duty to scrutinize that.
Don't forget that Hitler was elected democratically and pretty much everything that happened from thereon was legal under German law.
Please don't cross into name-calling and personal attack when posting to HN. This is in the site guidelines. So is the following:
Comments should get more civil and substantive, not less, as a topic gets more divisive.
Art. 20 II of our constitution. That's the philosophical foundation of how our state works. Every state action is derived from that principle. That is of course not nonsense, but thankfully goes absolutely unchallenged by everybody who teaches you about our state.
Not every state action is good, some are even objectively bad. Of course working against such is a civic duty. But wanting to enforce a law against corporations that interacts with and clearly influences people here is neither ridiculous, nor arrogant or hubristic.
> Don't forget that Hitler was elected democratically and pretty much everything that happened from thereon was legal under German law.
Well, apart from basically all the things that granted that period a special place in every history book hopefully forever.
We integrated various safeguards into our constitution and laws to ensure that something like that never happens again. This law (no matter how bad) is a piece in that puzzle and should be viewed as such.
What you say does not follow from that article - you are mistaking an "ought" for an "is"! If our duty was to do no more than "respect and enforce" laws, how is it that the supreme court declares laws unconstitutional on a fairly regular basis? An unconstitutional law is best left unenforced and disrespected, is it not?
>> Well, apart from basically all the things that granted that period a special place in every history book hopefully forever.
No special pleading allowed! Besides, there are countless less extreme examples of "democratic processes" leading to unfavorable outcomes. It's just not a strong argument for anything. Witch burning was a grassroots democratic movement, if you want to look at it that way!
>> We integrated various safeguards into our constitution and laws to ensure that something like that never happens again. This law (no matter how bad) is a piece in that puzzle and should be viewed as such.
The road to hell is paved with good intentions. If the previous N laws to "ensure that something like that never happens again" don't, in fact, ensure that "something like that never happens again", then why would the N+1th law make any difference? The same process that created this law can be used to repeal it. Its very existence gives strength to the far right.
Now, that is an interesting interpretation of the rule of law if I have ever seen one!
So, in your opinion, having to follow the laws of a country you visit or that you do business in is somehow illegitimate because it's an Extrawurst? Is it your expectation that the US will apply German law to you when you visit the US?
> These are not German companies, they don't need to operate out of Germany,
> so why should they have to follow our every law?
Because they do operate in Germany?
> If anything, you'd be "teaching" them not to bother with Germany, because it's too much of a pain in the ass.
So, we shouldn't apply laws because that could be teaching people to not bother with Germany? So I suppose you think we shouldn't apply laws at all, then? It's not like Germans couldn't end up learning that they should leave the country if they have to follow laws, is it?
That isn't an "interpretation of the rule of law", that's a response to the bold statement that "We really need to teach [...] all those companies some manners and respect for our society".
Yes, we can introduce all sorts of stupid laws to make foreign companies do as we please or have them leave. No, we're not "teaching" them "manners and respect" for our society. We killed millions of people, not even a hundred years ago, but now we're the paragons of virtue? Give me a fucking break...
>> Because they do operate in Germany?
Clearly whatever they did was legal in however way they operate "in Germany", or there would have been a German court case.
Actually, what they did was not legal. Not appearing in court when a judge orders you to is not legal. It just so happens that it's not enforceable when the person ordered to appear happens to be in a different country. In order to limit the negative effects of this illegal behaviour, now more of their behaviour has been made illegal in such a way that if they don't follow those rules, it can actually be enforced.
That's not actually true.
I don't see how this was relevant in any way to the discussion. What happened 100 years ago should of course be considered in discussions and in the law-making process today and it should not be forgotten, so that it will not repeat. But it isn't fair to still judge people born in a country where some of their grand(grand)parents did bad things.
Or more simply, 'ignore laws that aim at preventing a repeat of a crime because of the commission of that crime.'
I'm not talking about this fantastic new hate-speech law per se, I'm talking about the broader attitude in Germany where we just expect foreign business to gratefully deal with all of our bullshit, because we really are a bunch of narrow-headed cultural hegemonists.
Imagine if Turkey asked Facebook to enforce Article 301 of the Turkish Penal Code. The reaction in Germany would be widespread ridicule, probably by some of the same people that demand this new law enforced because it's German law.
Then you should have made your general point at the outset. Surely it's clear that the law applies to people providing services to consumers in Germany.
Imagine if Turkey asked Facebook to enforce Article 301 of the Turkish Penal Code.
In Turkey? OK I guess. I haven't bothered to look up Article 301 or even whether it exists since I presume you're making a rhetorical point.
I have a problem with how they want to make money here: illegally.
And yes, I want rules to be decided democratically. Not by "what does hardwires think about it?".
So? Now, there is a law, and so what was legal before is not anymore. That is the point of creating new laws.
> They can still make money off a German audience without any business presence in Germany. Legally, from the US.
Where you are acting from does not have any influence on legality. If Germany has a law that makes what someone does in the US illegal, then it's illegal. Acting from the US only limits enforceability. If Facebook has no presence or business in Germany, Germany cannot do anything to enforce that law against Facebook. But it seems like Facebook is of the opinion that they have enough business in Germany that it's worth it for them to comply with the laws to be able to keep that business.
> Advertisers don't all discriminate.
Well, but most advertisers that want to advertise to a German audience are from Germany. If you want to be paid by advertisers from Germany, you better follow German laws, or Germany will seize the funds your German advertisers are paying you to pay fines or other debts that you owe in Germany.
That's my whole point: it was legal before and they weren't "making money illegally" (as the parent claimed), hence the need for new laws.
>> Where you are acting from does not have any influence on legality. If Germany has a law that makes what someone does in the US illegal, then it's illegal.
It's not illegal in the US, from which they are operating. That's what matters. Close to 100% of porn sites are probably doing things illegally from a German legislative point of view. They're making money off of horny Germany nonetheless. So, the idea that our laws can "teach" them is ludicrous.
>> But it seems like Facebook is of the opinion that they have enough Business in Germany that it's worth for them to comply with the laws to be able to keep that business.
Yes, they chose that tradeoff over leaving the market, because at the end of the day they don't give a fuck. They did the same for China. They are probably working on AI-based deletion bots as we speak. What a great success, eh?
>> Well, but most advertisers that want to advertise to a German audience are from Germany. If you want to be paid by advertisers from Germany, you better follow German laws, or Germany will seize the funds your German advertisers are paying you to pay fines or other debts that you owe in Germany.
Oh really? When has that ever happened? If I turn off my adblock, most of what I get is international brands (YMMV), so even if all the German advertisers were gone, it would be a drop in the bucket. Maybe a big enough drop for the company to care, maybe not, but the point remains that the users stand to lose more here than any of the companies involved.
Yeah, once upon a time, murder was legal, and people weren't "murdering illegally", so why the need for a law against murder?
> It's not illegal in the US, from which they are operating. That's what matters.
Well, if you aren't interested in reality, why are we even discussing?
> Close to 100% of porn sites are probably doing things illegally from a German legislative point of view. They're making money off of horny Germany nonetheless.
German law itself limits its geographic applicability, and therefore that limit can be changed just as any other rule in the law.
The POINT is that the parent claimed Facebook was acting illegally; my point is they were not! There's no argument against the law itself to be found in that, so you can stop trying.
>> Well, if you aren't interested in reality, why are we even discussing?
What are you talking about? I am discussing reality. You must be missing the point again, which is the following: Even if something is illegal in Germany, you can not enforce German law on every company that stands to profit from a German audience. Therefore, if you want to use German law to "teach" those companies, you're out of luck.
As a German myself, I am very glad about this, because (for example) I don't have to wait until 10PM and verify my identity to consume government-sanctioned pornography, as German law would require. You don't see a lot of people asking for respect and enforcement of those laws, yet as soon as it comes to "hate speech" the nanny-state-conformists come out of the woodwork...
>> German law itself limits its geographic applicability, and therefore that limit can be changed just as any other rules in the law.
The porn law applies in the same way that the hate speech laws apply, insofar as they're dealing with German "customers".
Except they weren't? Not respecting the law is not the same as acting illegally. If a Chinese manufacturer exports toxic food to Germany, is that illegal (according to German law)? For all I know, it's not. Does that mean that they are respecting German food safety regulations? No. Would it still be a problem if Chinese manufacturers exported toxic food to Germany? Yes. Does the fact that exporting toxic food to Germany is not illegal mean that it's not legitimate for Germany to try and find ways to prevent export of toxic food to Germany using its legal system?
> Even if something is illegal in Germany, you can not enforce German law on every company that stands to profit from a German audience.
Well, true? How is that relevant to Facebook, say? As far as I know, Facebook is not every company?
> Therefore, if you want to use German law to "teach" those companies, you're out of luck.
So, you are saying the German legislative effort known as NetzDG is not having any effect on how Facebook behaves?
> As a German myself, I am very glad about this, because (for example) I don't have to wait until 10PM and verify my identity to consume government-sanctioned pornography, as German law would require.
So, because you disagree with a specific law, you think the German state should not enforce any laws on foreign corporations? Because you think that German limits on porn are idiotic, you think an American bank defrauding German customers should not be held accountable, say?
> You don't see a lot of people asking for respect and enforcement of those laws, yet as soon as it comes to "hate speech" the nanny-state-conformists come out of the woodwork...
Which has what to do with whether laws should apply to foreign companies exactly? Are you seeing a lot of people asking for enforcement of porn laws against German companies?
> The porn law applies in the same way that the hate speech laws apply, insofar as they're dealing with German "customers".
I'm not sure it does, but in any case: Is it really a good state of affairs that it's enforced selectively-ish? It's not exactly protecting the constitutional rights of porn stars and porn producers in Germany, is it? Sure, overall, it might be better that at least foreign porn is accessible unrestricted than none at all, but is that an argument for selective enforcement, or not rather against restrictions of porn?
> Yes, they chose that tradeoff over leaving the market
They don't want to leave the market because the market is where the revenue stream originates. Markets are local; the location of the HQ is irrelevant.
Intent never matters. What matters is what happens in reality. The road to hell is paved with good intentions.
We had very strong censorship systems built by the government, but did it stop hate speech? No.
In fact, if people can only post and receive filtered information, you are actually pushing them into an echo chamber. And an echo chamber is a very nice place for hate speech to ferment (I'm not saying it will, but it could).
So in my opinion, hate speech is bad, but censorship is worse.
The problem must be solved in a different way, without introducing censorship.
Do you know how strict those laws are? I mean, you just said that, albeit in a quotation, so I imagine that at least is out of the scope of the law? Or would it be something that got you in trouble if someone reported it?
No, quoting an example of Volksverhetzung is not illegal, if you're not actually attempting to incite hatred.
I can give you a counter-example (although from France, where we have similar laws). Patrick Sébastien, a well-known French humorist, has been convicted for exactly that (he impersonated Jean-Marie Le Pen, the far-right politician, in a satirical song, "casser du noir").
Sorry but I can only find a reference to a French article http://www.liberation.fr/medias/1996/03/13/patrick-sebastien...
The judges concluded that this was indeed incitement to racial hatred, which is illegal in France.
What you seem to be taking about is the new one, which breaks many things in the hopeless attempt to maximise enforcement of the original laws. This is a recurring theme in lawmaking: everything has a certain maximum level of reasonable enforceability, and when laws are written to increase enforcement beyond that limit they universally turn out bad.
On the other hand, the law got there because sites like Facebook - and yes, also Heise - do a questionable job of handling racist content. And there are tools beyond filters, e.g. user ratings, real names on social platforms, etc. Even Facebook themselves state that fake news (which is not necessarily racist) poses a problem to our society. I mean... HN has such a simple solution for this, the rating button; why is Facebook not able to come up with something comparable?
And yes... the article above states that such posts correlate with physical violence. The state kind of pulled the emergency brake.
> "kill all .. they are ..."), ... The internet is one of the last places
> where people have some sort of freedom of opinion/listening to
> different opinions
This is not an opinion. This is a request or a command. Sorry, but writing this is ethically a complete no-go.
Simple indeed. This rating system must be why it is so rare to see objectionable content on social media like Reddit.
Reddit implements only a rating button as far as I know. (Sorry, not a regular user, but I bet it has one or two other mechanisms to prevent people from posting complete trash regularly.)
Actually, before Facebook became popular, almost all social media platforms were practically anonymous. In the middle of the 90s this worked quite well, IMHO, because only a few people had access anyway. As access spread, you could see forums, chats, etc. become trashier. Facebook came up with real-identity profiles (real name, picture, etc.), so the quality of content and contacts got exceptionally high for social media.
I guess another leap in this direction might become necessary. Not sure what this would be, but if we don't want the state to babysit our social networks, the site operators had better come up with innovations in that area.
I have only skimmed the paper, but I am a bit skeptical about their conclusions. The authors first showed that the attacks-on-refugees time series is strongly correlated with the anti-refugee-posts time series. This part is very basic and is OK.
Then they tried to exhibit a causal relationship between the two series, namely that the trend of posts drives the trend of attacks and not the other way around. This is where I have doubts. Apparently, the authors didn't try something basic first --- like a Granger causality test --- but applied a somewhat obscure regression procedure to establish causality.
This procedure, which they call a Bartik-type approach, was only invented in 1991. Now, I'm not familiar enough with time series analysis to know the merits of this Bartik-type approach or why the Granger test is not useful in this case, but from the graph in the Economist article it seems that the trend of attacks is leading the trend of posts, so I wonder if they deliberately omitted a Granger causality test that might refute their conclusions.
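Since the thread brings up the Granger causality test, here's a minimal from-scratch sketch of what that test actually computes. This is only an illustration of the baseline test the comment asks about, not the paper's Bartik-type procedure, and the synthetic series and coefficients below are made up for demonstration:

```python
import numpy as np

def granger_f(y, x, lags=2):
    """F-statistic: do lagged values of x help predict y beyond y's own lags?"""
    n = len(y)
    rows = n - lags
    target = y[lags:]

    def lagmat(s):
        # Column j holds the series shifted back by j steps (j = 1..lags).
        return np.column_stack([s[lags - j:n - j] for j in range(1, lags + 1)])

    def rss(X):
        beta, *_ = np.linalg.lstsq(X, target, rcond=None)
        resid = target - X @ beta
        return resid @ resid

    ones = np.ones((rows, 1))
    rss_restricted = rss(np.hstack([ones, lagmat(y)]))  # y's own lags only
    X_full = np.hstack([ones, lagmat(y), lagmat(x)])    # plus x's lags
    rss_full = rss(X_full)
    df = rows - X_full.shape[1]
    return ((rss_restricted - rss_full) / lags) / (rss_full / df)

# Synthetic example: x leads y by one step, but not the other way around.
rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.3 * rng.normal()

print(granger_f(y, x))  # large F: x's past helps predict y
print(granger_f(x, y))  # small F: y's past adds nothing for x
```

The restricted model predicts y from its own lags; the full model adds lags of x; the F-statistic measures how much the extra regressors reduce the residual sum of squares. A real analysis would of course use a tested implementation (e.g. statsmodels' `grangercausalitytests`) rather than this sketch.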
It's just a popular correlation test in the field of autocorrelated time-series.
A correlation between physical violence and negative verbal sentiment doesn't seem that surprising. That talk on an online forum precedes violence is plausible.
Claiming talk online /causes/ violence is a little surprising and has dangerous political implications, i.e., it makes a case for targeting the speech of the AfD, which polled a respectable ~10% in the last German federal election.
I don't agree there is a distinction to be made here. The real world subsumes social media. Certainly violent crimes such as arson and assault are different, and more severe, than hateful or abusive posts or direct messages. Neither category is less real.
I think the context is that "real world" is supposed to mean it has a physical effect. The implication is that physical crimes are legitimately categorized as crimes, while speech crimes are not exactly legitimate (depending on where you say them, words might be criminal).
I believe that words that merely make someone or some group feel bad (intrinsically, not as libel) are not a real harm in need of remedy. Even if it is something that is known to make someone feel bad, I wouldn't consider it legitimate grounds for restriction. It's always subjective, and I don't subscribe to that kind of only-use-doublespeak-watchdog thinking.
It's not about feelings.
It's about shaping other people's opinions.
If someone says "jack9 is a kiddy fiddler" the damage is not that it might make jack9 upset, the damage is that others might think it's true and take actions that harm him.
If someone says "kill the kikes" the damage is not that Jewish people might be upset, the damage is that others might agree and commit violence against Jewish people.
It doesn't even have to be a convincing argument, it just has to give other people the impression their hatred is socially acceptable and their violence has public support.
Hate speech is like broken windows: the more a certain hateful sentiment is tolerated in society, the more socially acceptable that sentiment becomes, the more likely it becomes someone will act on it and that society will tolerate those actions too.
Hitler's rise to power was not a coup d'etat. Hitler didn't overthrow the government by killing generals and politicians and installing himself as leader. Hitler rose to power by manipulating people's sentiments with language. He created scapegoats and convinced people it's okay to channel their hate and frustration towards those scapegoats. When the Nazis first called for people to be killed or that their neighbors were inferior, few thought they meant it, but saying it over and over normalised the idea. They dehumanised their victims and their future enemies.
A lot of that rhetoric is normal in many places today and Trump's campaign wasn't even a special example. The rhetoric after 9/11 made Germans feel icky because it was so familiar but even then it wasn't new. Americans worry about racism and sexism, Germans worry about xenophobia -- hatred of the foreign, the different, the "other".
While we're going full Godwin: Nazi Germany styled itself the victim of the victors of WW1 and of socialist traitors. Germany's entry into WW2 was literally framed as the victim of Polish aggression. None of that is really true if you look at it objectively, of course, but that was the language the Nazis used at the time.
If the only thing making your behaviour morally righteous is your victimhood, maybe your behaviour isn't righteous at all.
I'm socialist-libertarian myself but that doesn't mean I tolerate the same damaging behaviour from "my" side.
The same people in the US who are scared of their government taking stances like "man-made climate change is real", "same-sex couples should be treated the same" or "maybe some people shouldn't own firearms" are the same people who comfortably arm the police to the teeth, send American soldiers to countries they wouldn't be able to find on a map, perpetually increase military spending, and demand the government regulates what people can do in their bedrooms or what public toilets they can use.
Americans aren't afraid of government getting too much power. Americans are afraid of government taking on positions they don't personally agree with and partisan thinking has led them to believe that "the others" are one term away from ruining the Greatest Nation on Earth and to think that as long as "their team" wins on election day, everything will be perfect.
People should have the freedom to be persuaded to believe things that you don't agree with. It's OK to have a "wrong" opinion. Let people say what they want and believe what they want. I'm stunned that I still have to defend freedom of expression.
So I disagree. It isn’t right to form a gang on the street and attack someone, and it isn’t right to form a gang on the internet and incite violence, or strongly disrespect a person or group. Even if you accept, as a first step, the distinction between actual physical harm and a ‘bytes on the internet’ approach, the mental-health harms to the persecuted group are real and have led to suicides in individual cases, or to actual violence as demonstrated in the article.
Speaking from a medical perspective, I don’t draw a difference between physical and mental harm (precious snowflaking aside).
Sure, it’s a grey area, but carte blanche in this domain is damaging.
If Joe Blow tweets "kill the kikes", nobody cares.
If someone with millions of devout followers says the same, there will eventually be blood.
Or in your case, if a person with influence tells their fan club to harass someone, the victim may suffer actual mental health issues when enough of them act on it.
The whole idea seems to be: murder and genocide are so bad, that any legislation that either statistically reduces them, or even just seems like it might do, is worth it. I'll grant that I can see how a reasonable person might think this. Then it's a question of values. Remember that one of our great American heroes literally said, "Give me liberty, or give me death," immediately after doubting that life is dear enough, or peace sweet enough, in themselves.
As far as that goes, I'm old and it saddens me that we keep going from one oppressive cultural regime to another. When I was young, there were also things you couldn't say, they were just different things. But I will say that this is the first time in my life that freedom of speech itself has been explicitly challenged by a significant bloc in the US.
You're misunderstanding what the idea is. The idea is to show german children:
That person has that opinion and they may think it, but the opinion is wrong. We had a huge argument over it (some people call it WW2), and the result was: their opinion is wrong. So we decided to let that be the last argument about it and not let them start another one, because the question has been answered at length.
I was referencing one of the original comments made by a German far-right politician on Facebook that started the debate about the responsibility of platforms like Facebook in moderating hate speech which in turn resulted in the law the article describes:
It's not a slippery slope in this case. I admit it's a bit unfair to reduce psyc's argument to this but the language the law covers is entirely in the category of "people openly demanding that certain other people be killed or subjected to violence". Facebook dragged their feet to moderate comments like these and the law is a response to create a legal tool to force companies like Facebook to moderate hate speech or face serious fines.
I guess I can see why Americans would object to it: political hate speech is weaponised speech and Americans are ridiculously obsessed with being allowed to own weapons. It's that absurd idea that when the time comes and the others have infiltrated the government, the common man will take to arms and overthrow it (because presumably the military will defect and not just kill everyone).
It's true, I'm advocating some limits on people's freedoms. I don't think people should enjoy the freedom to launch campaigns of genocide, for example.
The Treaty of Versailles was crippling the economy, the royalists felt betrayed by the social democrats who ended the Great War by surrendering, the Great Depression destabilised the banks, and so on. Even a bullet in the brain wouldn't necessarily have "stopped Hitler" at the time (the reason we talk about "stopping Hitler" is that Hitler wasn't stopped so we don't know who'd have taken his place and where that would have led the movement at the time).
But this is irrelevant because 2010s Germans aren't 1920s/1930s Germans. West Germany underwent a massive process of Denazification, we entered into and established ourselves in the European Union and Germans who grew up in post-war Germany are used to the idea that incitement is a crime and racial hatred has no place in a civilised society.
It's irrelevant whether criminalising incitement ("censorship") would have stopped Hitler in the 1930s because a single law doesn't instantly transform society. It's 2018 and this is not a new law. But we'll never be able to tell whether it stopped Hitler today because if we stopped him we won't know who he is because he'll never ascend to the same significance in history in the first place. It's a strawman.
You also seem to be under the false presumption that you having legal "freedom of speech" means you can express anything you want without consequences. As is being pointed out all the time, freedom of speech doesn't mean freedom from consequences, just freedom from direct legal consequences.
If you hold no significant political power and say something other people find disagreeable, you will be called out at best and ostracized at worst. In fact, freedom of speech might actually help those seeking to harm you because they have the righteous fury on their side.
If you hold significant power, there are plenty of examples in US politics how that usually works out (though time will tell how it works out for Trump in particular).
Fun side effect: dehumanising people also makes it easier to justify war. The US has been in a perpetual state of war since WW2. Do you even remember how to recognise propaganda and war rhetoric if it's been part of the mainstream since before you were born?
You seem to understand that speech can persuade people of wrong things. Do you think it's possible you've been wrongly persuaded to hold the belief that free speech trumps all?
Maybe the same people that persuade you that you need free speech above all are also those that have the most to gain from being able to persuade you freely of whatever they want?
If you think incitement should not be a crime, and likely think libel and slander should not be crimes, where do you draw the line if free speech trumps all? What about lying under oath?
I'm going to guess (based on your name and the demographics of HN) you're American. Your nation's history is entirely written by itself and its only revolution was a secession which it was able to defeat.
Maybe your country just hasn't learned the lesson Germany has?
Just a thought: https://thenib.com/the-good-war
I don't think libel and slander should not be crimes, I don't think lying under oath should not be a crime. I believe in freedom of expression.
I especially encourage those people with the absolute worst most harmful ideas to speak often and loudly and publicly, so that everyone has a chance to hear their ideas and think about them and come to understand for themselves how bad they are.
Freedom of expression means I can sit back and let Westboro Baptist Church explain to everyone what terrible people they are much more eloquently than I could ever hope to do. Neo Nazis, please, have a rally and explain to everyone in your own words that you latch onto pathetic ideas of racial superiority for lack of any accomplishments of your own to be proud of.
Why do you want to restrict freedom of expression? Do you find it hard to refute somebody's argument? Do you not trust people to sort out good ideas from bad themselves? The solution to bad speech is more speech. I thought we'd settled this decades ago.
But history is replete with examples of people not doing that, and instead going ahead and implementing the really terrible idea. The Holocaust. The Killing Fields. The Rwandan genocide.
Neo Nazis, please, have a rally and explain to everyone in your own words that you latch onto pathetic ideas of racial superiority for lack of any accomplishments of your own to be proud of.
Some neo-nazis in the UK yesterday attempted to perform a 'citizen's arrest' on the mayor of London, complaining about his being a Muslim and about his saying rude things about Donald Trump. Eventually the police showed up, declined to assist, and they were laughed out of the room.
It wasn't realized until later that they'd brought along a mobile gallows and parked it outside. I'll give you three guesses as to what they planned to do with it had they been successful in laying hands upon him.
Do you find it hard to refute somebody's argument? Do you not trust people to sort out good ideas from bad themselves? The solution to bad speech is more speech.
No, not really, and I'm not so sure about that. We don't live in a society of angels, and educating everyone to think in enlightened terms is a slow and uncertain process. I don't like any sort of state coercion and would prefer to see prison abolished too, but I do not fool myself that crime would cease to exist if I could force that outcome. Consider Gresham's law from economics, which holds that bad money drives out good; a small number of bad actors can undermine the reliability of the economy for all participants.
I must have missed the part in WW2 where the Axis surrendered because Hitler lost a debate with Churchill and all the concentration camps were shut down because the guards realised they were the bad guys.
* person A might say something outlandishly racist thinking it's obvious they can't be serious
* but person B actually thinks that's something perfectly reasonably to say and agrees
* which person A in turn misunderstands to mean person B is in on the joke
* 30 minutes later person A is wearing a Klansman hood in front of a burning cross and wondering how they ended up in that situation as it dawns on them they're surrounded by actual racists
I think this is how a lot of Internet trolls felt after the US elections. Some trolls just wanted to see the world burn (so Trump is the gift that keeps on giving) but many simply didn't think Trump could win (because surely people would just go for the unlikeable but safe establishment candidate rather than the obviously outrageous joke candidate).
The problem with maintaining that distinction in the digital age however is that it's very hard to tell apart the satire from the real thing. The joke works because the audience is in on it (e.g. rape jokes work because the audience and the joke teller have a mutual understanding that rape is horrible and joking about it is inappropriate -- the inappropriateness is what makes it funny) but the Internet is not the social equivalent of two friends talking smack in a pub or a comedian talking on stage in a comedy club, it's the equivalent of random people writing on a public wall or shouting in a massive crowd of strangers.
When talking to friends there's a mutual understanding of what is or isn't appropriate. There are numerous social cues that can indicate something deadpan is still meant as a joke. Even an actual misunderstanding may still come around as a funny story.
In a comedy club the environment sets the expectation. If the guy on stage calls to arms and rallies the crowd to overthrow the government the context makes it clear this isn't a serious attempt at rebellion but more likely a build up to some kind of punchline.
But on the Internet all that context is lost. Even a conversation between friends can still likely be viewed by thousands or millions of other people, so the words stand on their own.
It's obviously still important to have these private exchanges but it's also obviously not appropriate to have them in public spaces where every passer-by might become a participant.
The leading paragraph claims that "Trolls and right-wing activists conspire through pr0gramm, Discord and Twitter to 'turn the law against hate speech' into its opposite: a tool to denounce left-wingers, women and migrants."
In other words, the opposite of "hate speech" is "left-wingers, women and migrants" according to the writer of this article. This shows the incredible myopia of these activists who have been warned time and again that those same censorship laws they were pressing for and sponsoring could and would be turned around to target their own political platforms. Unfortunately they did not listen, often because they blocked or banned those who spoke out against censorship as racists, xenophobes and more.
Regrettably, no human system seems to avoid all violations of human rights. But the totalitarian systems will have the worst excesses.
Or they knew that perfectly well but considered it a small problem relative to the harm they seek to prevent. Everything comes with a downside risk, but pointing out the fact of its existence tells you little about the scale of its severity.
: http://blogs.faz.net/deus/2017/12/18/erdogans-mob-und-gruene.... (in German)
: http://www.faz.net/aktuell/gesellschaft/menschen/deutschland... (in German)
Links to Google's translation service appear broken for me, though.
> Eschew flamebait. Don't introduce flamewar topics unless you have something genuinely new to say. Avoid unrelated controversies and generic tangents.
E.g., if you're familiar with Steam, you might have noticed that a lot of games that are generally available in the EU are specifically not available in Germany.
What's their next research paper about? Water is indeed wet?