I'm wondering what the HN opinions are on this view. My guess is that the argument will be that this will stifle innovation. But Facebook has such clout that its algorithms have a very direct effect on people, up to even the security of a state. I hear Facebook was a major contributor to the ethnic cleansing of the Rohingya. Is algorithm deployment subject to (government) audit? Or are there alternatives to ensure an algorithm does nothing bad?
I don't think "algorithms" changes this at all. People who have bad intentions are guilty. People who ignore or bypass established safety protocols are reckless. But people who made a good-faith effort to do something legitimate and turned out to be mistaken? There's no real precedent for punishing those, and it doesn't seem fair to make a special exception just because technology is involved.
Yes, there is: if they failed to thoroughly consider the consequences of their actions, we would call it negligence.
In law this is called "mens rea", or "guilty mind", and it exists in a hierarchy: accidental, negligent, reckless, knowing, intentional. We punish people differently based on which level of guilt they have.
There's another axis that courts consider when deciding charges and sentences, called "actus reus", or "guilty act", and it's supposed to represent how responsible the person is for the harm rendered. This also exists on a spectrum, from "proximate cause" (you did the thing that immediately caused the harm) to "coincidence" (you stabbed a voodoo doll, and the person got hurt, with no chain of causality linking the two events).
I would argue that Facebook engineers were negligent with respect to, like, the fake news fiasco, though probably not very responsible, being enablers rather than actors. Same with the racial loan targeting scandal.
The depression experiment, I would call reckless and a proximate cause of whatever harm was done. There is no way in hell that a secret experiment to make people depressed by manipulating their news feeds would pass an IRB, and Facebook should have been punished for it.
It's a great example of why we don't allow victims, or relatives of victims, to mete out their own justice: because they don't care about distinctions that we, as a society, have decided are important.
Corporations should probably not enjoy that distinction in the same way that people do. The sooner we dispense with the harmful fiction of corporate personhood the better. Either way, let’s not conflate people and the law, with corporations and the law.
You might want to qualify this with the part of the world you're speaking about (especially given the article is on a German site). For instance, in the Netherlands, in a collision between a motorized vehicle and a non-motorized party (pedestrian or cyclist), the motorized vehicle is automatically at fault (for liability and insurance purposes, etc.).
How honest those efforts are is, in my opinion, worth questioning, but they are talking about it.
Yeah, people claim that. But does the claim hold water? I think Facebook was just used because it was a convenient way to communicate, and if they hadn't used Facebook they would have used any of the thousands of other ways to communicate.
Facebook was zero-rated with the state-owned PTT, Myanma Posts and Telecommunications. The majority of people cannot afford the kind of data plans people in the West take for granted. For many, their "internet" is largely confined to Facebook's walled garden. So no, they don't have "thousands of other ways" to communicate. Practically speaking, the "convenient" way also happens to be the only way.
Maybe do a bit of research on the issue before summarily dismissing it.
When it's the only way to access the internet in an affordable manner, Facebook becomes the de facto authority in control of what readers have access to.
Curiously FB's Free Basics pulled out of Myanmar recently:
Did Boeing contribute to 9/11?
I don't know whether that legally counts as corporate negligence, but we should consider it so. It is detrimental to humanity, and they have continued to ignore the immense responsibility the world now finds them holding. If they cannot very quickly act as responsible stewards of this immense power, they should not be entrusted with it. That is how government has always worked, and guess what: making infrastructure public so that it can be regulated, monitored and managed for the public good turns out to be a good thing. FB is social infrastructure, and it needs to either accept and act on that deep responsibility or stop trying to be social infrastructure.
Perhaps read the BSR Myanmar report and then read FB's own blog post where they agree with many of those findings in the BSR report.
Your comparison of Boeing and 9/11 is beyond absurd, even as a strawman.
Power needs to be distributed as much as possible, so that no single entity can do a large amount of damage and have a large amount of leverage. In other words, Facebook only has as much power as the people using it let them have. The people need to become acutely aware of this and act accordingly.
That's why societies have laws. So we can know what to do or not when faced with such questions instead of passing the burden to the individual under the guise that it's "freeing people".
Except the elected representatives are dumb as rocks. I'm sure they studied in depth how the cookie law would really work before voting on it. That's why the cookie law worked out so perfectly, right? Oh wait, it didn't do ANYTHING other than cost companies millions of euros and waste the time of the general public. It had ZERO benefits, and a CS student could have told you that, the way it was designed, it could never have any benefits. I guess the only benefit of the law is to show that lawmakers continue to be incompetent at technology.
>That's why societies have laws. So we can know what to do or not when faced with such questions instead of passing the burden to the individual under the guise that it's "freeing people".
Societies have laws so that the regular person thinks everyone plays by the same rules, but that's not true in practice. Laws are intentionally written to be vague and are not consistently enforced. This has led to a situation where there are so many laws that everyone breaks several every year, and as a result, if somebody higher up the chain doesn't like what you're doing, they can use the state to harass you.
If laws were truly about knowing what to do, then it would be impossible to graduate from mandatory education without having studied every law that will apply to you in most circumstances. Moreover, vague laws wouldn't be written and would be removed from legal codes. Laws that are only selectively enforced would be removed too, because laws can only be just if they apply to everyone consistently.
Yes, but laws can be devastating just as easily. In an age where the abuse and misuse of the ever-growing body of laws is so rampant, I cannot help but think that having even more badly written laws is not a good solution.
Legislation is at best a necessary evil: baggage, a clunky, heavy stone which you have to lug with you all day, every day. It serves a purpose, but you want to have as little of it as possible.
Indeed, the problem of coordinated action is so prevalent in any society that we have developed a host of tools to tackle it: morality and religion ("don't eat pork"), heuristics ("no free lunch"), and, yes, government (SEC, FAA, FDA, ...).
The adjacency to free speech may increase the downside risk. But if the alternative is between government intervention and what is essentially “before flying, check that the pilot does not appear drunk”, the former seems to be the obvious choice. If the industry or the public wish to avoid that fate, they would have to come up with something in the middle: self-restraint (ha), or possibly a private certification scheme like MPAA.
It's also notable that we do have examples very close to the issue of free speech that did not go down the slippery slope, namely public-airwaves television (where regulation is somewhat silly, but in no way restricts viewpoints, other than on sex being a healthy activity), and the aforementioned MPAA.
Perhaps we may not have the time, but I'm unconvinced it is so. There is also no need for every single person to change, just a critical mass and not necessarily drastically either. I propose that even some awareness of this issue in the majority would be enough.
It is also not unprecedented that the overall stance of society changes on some issue drastically over time. I'm thinking of ideas such as the abolition of slavery, women's suffrage, the acceptance of non-white people as equal members of society (yes, the change is not complete, but it is drastically different compared to a hundred years ago), the awareness of sexually transmitted diseases and so forth.
The awareness of the danger of letting any single corporation have too much power seems like a good addition to this class of ideas.
Sounds to me like you've accepted government tyranny as the default. You also make it sound as though government intervention fixes all the problems; it doesn't. You STILL have to check whether the pilot is drunk before the flight, and sometimes they are still drunk during the flight, even with all of the regulations in the industry.
And either you never fly or frequently lie: there's no chance they let you examine the pilot as a normal passenger on a commercial flight. Plus, being drunk was obviously just an example. Are you going to subject them to a three-day practical examination? And how are you going to provide vetted pilots to conduct those exams, without getting lost in the recursion of your little populist fantasy?
That doesn't mean you can only choose between "anything goes" and "here's a list of government-approved algorithms, everything else is illegal". But it does mean that instead of individuals developing tools and heuristics to protect themselves, the government should mandate the creation of such tools.
That's essentially what GDPR is about. Instead of having to find out for yourself how your data is being used, the company needs to tell you. Instead of having to write a crawler yourself to get your data back out, the company has to provide it to you. And so on.
The power of making the final decision still remains with the user, but the government's job is to ensure that the decision is as easy as possible.
I agree with almost everything you wrote in your comment (including this quoted paragraph), but still maintain it is paramount that power be decentralized as much as possible. Similarly, that doesn't mean the only choices are between "power is completely decentralized such that no entity has greater power than any other" and "power lies solely within one central entity". Instead, the power of any government, corporation or organization should be kept bounded, with strict, irremovable limitations. The exact method to achieve this goal remains somewhat elusive, but I think the largest prosperity happens in those instances when it was achieved.
The trouble with solely relying on governments is scope creep. Seemingly inevitably the government eventually gravitates towards regulating more and more, until it is effectively prescribing an allowed list of algorithms, burdening the whole ecosystem in the process. You mention GDPR, but in order for the argument to stay balanced, in the same breath you should also mention Articles 11 and 13 of the EU Copyright Directive.
I also consider it essential for individuals to take matters into their own hands in some degree. They absolutely should develop tools and heuristics that help achieve the goals they want. Should some of the burden be taken away from the individual, by codifying tried and true principles into law? Of course!
As a closing thought, perhaps the model of the huge, centralized social network is ultimately a deeply flawed one, and no amount of regulation can help it. Perhaps the end game, once the average of all individual desires has been codified into law, is that the traditional social network becomes economically non-viable. Centralizing all communication under a single large entity certainly sounds deeply wrong to me personally. And if it is so, the users will eventually realize this.
Except the government doesn't understand even the basics of how the internet works. And you want them to regulate the specifics of algorithms?
>That's essentially what GDPR is about. Instead of having to find out for yourself how your data is being used, the company needs to tell you. Instead of having to write a crawler yourself to get your data back out, the company has to provide it to you.
Here's an idea: DON'T GIVE OUT THE DATA IN THE FIRST PLACE. Ask for more controls over your data in browsers, because none of these rules are actually going to protect you. Yes, Google in Europe can't legally steal your data, but do you think a Chinese company cares? Of course not, because the EU does not have jurisdiction over it. All GDPR did was make European and above-board corporations and citizens worse off for some security theater; it didn't actually deal with the problem.
GDPR has also made using the web frustrating, with every single website now having a pop-up and some sites being outright blocked in the EU. GDPR is a terrible example of government intervention, because it shows that the people making decisions don't understand what they're making decisions about.
It’s very hard to “not give data out in the first place”. You don’t have to give it out, they’ll do anything they can to take it from you without asking.
I've recently had to email Airbnb to cancel my account because I don't want to be tracked by Facebook, and Airbnb puts that tracking on every page, logged in or not. Airbnb's view is that I have to create a Facebook account and then set privacy settings there. They refuse to honour Do Not Track headers.
Airbnb won't even let me log in to delete my account. Instead I have to email them and hope they'll actually delete my data and account.
When you have companies making things as difficult as possible for users and using sophisticated tax avoidance to crush local competition on price you need legislation with teeth.
Facebook the company needs to be changed drastically by strong external forces (maybe governments, though the effects could include some harm too). It will not do anything on its own while Mark Zuckerberg and Sheryl Sandberg are still at the company and/or have influence (which they will have for a long time... at least the CEO will).
I do rue the dominant narrative in much of HN and the mainstream media. Who is the more dangerous actor, FB or the NSA? My bet is on the NSA.
Is that supposed to be reassuring? Increasing shareholder value for a lot of companies means destroying or manipulating human lives.
From one perspective, as far as pervasive mass surveillance and exchange of surveillance information are concerned, the U.S. in the Five Eyes group has more influence and impact than Germany in the Fourteen Eyes group.
It seems most can only imagine a government-led punitive solution. I would like to see more efforts (they can be government-led) that educate users and encourage alternatives. You can't win going against what people want, and unfortunately many in tech circles don't recognize that FB is what the people want right now (it's not as "forced", and the users are not as "stupid", as many delusionally hope/believe).
PSAs, public education/curricula, grants, coalitions/communities, public-sector development, transparency requirements (without other sweeping data requirements for now), etc. are all reasonable compared to existing efforts of large-scale legislation/requirements plus fines. We can get where we want without large legislative interference in the internet that will never be rolled back. It just takes people, the tech and journalism communities particularly, to stop seeing FB as a nail and the legal hammer as the only tool in the shed. It's like everyone forgets the failures of historical law-based attempts to curb each generation's bogeyman (alcoholics, communists, druggies, terrorists, capitalists, etc.).
It's for courts to decide when it gets challenged, but the general idea is, the less obvious or expected the agreement is for normal usage, the less likely it'll hold. So burying some text that amounts to "we get to hack your device" is about as likely to hold as "we get your first born child" (which has been done multiple times by different companies, for fun, and no it doesn't hold).
I have come to believe, therefore, that if the subject of a contract is unlawful (in the Facebook case, information theft and violation of privacy), the contract can be nullified in favour of the citizen.
Unconscionability in German law may occur in a Knebelvertrag (oppressive contract), in contracts that abuse power differences between the contractual parties, or even in usury (immoral monetary loans, for example). In such cases the contract can become invalid.
In case of any disagreement, the spirit of a contract weighs more heavily than the written wording, if it can be determined.
A contract can also not place any part of the agreement outside the reach of a court (i.e., via arbitration clauses; you can still fight your agreement outside court, but you can't forbid someone from going to court). You can't waive most of your rights either, and even if you do, a court may determine that contract to be invalid, and the other party, having violated a right, will then be prosecuted.
Contracts themselves are largely a specific case of a legal transaction (negotium juridicum) and aren't specifically regulated (all of the above applies to legal transactions in general, ranging from contracts to last wills to verbal agreements).
It'd allow for public scrutiny of the algorithms that run our lives. It'd essentially mandate thorough vetting of how anonymous and/or biased the data being fed into these algorithms is.
Which would be interesting.
It's rather well known, so you should probably read up on it before espousing a strongly worded opinion weakly connected to the current state of the art.
The only way we can assure that cryptographic algorithms are in fact secure is to subject them to public scrutiny over many years.
Perhaps the only way to ensure that machine learning algorithms are not racist, harmful or biased is to have access to the algorithm itself, the trained model, and the data that created it.
Algorithms can be biased. Data can be biased. Trained models can be biased.
We can't trust the government or companies to prove that a cryptographic algorithm is secure and we can't trust the government or companies to prove that an algorithm is unbiased and unharmful.
They intentionally use children to steal their parents' money.
I think Facebook (company) is lucky with Instagram because that's where the people are going right now.
Though I have to say that, at least in its current form, Instagram's ability to spread crap is limited.
That proved to be untrue:
> But two new analyses of the Russian online propaganda campaign by the Internet Research Agency reveal that this view of Instagram was as rose-colored as, well, an artistically filtered Instagram post.
> “Instagram was perhaps the most effective platform for the Internet Research Agency,” states the report by New Knowledge, an American cybersecurity firm which analyzed data sets from Facebook, YouTube and Twitter.
> During the period studied by the report’s authors, IRA posts on Instagram garnered more than twice as many engagements (such as likes or comments) as IRA posts on Facebook – 187m on Instagram vs 77m on Facebook – despite the fact that Facebook offers many more ways for users to interact with content, and Instagram has no native “sharing” button to promote virality.
All the refrains that try to excuse/explain corporate misbehavior by claiming that the only duty of a corporation is to create value for its shareholders are only tempting a massive regulatory response in the future.
If capitalist ethics are used to justify socially unethical and immoral behavior, don't be surprised when democracies start to turn against capitalism...
The lesson here is that you can't trust large corporations to act ethically, and if you can't trust them, you need to regulate them.
(That a clothing company rather insensitively chose Banana Republic as a trade name sort of proves that you can't legislate against bad taste.)
The "voting with your wallet" logic doesn't work very well with operations at the size and impact of Facebook. Zeit may be dependent on Facebook for profit. That's why some people ask for regulations. Companies should not be expected to act as moral agents.
You should also know that journalists writing for a paper have little to do with the business strategies of the owners.
If the original title includes the name of the site, please take it out, because the site name will be displayed after the link.
I'd say that applies to author names, too. Besides, I think she's written the article in her capacity as minister of consumer protection, the fact that she's also simultaneously minister of justice isn't all that relevant.
For the love of god, enough already. This is mentioned on almost every Facebook story we get on Hacker News. It's becoming a meme! Look, barely 15 hours ago it was mentioned on another story: https://news.ycombinator.com/item?id=19004067
Can we please elevate the discussion around Facebook and stop mindlessly citing something Zuckerberg said when he was barely an adult? If you want to substantively critique Facebook, by all means do so. There's plenty of material to work with. You don't need to dig into the private conversations a college student had over a decade ago.
Please read and internalize this comment: https://news.ycombinator.com/item?id=19005009
If he has, then we might indeed be able to write this off as a folly of youth.
If he hasn't, and he knows how it keeps being brought up, it means something different.
What kind of person would refuse to give refunds when children get scammed out of money by games designed by professionals from the gambling industry to confuse and manipulate children? Somebody who thinks their users are "dumbfucks", that's who.
It's also inane to keep citing it, because there is nothing new to the comment. It brings absolutely nothing new to the discussion that any of us can dig into. It's just an opportunity to fire off yet another short, low effort comment about Facebook.
What drives me mad is that you don't need to cite this stupid quote to convince people Facebook has serious privacy issues. There's so much better material to choose from - more recent, more extensively documented, involving more people...
Yeah, it is.
Looking at the original goals and intentions that FB was founded around is indeed quite relevant to understanding their current actions.
Facebook was not founded around principles written in a private conversation. And even if it was, those principles are not a suitable lens for understanding the massive enterprise it has become. Would you like any company you found in perpetuity to be criticized based on a private chat you had on AIM two decades ago?
Judge Facebook's current issues according to substantive evidence we have in the present day. It's not like we're wanting for it! There's plenty of it, we don't need to rehash the words of a teenager who is now in their thirties every time their company pops up on HN!
I don't get why this is so hard to understand. You'd think I'm defending Facebook here. I'm not! I'm telling people to stop injecting noise into discussions about Facebook's legitimate privacy concerns.
Yeah, because that's relevant information, and that's how context works. We see this in courts of law. Why can't we look at history and consider the present at the same time?
When people invoke this quote they never examine the context. There's every possibility that this comment was self-deprecating as much as mocking his users, but that possibility is never explored.
People just deliriously portray him as this deeply odious Dr Evil-like villain in the making, and can't actually imagine a goofy 19-year-old saying goofy 19-year-old shit.
Facebook's "current challenges" are all rooted in their unethical CEO's psychopathic disregard for other people. Why was Facebook so careless(fraudulent) when they mislead news organizations about video engagement numbers, leading many news organizations to invest heavily in video production and consequently get cratered? Because Zuckerberg has little regard for other people.
That quote happens to be one time he said it about himself, but he's proved it about himself numerous times in the subsequent years. A fish rots from the head; the trouble Facebook constantly gets into comes from Zuckerberg and the culture he has instilled in the rest of the organization. It's the same story every time because the root of the problem is still there, unaddressed.
All the problems described in this thread could be solved with laws from 25 years ago.
Facebook stealing call and contact data? No, they can't do that. No, GDPR was not needed. No, putting a phrase in a ToS that nobody reads is not enough to save a company from being destroyed in court in case of serious wrongdoing.
Friendly fraud? Also a crime, just use the existing laws.
Calling the users dumb fucks? Not a crime.
Manipulating users' emotions or testing algorithms on users? Also not a crime, and I don't see why it should be.
Of course, it's much easier for the judicial system not to bother applying the existing laws to those big companies. Why risk getting attention from such powerful people? And after all, if they were to apply the existing laws, the poor politicians would have far fewer fake crises to work with in order to expand the role of government and make money.
This is how the whole government thing works. Proven since the beginning of time. Trust them even less than Facebook.
And now I have it in several countries and various political systems. Some of which are totally incompatible with the one I was raised in.
All you need is consensus. And it is easy to get. No matter if it's an ethnostate, a communist state, one with a supreme ruler or prince able to override the legislature, or one of the representative republics elevated as bastions of capitalist society, you name it.
It is so surprising and entertaining how the populations of these countries get caught up in their political systems in the least effective way. The key distinction is where they spend their energy: not a little of it, not half of it, but ALL of their energy in politics goes to alienating themselves and everyone else in support of the fringes of their party that have no consensus, and none of it goes to the parts that do have consensus. So they get totally steamrolled by power-hungry people and wonder "what happened". They are the ones being played here.
Downvoting doesn't change that or offer another perspective.
Even if it's true that these are illegal, you're not going to be able to charge someone with a "crime" ex post facto. At most, you might see a lawsuit that results in a settlement for a small number of affected individuals, which Facebook's legal team will already have accounted for in the decision to implement these policies.
We do need new laws here. The fact that Facebook feels so free to do these things means that they are unafraid of current law, probably with good reason.
The reason FB and big companies in general are unafraid of current laws is that they know the laws won't be applied. It's a big hassle to punish the rich and the corporations, because people working in the judicial system just don't want to make big enemies. It's also a pain because there are two thousand layers of limited liability to uncover before the responsible people can be put in jail. It's much easier to focus all the energy on some poor guy selling weed, or a small business not submitting the right form at the right time. Punishing them takes zero effort, they can't defend themselves properly, and their punishment justifies the work of the judicial system.
Elected politicians should be the ones keeping the judicial system in check, but they have no incentive to push it to punish rich people FIRST.
In the end, by showing support for "more laws" the people only get more laws, which cost taxpayers more money and make life difficult for small businesses, while big businesses won't care and will even be advantaged by them. All this aside from the fact that it doesn't make sense to make another law when there is already a law.
Facebook is a private company. It will do its best for its shareholders.
And as best as I can tell, what this German minister of "justice" and a segment of the political and business elites are complaining about is that they don't get to control Facebook. These authoritarians want to control what people read, what people say and how they think. No different from the Saudis or the Chinese or the Russians complaining about Facebook.
If you don't like Facebook, stop using it. If Germans don't like Facebook, create a German version of Facebook. Frankly, the Germans should be more worried about Nazi/Stasi-esque German domestic spying. It's strange how the people complaining the most about Facebook and its privacy violations/spying are also the leaders in spying on their own citizens. I doubt even China spies on its own citizens as much as the British and the Germans do.
That's moving the goalpost to a subjective location. But if you never saw intense criticism of Zuckerberg and his company on HN before the 2016 election outcome, then you just weren't paying attention. The "dumbfucks" comment was what, a decade ago?