I get that people want to see a company crippled, but wiping out a quarter of its revenue is a death blow.
I don’t use Facebook at all, but I followed the CA scandal because I thought it was an interesting case, one that Facebook predictably botched in nearly every aspect of its handling.
$5 billion is a lot of money. It is not and never can be considered “chump change”. It is irrational to think a Board or management team isn’t going to seriously implement rigorous changes to avoid a $5 billion fine.
This is where someone says that if you can violate the law and earn $6 billion and only be fined $5 billion you will violate the law every time.
There are numerous problems with this statement. One, you don’t know ahead of time what the fine will be. Two, it assumes there’s no way to earn even $1 of the $6 billion without violating the law.
If you can earn $3 billion without violating the law and double it to $6 billion with the violation, and the fine is $5 billion, then the violation is not “worth it”. And even then, this fails to account for the tremendous reputational and legal risks.
One parting thought. If the behavior which is being fined was directly used to obtain a monopoly position in the marketplace which the company continues to benefit from, that changes the calculus. But was CA’s ability to scrape social network data from Facebook’s API a proximate cause of Facebook’s market position?
Nobody is interested in giving FB a warning. If you ask me, prison time for execs in addition to at least a $10B yearly fine makes sense. They intentionally did things that harmed millions of people. The idea is not just to make them think twice before they do something like that again, but to make the message loud and clear: "that move is a fatal move". Not just for FB but for all others who make money out of user surveillance (Snapchat, LinkedIn, Google, etc.). We don't want to give them a risk they can calculate and evaluate, we want to give them a risk as clear as death, a risk no company will even bring up in any meeting.
This was not a secret agreement with CA. This was CA using a publicly documented API which was returning the data it was designed to return, but CA using and retaining the data improperly.
Later Facebook decided to significantly restrict the data that apps can pull out of their system (arguably making the system more closed and harder to compete with, but protecting that data from misuse).
They exposed too much of the social graph through their developer APIs. I can see an alternate universe where they are fined for exposing too little.
Like I said, I don’t use Facebook so I don’t care much about them. I don’t think they add a lot of value in the world, but their billion plus users must feel differently.
But I don’t understand why, on the basis of Cambridge Analytica, the company should be fined out of existence.
You can hate Facebook, and you can not use it if you do hate it. But there are a billion plus people who apparently don’t hate Facebook and might be damaged by Facebook being dismantled by the government.
So you can "not use Facebook", but you can't not BE used by Facebook. Which means you also can't dismiss them as a choice. They are imposing themselves on you, whether you like them or not.
If a billion-plus users use/like FB, it means you are surrounded by them _everywhere_ you go! It's like saying I'm fine living in a mental asylum because I'm not the crazy one there!
We all _are_ FB users whether we have an account or not!
They trick users into giving them their email password, then use that password to log in and steal all your contacts.
That one should probably be criminally charged.
I think it’s important to be rigorous and even handed in calculating these fines. Part of that is identifying precisely what action(s) are being called out, and how the fine is calculated. I disagree with many of the EU fines, but I do think they are fairly methodical about how they calculate the amounts. ‘We don’t like you, and you do lots of other bad stuff’ is not a valid methodology.
Repeat offender certainly is justification for multiplying the fine, but at this point in this particular case that’s jumping the gun.
In the case of verifying a user’s email by asking them for their password, that seems to have been an egregious mistake that was quickly rectified once it was called out. And although they certainly could have used that access to pilfer contacts (and a whole host of other personal info), I don’t believe anyone has claimed they actually did, and they said definitively that they did not. For what that’s worth.
As humans we are certainly willing to give others extra chances to do right by people, but at some point people realize that the bad actors will never rectify their behavior and should be stopped. I'd personally vote for jail time for executives.
I'm with you that it shouldn't kill them, but it needs to really hurt so they take better care (and not do illegal stuff in the future). I don't think something like this does. It's annoying, yes.
Consider a similar scenario where a food or drug manufacturer doesn't check their raw materials and poisons a bunch of people. Should their supplier make sure that they don't sell them poisonous stuff? Absolutely. Do they themselves bear the responsibility to make sure that nothing bad happens? I believe so, so by skipping their own checks to save a few bucks, it's their fault. And you can be quite sure that in such a case, there won't just be a small fine - there will be a fine, and production halt with a lengthy and expensive re-certification, because the processes they had in place are obviously flawed. Yeah, FB doesn't produce food, so those regulations don't apply to them. I don't really subscribe to the theory, but there are many who believe that they have the power to sway elections - that certainly puts them in the same ballpark as the company that provides your aspirin or pizza sauce.
Risk takers and driven people aren't thinking about punishment. By definition. But that drive can be channeled in positive directions. Punishment is really a pretty hopeless method of doing it.
It's why bombing random goat herders for 20 years hasn't reduced the number of people who want to blow themselves and others up.
The focus on punishment is a distraction. It just makes everyone involved channel their energy into playing defense and that delays the real fixes.
Focus should be on getting them to clean up the mess, rather than sending them to jail. Social media and the news media need a whole lot of re-architecting. And the focus really should be on what that new architecture needs to be.
That's why for FB the downfall is not yet really visible - it might take another 12 months.
On a side note: big internet companies will never enjoy the same level of political support. They don't support any jobs and don't pay any tax where they get their money from. It's a bunch of nerds on the West Coast, and that's barely caricaturing. If anything, the bigger fines will come from Europe rather than Washington, because politics.
Instead, you need to:
1) hold the C-suite and the Board accountable for the crime(s)
2) eliminate all their compensation from the years they were committing the crime
3) force the company to replace the entire C-suite
Fining someone like Zuckerberg and letting him retain control is like finding out your babysitter is giving your child alcohol, fining them, and then letting them continue watching your child. It makes no sense in any other context.
You think they don't have water cooler conversations that include statements like, "but we can work anywhere after working here."
What harm will come to all those employed accomplices to these crimes against humanity?
What do we think would change if "Facebook" as a company went bankrupt? Certainly, the website would continue operating - it's profitable. The technology is still worth whatever it's worth. I have a hard time imagining that a purely financial threat to the corporate entity changes the product too much.
What it does change is the incentives to the capital involved. It threatens equity. If we all agree that the Facebook "tech" is morally neutral(ish), then the purposes it's being put to are what we should challenge. Breaking the back of the company's financials seems like a reasonable way to do that.
If the company that operates those servers gets shut down, I don't see how the group that takes over the business (whoever it is) shuts down the servers. Maybe everyone freaks out and Facebook loses a lot of people, maybe we don't get updates for a bit. Those seem possible, probable even. But things that make money don't get taken apart in bankruptcy.
I could be wrong! But I don't know of any historic examples where things like that happen.
Yeah, so? Smaller companies get death-blow fines all the time, why shouldn't big sharks get the same?
Well, seeing that the board is powerless since Zuckerberg has a controlling interest in FB....
No, it's not. $15 billion is less than 3% of Facebook's market cap. It's less than the swing in Facebook's value after market close on Wednesday. Also probably less than the value they got through violating their consent decree.
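A quick sanity check on that 3% figure. The market-cap threshold below is derived arithmetic, not a quoted number; treat Facebook's exact market cap at the time as an assumption.

```python
# $15B is under 3% of market cap exactly when market cap exceeds $500B.
fine = 15e9
market_cap_threshold = fine / 0.03  # roughly $500B

assert market_cap_threshold > 499e9
assert 15 / 500 == 0.03  # $15B out of $500B is exactly 3%
```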
Look at it from the perspective of existing investors: They would be forced to buy the new shares Fb is issuing in order to maintain their respective percentage of shares and, thus, their expected future dividends. There's nothing to gain from that – in fact, their investment's expectation value just goes down as they now need to pay an additional price for the same expected future return. Needless to say, shareholders don't value such a move, so any CEO trying to pull off such a thing would likely be removed immediately.
What about not getting shut down for failing to pay a fine?
But even in these situations, I think a case can be made for why refinancing the company by having it take a loan is still more attractive. While from the perspective of an existing shareholder it will reduce future dividends in a similar way as new stock, there will at least be no opportunity costs. (If the company issues new stock, the investor has to consider what else they could do with their money instead of throwing fresh money at the company.)
So is a lengthy prison sentence for individuals. Should we get rid of those to make it fair?
> There are numerous problems with this statement. One, you don’t know ahead of time what the fine will be.
You can take an educated guess going off of what happened to you and others on previous occasions. If there are no examples of a fine that is "a death blow", your risk of receiving one is low. Hell, you might not even get caught. It's hard to tell how much they get away with because there's no leak or whistleblower.
I'm pretty sure that the reason FB isn't putting bogus charges on users' credit cards isn't that they wouldn't because of ethical reasons. It's that a) they'd likely get caught and b) they'd get into a lot of trouble.
Companies aren't sex offenders or committing crimes of passion, they are much closer to rational actors, deterrence works great on them.
Secondly, FB might amplify our flaws but this is a structural issue contingent on the very design of the newsfeed algorithm. Instead of blaming ourselves and embarking on a nationwide self improvement campaign we can think about minor design changes that could perhaps alleviate these effects.
The fact that the mentality has changed from "let's improve ourselves" to "let's legislate and limit" is part of what makes this debate so scary; this is not a new opinion for me however, I certainly tend to fall on the side of self-empowerment and positive (as opposed to negative) liberty in most cases and see facebook as a lesser of MANY evils of which the population/media has neglected. (Credit card data exchanges, NSA bulk collection, at&t room 641a, equifax, etc. This is far from the first time I've made a similar ramble to this)
Especially given that I tend to agree with the OP that it's a manifestation of our "lizard brain" that's actually the vulnerability here, and that govt. doesn't have a great history of managing these sorts of things (Prohibition, sex work laws, drug war), I'm very reluctant to give them more of a hand in controlling the people.
ESPECIALLY given that, frankly, the way this is being legislated will leave Facebook with exclusive control over their graph, as opposed to democratizing it and making people aware of it. Even GDPR, for instance, will allow FB to compute aggregate and trend statistics on the graph data, which will be sufficiently anonymized while preserving the bulk of the "insight" data they and only they gleaned.
I said a lot of things here and probably undermined my own point with some of them, but broadly, I think there's both evidence that it is a scapegoating, and that there are major pitfalls in how we're trying to address it.
I think Facebook demonstrates to us just how much we are slaves to psychological manipulation. To me it all boils down to the same fundamental human flaw, which is addiction. Corporations have been taking advantage of alarmingly simple ways to lure us into consuming all sorts of products (tangible or not) for a long time now.
Since this has been going on for so long, I think another obstacle is that in acknowledging this flaw in our behavior, we will also be forced to confront other flaws of our society that are entangled within.
It's easier for people to blame a non-human entity or the Zuck(..?), than to confront that they need to make changes that will affect their/our lifestyle.
We all need to stop outsourcing the blame and make the individual efforts to change ourselves instead of wasting energy fighting this perpetual battle. We each have little to no control over the former and all the control over the latter. Wake up.
Well, part of action to change yourself is to change your environment, so that you don't have bad habits within easy reach.
A huge fine to take down Facebook helps with that, and sends a strong signal to other social media takers...
It's not like "change yourself" only has to come from within you, while third parties are allowed to push lures straight in your face all the time...
Smoking wasn't stopped by just people smoking less, but also by big fines for tobacco, extra taxes, and so on.
One large issue with these particular "addictions" is that we are hooked on them without being aware of their addictive nature. Whether or not the intentions were malicious is certainly something that needs to be addressed, but the obstacle remains: we have to individually endure the difficulty of withdrawal to break out of the cycle.
And that really is the case, it's withdrawal and withdrawal sucks. A lot of us already live demanding lives, experience depression/anxiety on varying but significant scales and don't really have the energy to devote toward the will-power that is required when distancing oneself from an addictive behavior. And it does require energy.
Of course preventing this on a large scale is a necessary goal, but we can make instantaneous changes now. Baby steps. I didn't delete my Facebook but I blocked the news-feed, uninstalled the app and hardly ever use it other than to see if people are trying to contact me.
The way to punish Facebook is for users to stop using it; that most people don't, tells us a lot. Follow the data and behavior, not the speech and statements.
If Facebook died we would all land on another platform, it's just very hard to move people over iteratively. The government could push that transition along, if it wished.
Our true shadow is not what Facebook exposes. Rather it collects data and exposes a subset which is in fact the anti-shadow.
YOU choose what data to expose on facebook. The act of that data being exposed has effects that are tangible.
If you've read Foucault he describes a similar effect during the Victorian era on human sexuality -- the idea that by exposing something and quantifying it -- you in fact put a sort of control on it. When it was hidden, no one knew what was normal -- so they used their intuitions and personal judgement.
It doesn't share everything -- because it can't possibly KNOW you (or your shadow) -- people are incredibly smart and complex -- as you point out. But what it does do is expose a superficial subset of you that in turn changes your behavior and the perception of your behavior.
Jung's whole point is that the shadow is an intrinsic aspect of any social society -- something MUST be repressed in order for us to not be completely individualistic psychopaths (like the snakes we evolved from) ... simply exposing this duality doesn't just "fix it" ... instead society needs to function with the knowledge that these things exist. Jung's point is that the individual should not go into psychosis trying to balance dualistic morals... and society shouldn't be so harsh in judging morals (NOT that everything in the shadow should be made public, or even that the shadow is a bug rather than a feature).
Yet, to keep the pernicious effects of those flaws from being amplified by that tool at scale, it needs to be kept in check. Smoking habit may be a reflection of poor self-control on the part of the smokers, but that is no defense against Philip Morris fines.
Also Philip Morris (at least the parent company) is now called Altria
This seems to call for a balance of perspectives and responsibilities. Partly similar is probably the case of traffic safety: driver's vigilance and responsibility has to be matched with sound regulation of all transportation profiteers.
People are only smart some of the time, everybody can't be smart all of the time. If I followed you around for a week I'm sure I could find multiple instances of you doing something the audience would consider 'very stupid'.
>facebook is just another tool to expose our animalistic flaws
Right, much like heroin exposes our animalistic flaws of addiction, we don't allow companies to profit from it by using it as a tool.
I wouldn't mind beheading and burning this particular scapegoat on the altar of privacy, though, if it helps us ponder a little more about our own self-contradictory nature. If scapegoating is inevitable, I would much rather live in a society that uses legal fictions such as corporations as its scapegoats than one that beheads and burns actual human beings.
I have always thought, for example, that here in Mexico, traffic fines should be a % of the "yellow book value" of your car. Say, for running a red light, you get 5%. If you have a $1600 VW bug, then 5% of that will definitely hurt you. Similarly, if you have a $256,000 Ferrari, 5% will still be meaningful for you.
Otherwise, fines end up being just "the price of making business" or the price of speeding for wealthy people.
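The proportional scheme described above is simple enough to sketch. The 5% rate and the two car values are just the figures from the example, not actual Mexican traffic law:

```python
def red_light_fine(car_value: float, rate: float = 0.05) -> float:
    """Fine as a fixed percentage of the vehicle's book value."""
    return car_value * rate

# The two cars from the example:
assert red_light_fine(1_600) == 80.0         # VW bug pays $80
assert red_light_fine(256_000) == 12_800.0   # Ferrari pays $12,800
```

Same offense, same rate, but the fine scales with what the driver can plausibly afford.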
Exceed some small demerit points limit over a defined time period and you can be ordered into driver school or have your license suspended.
With the points system, a "rich dude" who sees a $500 red light fine as pocket change can't simply run red lights as long as they see fit. If the monetary fine is not itself a deterrent, the demerit points then become the deterrent.
Or maybe we could also hand down prison sentences like that. If you're young and steal a candy bar we can send you to jail proportionally longer than if you're old.
Of course none of this makes sense, and fines based on your income is a terrible, awful idea.
Fines are meant as a deterrent. Flat fines are either devastating for poor people or meaningless to rich people, or both.
Fines that scale with income or wealth allow reasonable deterrents to rich and poor people alike.
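Some jurisdictions (Finland is the usual example) implement exactly this as "day fines": the severity of the offense sets a number of day-units, and the offender's income sets the price of each unit. A simplified sketch, not any real statute's formula:

```python
def day_fine(daily_disposable_income: float, day_units: int) -> float:
    """Fine = severity (in day-units) times the offender's daily disposable income."""
    return day_units * daily_disposable_income

# Same offense (12 day-units), very different incomes:
low_earner = day_fine(30.0, 12)     # $360
high_earner = day_fine(3_000.0, 12) # $36,000
```

Both offenders lose the same number of days' disposable income, so the sting is comparable even though the dollar amounts differ by two orders of magnitude.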
No they're not. They're meant as a price. If they were meant as an unconditional deterrent then the penalty for everything should be death plus all of your assets.
That would maximize deterrence (and require us to be a lot more careful about what we pass laws against), but that's not what we're really after.
What we're trying to do is to keep the bad thing at a manageable level, which is exactly what proportionate penalties do, especially fines.
There are only a few crimes where the most important thing is deterrence, but those are the things we don't use fines for to begin with. For example, there is no fine for murder, the penalty is death, or life in prison, because we really are out to maximize deterrence there.
For everything else, if the fine is set appropriately then the cost to society of someone violating the law is less than the amount of the fine times the probability of being caught. If you then want to pay the price to do the thing, great -- we'll take your money and use it to save some lives somewhere else or do some other socially beneficial thing, and since you're paying more than the cost of the damage you're doing, everybody comes out ahead.
And if the fine isn't high enough to pay for the damage being done then it should be higher for everyone.
Because things cost money and if you want indulgences you have to pay for them.
The only way it's a problem is when the value is set wrong, so that you're only paying $100 but causing $1000 in damage. But that's not the fault of the premise of pricing damage, it's the general problem with corruption or government inefficiency. And the alternative is that the corruption causes you to be able to do $1000 in damage and pay nothing because the whole thing is swept under the table -- at least this way it's happening in the open and the public can evaluate whether the price is appropriate.
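The pricing logic above can be made explicit: if violations are only caught with probability p, the fine has to be at least damage/p for the expected payment to cover the damage. Illustrative numbers only:

```python
def minimum_fine(damage: float, p_caught: float) -> float:
    """Smallest fine whose expected value covers the damage caused."""
    return damage / p_caught

# If each violation causes $1,000 of damage but only half are caught,
# a correctly priced fine must be at least $2,000:
assert minimum_fine(1_000.0, 0.5) == 2_000.0

# The $100 fine / $1,000 damage mismatch in the comment above is a
# 10x underpricing even with perfect enforcement:
assert minimum_fine(1_000.0, 1.0) == 1_000.0
```

This is why the detection probability matters as much as the fine amount: halve the odds of getting caught and you have to double the fine to keep the same deterrent price.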
What if the harm to society of a rich person breaking a law is more than the harm of a poor person breaking that law? The positive actions a person does may be magnified by wealth; why wouldn't the negative actions be?
It depends on the state. In Missouri, 2nd degree murder has a minimum of 10 years.
Don't we do that somewhat based on income, only it's reversed? If you're poor, you get a harsher sentence - we might as well make it official.
Unfortunately, they aren't actually looking at your assets and income, but make a guess, and if it's too high, you can provide proof that it's not accurate. It's usually far too low when you're in the top 5%.
Wisdom of the policy aside, why is it implemented in such a seemingly insane way? Why not just look at the equivalent of last year's W2? While not perfectly representative of steady-state income, it's what many other income-based systems do and you can easily have a petition process for claiming that that year's income is unrepresentative.
Publicly traded companies are supposed to be responsive to shareholder concerns, and shareholders would be terrified of penalties denominated in stock. Pittance dollar bill fines are unlikely to ever work because finding a "just right" size is too difficult.
What's clear is that they are blaming Facebook for political outcomes that diverge from the mainstream, and are conflating that with all that is negative and upsetting in the zeitgeist.
In reality, any industry populated by humans (perhaps excluding those with extreme barriers to entry) is going to have people on most points of the spectrums of stupidity/intelligence and honesty/dishonesty. There's no bar that means you have to be particularly smart or intellectually honest to be a journalist.
The upshot of this is that when you see issues that systemically affect an entire industry, you can expect to see this bias to a pretty blatant degree. A banal example is that of Google Reader: its loss was perhaps worth bemoaning, but the sustained hysteria from the media was insane (hilariously, this was around the same time they were pronouncing Google+ as dead, despite it having a multiple of in-stream DAU compared to Google Reader).
So, to finally get to my point: the upheavals that the media industry has been going through the past decade are (fairly) blamed on tech and (perhaps less fairly) personified by large tech companies. That doesn't mean dismissing any criticism of tech, as there are plenty of valid criticisms, but reading coverage critically is always valuable, and particularly so on topics where they have an incentive (emotional or otherwise) to dissemble.
Or even statistically: the claims of both pro-Democrat and pro-corporate bias in large media are IMO exaggerated but are also likely true to at least some degree: they're an unavoidable consequence of the industry being heavily skewed towards people identifying as Democrats and towards entities working for, or being, large corporations.
And that can be done easily. The reason it's impossible to compete against Facebook is the network effect. Kill the network effect. If you have a site with more than, let's say, 100 million active users, you are required to license free user-generated content (such as posts) under a copyleft license, unless the user opts in to a restrictive license on a per-content-piece basis. All content, opted in or out, must regardless be treated the same by the site - no incentivizing people into opting in by effectively punishing those that don't.
And furthermore require that the site provides both an API for accessing all published content, as well as remotely publishing by users. You've now not only 'punished' facebook, but really fixed a huge chunk of the entire tech industry's problems by killing the network effect. Now competitors can create their own alternatives that users from e.g. Facebook can use and even keep and interact with all of their friends on. With no network effect, it's now simply a competition to see who can provide the best user experience.
Now contrast the messy science of climate change with very controllable A/B testable situation with privacy violations and echo chamber amplification and large scale social engineering that is happening with Twitter Facebook etc. I would argue that they know exactly what they are doing and choose to continue only because... profit.
Fines? Come on. Were there justice we'd have people shot dead in the square then buried unmourned and in unmarked graves for what they did to us as a society and as a race. Continuing to assess fines as the magnitude of their betrayal of society and of the human race would be getting off light.
Because here's the thing: if they weren't sure in the seventies they were damned well sure by the eighties and they're damned well sure now and they still fight to put profits over the future of the human race. They are the counterpoint to the ghastly unfettered capitalism that some people still think is a good idea and they're going to end billions of humans because of it.
"Not all oil companies"? Get out with that. There is no exclusion from this betrayal of the human race and there is no oil company in existence today not blighting us by their continued operation. If some hypothetical oil company doesn't want to be strung up? I dunno. How about 100% renewable energy by 2035--which is probably still too late, because we are screaming over the cliff, but hey!--and externally-validated commitments to that, with failure to meet those commitments resulting in revocation of their corporate charter, forcible seizure of all assets, and jail sentences long enough to see the sea rise they caused for their executive tier? That seems equitable to me.
Sigh. A violated society should act, but we are passive.
I wish one day real journalists will look into how the "anti-facebook" media blitz happened. I've never used facebook and I hate the idea of facebook, but the politically motivated attacks that seem to stem from the highest level of governments ( across the atlantic as well ) and amongst the media need real investigating.
At this point, it isn't objective reporting, it's ideological advocacy grounded in political motivation. From Zuckerberg and Sandberg being accused of being antisemites (when they themselves are Jews) to Facebook being blamed for everything under the sun, one has to wonder what the hell is going on. There definitely is a story here, but it isn't Facebook, it's the media. I doubt Mao's propaganda against the bourgeoisie was as relentless.
Also, facebook and news companies are both in the same business - the ad selling business which is the cause of the privacy issue.
But that's not my concern. My concern is the obvious politically motivated move against one company and one industry by politicians, government agencies and news companies. When people start calling Zuckerberg and Sandberg (who was hailed as a female hero by the media for many years until recently) antisemites, you have to start asking questions. No?
Don't you think it is odd that news companies ( who are supposed to be objective and report the news ) are advocating for fines? Shouldn't a news company simply report the fine rather than demanding more fine? Where's the objectivity? Shouldn't journalists report what is happening rather than advocating for something to happen?
I'm for reporting on facebook and privacy. It's an important issue for sure. What I also want is reporting on the reporting itself because that seems to be an even bigger story. I may not be the sharpest tool in the shed, but even I can see that there is more than reporting and news involved here.
And it's hard to take the news concerns about privacy seriously when they are all demanding preferential treatment on facebook's platform. Why don't the nytimes close their facebook account? Why don't they pull their headlines from facebook rather than demanding facebook prominently display their headlines and give their headlines top billing? When their words and actions don't align, it makes me wonder. I'm always wondering.
You make Facebook and all social media content fall under the telecommunications act. This would mean that FB would lose its network effect (other parties would be able to provide services that interact with users on FB, just like telephone users can call each other even when they are on different networks).
In general, do things that aren't cost effective or efficient for companies that haven't misbehaved but make a very effective combination of punishment and reform.
But that's assuming the goal is actually reform rather than just lining government coffers.
I mean, here's the thing. If you spit on the sidewalk or look funny at a cop, you can spend months in jail. Facebook is doing some bad stuff to a whole lot of people, all the time, flouting the law, without remorse and repeatedly. There is absolutely no reason why considerations like "we must make sure Facebook survives as a going, profitable entity" should enter into punishment considerations, any more than "we should make sure this guy's life isn't harmed" enters into our punishment considerations for some guy walking down the street with two joints in his pocket.
Cost of doing business for large international corps.
An actual decentralized network has different requirements and can thereby attract users by having different characteristic advantages.
For example, one of the large and growing problems with centralized networks is that everybody leans on them to do censorship, and they have insufficient stake in protecting individual speakers, even against unreasonable or blatantly fraudulent censorship requests.
If you make it trivially easy and inexpensive for anyone to self-host then the person making the censorship decision is the person whose content it is to begin with, which brings back the proper alignment of incentives regarding cost of removing content vs. liability for hosting it.
And that's just one advantage. Another is you control your data instead of Facebook and it's only you and your own friends who have access to it, not some corporation using it to run adversarial machine learning against you. Another is that your data is local and is therefore faster and available in places with poor internet. Another is that you have access to it yourself and can make modifications to both code and data, and then you get improvements from everyone instead of being restricted to a single company with a conflict of interest.
These kinds of things add up, but first you need somebody to spend the resources to develop it into something that non-experts can easily use and understand. Which is already happening, slowly, but would happen much faster with those kinds of resources.
Unfortunately I don't see any other way for those thugs (FB) to come clean.
More realistically, the board and executives should be personally liable for a fraction of their wealth. Fine Zuckerberg and Sandberg 10% of their hoards, and things will change.
What they want is to line their own pockets, they do that by fining them in a way the stupid uneducated public supports.
The real way to 'punish' Facebook would be to hold Zuck directly accountable, i.e. he personally faces jail time unless he does x, y and z, etc.