If a $5B Fine Is Chump Change, How Do You Punish Facebook? (nytimes.com)
97 points by pseudolus 81 days ago | 126 comments

> What does a good, meaningful fine actually look like? Would it need to wipe out one quarter’s revenue (roughly $15 billion)? Is a fine meaningless unless it’s recurring (penalties for every quarter or year since Facebook last settled with the F.T.C. for violating user privacy in 2011)?

I get that people want to see a company crippled but wiping out a quarter of revenue is a death-blow.

I don’t use Facebook at all, but I followed the CA scandal because I thought it was an interesting case, and Facebook predictably botched nearly every aspect of handling it.

$5 billion is a lot of money. It is not and never can be considered “chump change”. It is irrational to think a Board or management team isn’t going to seriously implement rigorous changes to avoid a $5 billion fine.

This is where someone says that if you can violate the law and earn $6 billion and only be fined $5 billion you will violate the law every time.

There are numerous problems with this statement. One, you don’t know ahead of time what the fine will be. Two, it assumes there’s no way to earn even $1 of the $6 billion without violating the law.

If you can earn $3 billion without violating the law and double it to $6 billion with the violation, but the fine is $5 billion then it’s not “worth it” but even still this fails to account for the tremendous reputational and legal risks.
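The calculus in the paragraph above can be sketched with its own made-up numbers. This is a hypothetical model, not anyone's actual figures; the probability-of-detection parameter is my addition, since the comment notes you don't know the fine in advance:

```python
# Hypothetical numbers from the comment above: a firm can earn $3B
# legally, or $6B by also violating the law, and faces a $5B fine.
LEGAL_PROFIT = 3.0   # $B earned without the violation
TOTAL_PROFIT = 6.0   # $B earned with the violation
FINE = 5.0           # $B penalty if the violation is caught

def expected_gain_from_violation(p_caught: float) -> float:
    """Expected extra profit ($B) from violating, net of the expected fine."""
    marginal_profit = TOTAL_PROFIT - LEGAL_PROFIT  # only $3B is at stake
    return marginal_profit - p_caught * FINE

# With certain detection, violating nets 3 - 5 = -2, a loss:
print(expected_gain_from_violation(1.0))   # -2.0
# With unlikely detection, it flips positive: 3 - 0.25*5 = 1.75
print(expected_gain_from_violation(0.25))  # 1.75
```

The sketch makes the comment's point concrete: what matters is the marginal profit from the violation versus the expected fine, not total revenue versus the fine, and even this ignores the reputational and legal risks mentioned above.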

One parting thought. If the behavior which is being fined was directly used to obtain a monopoly position in the marketplace which the company continues to benefit from, that changes the calculus. But was CA’s ability to scrape social network data from Facebook’s API a proximate cause of Facebook’s market position?

Here is what you don't get: the idea is to give them a death blow.

Nobody is interested in giving FB a warning. If you ask me, prison time for execs in addition to at least a $10B yearly fine makes sense. They intentionally did things that harmed millions of people. The idea is not just to make them think twice before they do something like that again, but to make the message loud and clear: "that move is a fatal move". Not just for FB but for all others who make money out of user surveillance (Snapchat, LinkedIn, Google, etc.). We don't want to give them a risk they can calculate and evaluate; we want to give them a risk as clear as death, a risk no company will even bring up in any meeting.

Maybe I’m out of sync with what really happened with Cambridge Analytica? As I understand it, CA scraped Facebook and retained the data in violation of Facebook’s ToS.

This was not a secret agreement with CA; this was CA using a publicly documented API which was returning the data it was designed to return, but CA using and retaining the data improperly.

Later Facebook decided to significantly restrict the data that apps can pull out of their system (arguably making the system more closed and harder to compete with, but protecting that data from misuse).

They exposed too much of the social graph through their developer APIs. I can see an alternate universe where they are fined for exposing too little.

Like I said, I don’t use Facebook so I don’t care much about them. I don’t think they add a lot of value in the world, but their billion plus users must feel differently.

But I don’t understand why, on the basis of Cambridge Analytica, the company should be fined out of existence.

You can hate Facebook, and you can not use it if you do hate it. But there are a billion plus people who apparently don’t hate Facebook and might be damaged by Facebook being dismantled by the government.

Facebook is a strange beast when it comes to privacy. According to themselves, they have what amounts to hidden profiles for people who aren't part of Facebook. In other words, they track you and try to learn about you, even if you actively avoid them.

So you can "not use Facebook", but you can't not BE used by Facebook. Which means you also can't dismiss them as a choice. They are imposing themselves on you, whether you like them or not.

Most of the privacy-violating industry works the same way, they're just better at keeping it out of the news. The NSA, the FBI, all 3 major credit-reporting bureaus, any security company with CCTVs, Google, ad-trackers, 23andMe, anyone who buys or sells your mailing address - they all keep data about you whether or not you have chosen to allow them or have any business relationship with you at all.

It should all be shut down. Some kind of reasonable privacy should be a right.

Not to defend FB, but trackers from many companies work this way. They build an anonymous profile based on a device fingerprint and then merge it if you sign in as a user.

>Like I said, I don’t use Facebook so I don’t care much about them. I don’t think they add a lot of value in the world, but their billion plus users must feel differently.

If a billion+ users use/like FB, it means you are surrounded by them _everywhere_ you go! It's like saying I'm fine living in a mental asylum because I'm not the crazy one there!

We all _are_ FB users whether we have an account or not!

What counts as crazy depends on perspective, doesn't it? From the perspective of a billion+ users, you are the crazy one.

You need to catch up on the latest scandals involving Facebook's business practices, including phishing attacks.

They tricked users into giving them their email passwords, then used those passwords to log in and harvest their contacts.

That one should probably be criminally charged.

I totally agree there are other issues that have come to light. But those aren’t in any way part of the government calculation of the fine in this case.

I think it’s important to be rigorous and even handed in calculating these fines. Part of that is identifying precisely what action(s) are being called out, and how the fine is calculated. I disagree with many of the EU fines, but I do think they are fairly methodical about how they calculate the amounts. ‘We don’t like you, and you do lots of other bad stuff’ is not a valid methodology.

Being a repeat offender certainly is justification for multiplying the fine, but at this point in this particular case that's jumping the gun.

In the case of verifying a user’s email by asking them for their password, that seems to have been an egregious mistake that was quickly rectified once it was called out, and although certainly they could have used that access to pilfer contacts (and a whole host of other personal info) I don’t believe anyone has claimed they actually did that, and they said definitively that they did not. For what that’s worth.

"ToS violation" is a lame copout, especially since the CA events happened long after the 2011 decree https://www.ftc.gov/news-events/press-releases/2011/11/faceb... . As a former president said, "There's an old saying in Tennessee — I know it's in Texas, probably in Tennessee — that says, fool me once, shame on — shame on you. Fool me — you can't get fooled again"

As humans we are certainly willing to give others extra chances to do right by people, but at some point people realize that the bad actors will never rectify their behavior and should be stopped. I'd personally vote for jail time for executives.

> But I don’t understand why, on the basis of Cambridge Analytica, the company should be fined out of existence.

I'm with you that it shouldn't kill them, but it needs to really hurt so they take better care (and not do illegal stuff in the future). I don't think something like this does. It's annoying, yes.

Consider a similar scenario where a food or drug manufacturer doesn't check their raw materials and poisons a bunch of people. Should their supplier make sure that they don't sell them poisonous stuff? Absolutely. Do they themselves bear the responsibility to make sure that nothing bad happens? I believe so; by skipping their own checks to save a few bucks, it's their fault.

And you can be quite sure that in such a case, there won't just be a small fine - there will be a fine, plus a production halt with a lengthy and expensive re-certification, because the processes they had in place are obviously flawed.

Yeah, FB doesn't produce food, so those regulations don't apply to them. I don't really subscribe to the theory, but there are many who believe that they have the power to sway elections - that certainly puts them in the same ballpark as the company that provides your aspirin or pizza sauce.

The problem with this argument is that there is no great evidence it has any effect. It definitely doesn't on the next bunch of 20-year-old Zuckerbergs who come along, not knowing what the fuck they are doing and doing it anyway.

Risk takers and driven people aren't thinking about punishment. By definition. But that drive can be channeled in positive directions. Punishment is really a pretty hopeless method of doing it.

It's why bombing random goat herders for 20 years hasn't reduced the number of people who want to blow themselves and others up.

The focus on punishment is a distraction. It just makes everyone involved channel their energy into playing defense and that delays the real fixes.

Focus should be on getting them to clean up the mess, rather than sending them to jail. Social media and the news media need a whole lot of re-architecting. And the focus really should be on what that new architecture needs to be.

Remember when VW did all that computer magic with emissions controls? Only a mid-level engineering manager went to jail. The diesel emissions will probably cause cancer in people on the streets. CEOs need to start going to jail long term, and we need to start banning those involved from being able to run, own, or hold shares in corporations outside of blind trusts ever again.

Nope, investigations just take really long and courts are slow.

That's why for FB the downfall is not yet really visible - might take another 12 months.

The car industry supports hundreds of thousands of jobs all over the world and in the respective countries of each manufacturer. There would never be any strong action taken against it; that would hurt the government that took it, and its people.

On a side note: big internet companies will never enjoy the same level of political support. They don't support any jobs and don't pay any tax where they get their money from. It's a bunch of nerds on the west coast, and that's barely a caricature. If anything, the bigger fines will come from Europe rather than Washington, because politics.

60 million advertisers use Facebook to run ads...I am pretty confident that they have a strong economic interest in the government being nice to Facebook.

I do think there should be a "death penalty" for companies, but you don't need to punish the company (and many innocent employees) to prevent unethical behavior.

Instead, you need to:

1) hold the C-suite and the Board accountable for the crime(s)

2) eliminate all their compensation from the years they were committing the crime

3) force the company to replace the entire C-suite

Fining someone like Zuckerberg and letting him retain control is like finding out your babysitter is giving your child alcohol, fining them, and then letting them continue watching your child. It makes no sense in any other context.

And then only after implementing this set of policies do you realize none of these board or executive teams are located within your borders anymore.

Doesn't that help solve the problem then?

No one who works for any of these companies is innocent.

You think they don't have water cooler conversations that include statements like, "but we can work anywhere after working here"?

What harm will come to all those employed accomplices to these crimes against humanity?

^ this is your brain on NYT

> I get that people want to see a company crippled but wiping out a quarter of revenue is a death-blow.

What do we think would change if "Facebook" as a company went bankrupt? Certainly, the website would continue operating - it's profitable. The technology is still worth whatever it's worth. I have a hard time imagining that a purely financial threat to the corporate entity changes the product too much.

What it does change is the incentives to the capital involved. It threatens equity. If we all agree that the Facebook "tech" is morally neutral(ish), then the purposes it's being put to are what we should challenge. Breaking the back of the company's financials seems like a reasonable way to do that.

Their data centers can be seized and net connections cut.

But like... why would bankruptcy do that? The servers aren't worth that much. Their contents are worth more kept secret! The secrecy of their data is how Facebook makes money.

If the company that operates those servers gets shut down, I don't see how the group that takes over the business (whoever it is) shuts down the servers. Maybe everyone freaks out and Facebook loses a lot of people, maybe we don't get updates for a bit. Those seem possible, probable even. But things that make money don't get taken apart in bankruptcy.

I could be wrong! But I don't know of any historic examples where things like that happen.

Facebook did not keep user data secret; they used it as a lure to attract Farmvilles and 'Which Hogwarts House are YOU' surveys to make sure people spent more time on the platform, because having a billion people look at the website every day is how Facebook makes money. Using the data they have to make sure ads are efficiently targeted is par for the course. And it does no harm to Facebook if someone else has all my user data: if I spend all my time on Facebook, Facebook is the only one who makes money selling ads to me.

>I get that people want to see a company crippled but wiping out a quarter of revenue is a death-blow.

Yeah, so? Smaller companies get death-blow fines all the time, why shouldn't big sharks get the same?

> $5 billion is a lot of money. It is not and never can be considered “chump change”. It is irrational to think a Board or management team isn’t going to seriously implement rigorous changes to avoid a $5 billion fine.

Well, seeing that the board is powerless since Zuckerberg has a controlling interest in FB....

>I get that people want to see a company crippled but wiping out a quarter of revenue is a death-blow.

No, it's not. $15 billion is less than 3% of Facebook's market cap. It's less than the swing in Facebook's value after market close on Wednesday. It's also probably less than the value they got through violating their consent decree.

A company's market valuation has nothing to do with its internal cash flows. Put differently, $15 billion being less than 3% of Facebook's market cap says nothing about how much such a fine would affect Facebook.

Assuming the fine is one-off and the market doesn't devalue Facebook because of the fine itself, couldn't Facebook issue new stock to offset the fine, effectively losing 3% of stock value in the end?

Your question is basically "Couldn't Facebook just ask someone else to pay their fine?" – to which the answer is yes, they certainly could. But I highly doubt it'd work. I mean investors would basically be bailing out Facebook and wouldn't get anything in return.

Look at it from the perspective of existing investors: They would be forced to buy the new shares Fb is issuing in order to maintain their respective percentage of shares and, thus, their expected future dividends. There's nothing to gain from that – in fact, their investment's expectation value just goes down as they now need to pay an additional price for the same expected future return. Needless to say, shareholders don't value such a move, so any CEO trying to pull off such a thing would likely be removed immediately.
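The dilution argument above can be sketched with assumed round numbers (not Facebook's actual share count or valuation): a company issues new stock to cover a fine worth ~3% of its market cap.

```python
# Assumed round numbers, for illustration only:
market_cap = 500.0     # $B, pre-fine valuation
fine = 15.0            # $B to be raised by issuing new shares
shares_before = 100.0  # existing share count (billions), assumed

price = market_cap / shares_before        # $5 per share
new_shares = fine / price                 # 3B new shares must be sold
ownership_after = shares_before / (shares_before + new_shares)

# Existing holders' claim on the same future cash flows shrinks to ~97%:
print(round(ownership_after, 4))  # 0.9709
```

Either existing shareholders buy the new shares (fresh cash out of pocket for the same expected return) or they decline and are diluted; in both cases roughly 3% of their claim on future cash flows is gone, which is the point made above about there being nothing to gain.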

>There's nothing to gain from that

What about not getting shut down for failing to pay a fine?

I was talking about the investors not gaining anything from the company issuing new stock in order to pay fines in general. (Which also explains why this is not a common practice.) Fines that are threatening a company's existence are certainly a different beast as the investors' expectation value calculation then needs to factor in that if the company goes bankrupt they lose their entire investment. So refinancing the company by buying new stock then becomes more attractive because it will (or might) keep a shareholder's overall expectation value at least slightly above zero.

But even in these situations, I think a case can be made for why refinancing the company by having it take a loan is still more attractive. While from the perspective of an existing shareholder it will reduce future dividends in a similar way as new stock, there will at least be no opportunity costs. (If the company issues new stock, the investor has to consider what else they could do with their money instead of throwing fresh money at the company.)

Of course it does. They can easily convert 3% of their market cap into cash via bonds or issuing new stock. If the fine were 150% of Facebook's market cap, they wouldn't be able to.

I don't think it is as easy as you portray it to be. (See my response to your comment's sibling.)

> I get that people want to see a company crippled but wiping out a quarter of revenue is a death-blow.

So is a lengthy prison sentence for individuals. Should we get rid of those to make it fair?

> There are numerous problems with this statement. One, you don’t know ahead of time what the fine will be.

You can take an educated guess going off of what happened to you and others on previous occasions. If there are no examples of a fine that is "a death blow", your risk of receiving one is low. Hell, you might not even get caught. It's hard to tell how much they get away with when there's no leak or whistleblower.

I'm pretty sure that the reason FB isn't putting bogus charges on users' credit cards isn't that they wouldn't because of ethical reasons. It's that a) they'd likely get caught and b) they'd get into a lot of trouble.

Companies aren't sex offenders or committing crimes of passion, they are much closer to rational actors, deterrence works great on them.

unpopular personal opinion: facebook is the latest scapegoat of society. people are so much smarter than the NYT article suggests, and the so-called “power” of facebook is mainly just a reflection of our own internal flaws. the amplification of these flaws is actually very helpful: in order to fix them we need them exposed loudly/boldly. and so we reach the main pain point: privacy. since we’re extremely social animals, privacy could be considered to be anti-human. and thus our jungian duality (another flaw) is exposed in all its glory: we love to be loved, but our rational side counters our every move. facebook is just another tool to expose our animalistic flaws. one of many.

I don't think Facebook is merely a scapegoat, in the sense that the very existence of Facebook is threatening to the establishment. More than any other company in the US, Facebook has (or used to have, considering the declining active users) a direct connection to every citizen in the country through their timeline. If Zuckerberg were so inclined he could have a very strong effect on the outcome of elections. No other corporation comes close. The closest in this regard are the big media houses (CNN, Fox, etc.) who do wield influence but are regulated and have to compete with each other. Neither of these limitations exists for FB as of now. Think how the government would have acted if all the media houses merged into a single company under a slightly misanthropic CEO who hints at political aspirations.

Secondly, FB might amplify our flaws but this is a structural issue contingent on the very design of the newsfeed algorithm. Instead of blaming ourselves and embarking on a nationwide self improvement campaign we can think about minor design changes that could perhaps alleviate these effects.

I was with your post up until the final sentence, which struck me as a worrying enough point that I had to comment. (re: minor design changes rather than self improvement.)

The fact that the mentality has changed from "let's improve ourselves" to "let's legislate and limit" is part of what makes this debate so scary; this is not a new opinion for me however, I certainly tend to fall on the side of self-empowerment and positive (as opposed to negative) liberty in most cases and see facebook as a lesser of MANY evils of which the population/media has neglected. (Credit card data exchanges, NSA bulk collection, at&t room 641a, equifax, etc. This is far from the first time I've made a similar ramble to this)

Especially given that I tend to agree with the OP that it's a manifestation of our "lizard brain" that's actually the vulnerability here, and that govt. doesn't have a great history of managing these sorts of things (Prohibition, sex work laws, the drug war), I'm very reluctant to give them more of a hand in controlling the people.

ESPECIALLY given that, frankly, the way this is being legislated will leave Facebook with exclusive control over their graph, as opposed to democratizing it and making people aware of it. Even GDPR, for instance, will allow FB to compute aggregate and trend statistics on the graph data, which will be sufficiently anonymized while preserving the bulk of the "insight" data they and only they gleaned.

I said a lot of things here and probably undermined my own point with some of them, but broadly, I think there's both evidence that it is a scapegoating, and that there are major pitfalls in how we're trying to address it.

Agreed. I don't think what Facebook is doing is ethical, but I agree 100% with you that the control is ultimately in our hands.. it first requires acknowledgement and then physical action.

I think Facebook demonstrates to us just how much we are slaves to psychological manipulation. To me it all boils down to the same fundamental human flaw, which is addiction. Corporations have been taking advantage of alarmingly simple ways to lure us into consuming all sorts of products (tangible or not) for a long time now.

Since this has been going on for so long, I think another obstacle is that in acknowledging this flaw in our behavior, we will also be forced to confront other flaws of our society that are entangled within.

It's easier for people to blame a non-human entity or the Zuck(..?), than to confront that they need to make changes that will affect their/our lifestyle.

We all need to stop outsourcing the blame and make the individual efforts to change ourselves instead of wasting energy fighting this perpetual battle. We each have little to no control over the former and all the control over the latter. Wake up.

>Agreed. I don't think what Facebook is doing is ethical, but I agree 100% with you that the control is ultimately in our hands.. it first requires acknowledgement and then physical action.

Well, part of action to change yourself is to change your environment, so that you don't have bad habits within easy reach.

A huge fine to take down Facebook helps with that, and sends a strong signal to other social media players...

It's not like "change yourself" has to come only from within you while third parties are allowed to push lures straight into your face all the time...

Smoking wasn't stopped just by people smoking less, but also by big fines for tobacco companies, extra taxes, and so on.

yeah, of course, but a lot of this also has to do with the information that is available to us, particularly since the advent of the internet (although the spread of false information has shown itself to pose a threat to this). Smoking also declined because its negative health consequences became undeniable. Although, if you go live in France you'll see that a lot of people still choose not to act on this information. As far as I'm concerned, the consensus is in about social media: the way it's currently being used is detrimental to our health.

One large issue with these particular "addictions" is that we are hooked on them without being aware of their addictive nature. Whether or not the intentions were malicious are certainly something that needs to be addressed but the obstacle remains, which is that we have to individually endure the difficulty of withdrawal to break out of the cycle.

And that really is the case, it's withdrawal and withdrawal sucks. A lot of us already live demanding lives, experience depression/anxiety on varying but significant scales and don't really have the energy to devote toward the will-power that is required when distancing oneself from an addictive behavior. And it does require energy.

Of course preventing this on a large scale is a necessary goal, but we can make instantaneous changes now. Baby steps. I didn't delete my Facebook but I blocked the news-feed, uninstalled the app and hardly ever use it other than to see if people are trying to contact me.

facebook is the latest scapegoat of society

Correct: https://jakeseliger.com/2018/11/14/is-there-an-actual-facebo....

The way to punish Facebook is for users to stop using it; that most people don't, tells us a lot. Follow the data and behavior, not the speech and statements.

It could be a Nash equilibrium: everyone would be best off if everyone switched to a new platform, but for each individual, leaving Facebook costs them more than the small amount of damage it inflicts on the company (since if not enough people leave, there isn't a big enough pool of people looking for a new social network to make another one valuable).

If Facebook died we would all land on another platform, it's just very hard to move people over iteratively. The government could push that transition along, if it wished.

Any new platform would have the same issues

It's intriguing that you bring up Jung. But what you are saying (that the removal of privacy helps integrate the shadow and thus "fixes" Jungian duality in humans) is not at all what actually happens.

Our true shadow is not what Facebook exposes. Rather it collects data and exposes a subset which is in fact the anti-shadow.

YOU choose what data to expose on facebook. The act of that data being exposed has effects that are tangible.

If you've read Foucault, he describes a similar effect during the Victorian era on human sexuality -- the idea that by exposing something and quantifying it, you in fact put a sort of control on it. When it was hidden, no one knew what was normal -- so they used their intuitions and personal judgement.

It doesn't share everything -- because it can't possibly KNOW you (or your shadow) -- people are incredibly smart and complex -- as you point out. But what it does do is expose a superficial subset of you that in turn changes your behavior and the perception of your behavior.

Jung's whole point is that the shadow is an intrinsic aspect of any social society -- something MUST be repressed in order for us not to be completely individualistic psychopaths (like the snakes we evolved from) ... simply exposing this duality doesn't just "fix it" ... instead society needs to function with the knowledge that these things exist. Jung's point is that the individual should not go into psychosis trying to balance dualistic morals... and society shouldn't be so harsh in judging morals (NOT that everything in the shadow should be made public, or even that the shadow is a bug rather than a feature).

Except that FB makes shadow profiles for people who merely browse to pages with a “like” button on them - or other tracking services. They’re forcing themselves upon everyone.

Except this is how the internet works now. If this shouldn't be allowed then we should be banning it for all companies (including Google, just about every ad-serving website, and even the NYT themselves), not just the current scapegoat of the year, FB.

Good point.

Yet, to keep the pernicious effects of those flaws from being amplified by that tool at scale, it needs to be kept in check. A smoking habit may be a reflection of poor self-control on the part of the smoker, but that is no defense against Philip Morris fines.

Yeah for sure, I think that lobbying having infiltrated our politics is the issue here. Corporate influence/greed is what keeps these concerns from being addressed.

Also Philip Morris (at least the parent company) is now called Altria

Right. People at large are not really aware of the severity and extent of their own vulnerabilities -- addiction, cognitive bias, etc. -- or are even unwilling to seriously think through that angle.

This seems to call for a balance of perspectives and responsibilities. Partly similar is probably the case of traffic safety: driver's vigilance and responsibility has to be matched with sound regulation of all transportation profiteers.

Facebook criminally misleading and deceiving people about how data is used (i.e. email scandal) is a reflection of our own flaws?

I don't know if OP would agree with that phrasing, but I would. The oft-quoted Zuck line about dumb fucks trusting him with their info for reasons he didn't understand dropped nine years ago; if you still trust the devil that called you a dumb fuck for trusting him, that's certainly a reflection of your own flaws. We have a more general Zuck-agnostic flaw of sharing information with entities that could misuse it, whether or not they've called us dumb fucks yet. I feel like engineering solutions to this would be more clever and secure than legislative ones. It's not that I want a company that shares my private information to be fined, it's that I don't want them to have my private information in the first place.

> people are so much smarter than the NYT article suggests

People are only smart some of the time, everybody can't be smart all of the time. If I followed you around for a week I'm sure I could find multiple instances of you doing something the audience would consider 'very stupid'.

>facebook is just another tool to expose our animalistic flaws

Right, much like heroin exposes our animalistic flaws of addiction, we don't allow companies to profit from it by using it as a tool.

You are right. And as Rene Girard famously pointed out, human societies need scapegoats from time to time.

I wouldn't mind beheading and burning this particular scapegoat on the altar of privacy, though, if it helps us ponder a little more about our own self-contradictory nature. If scapegoating is inevitable, I would much rather live in a society that uses legal fictions such as corporations as its scapegoats than one that beheads and burns actual human beings.

That's why I think fines should be a % of some wealth value of the law breaker.

I have always thought, for example, that here in Mexico, traffic fines should be a % of the "yellow book value" of your car. Say, for running a red light, you get 5%. If you have a $1,600 VW bug, then 5% of that will definitely hurt you. Similarly, if you have a $256,000 Ferrari, 5% will still be meaningful for you.

Otherwise, fines end up being just "the price of making business" or the price of speeding for wealthy people.
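The proportional-fine idea above is simple enough to sketch directly, using the comment's own example numbers (5% of the vehicle's book value; the function name is just for illustration):

```python
# Proportional fine from the comment above: a fixed percentage of the
# offender's car value rather than a flat amount.
FINE_RATE = 0.05  # 5% of "yellow book value" per red-light violation

def red_light_fine(book_value: float) -> float:
    """Fine scaled to the car's book value, per the proposal above."""
    return FINE_RATE * book_value

print(red_light_fine(1_600))    # 80.0    -- the $1,600 VW bug
print(red_light_fine(256_000))  # 12800.0 -- the $256,000 Ferrari
```

Both drivers pay the same fraction of their car's value, so the fine stings roughly equally rather than being "the price of speeding" for the wealthy.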

This is why, in some parts of the US, traffic-court convictions for moving violations (e.g., running a red light) carry both a monetary fine and a demerit-point allocation against your license.

Exceed some small demerit-point limit over a defined time period and you can be ordered into driver school or have your license suspended.


With the points system, a "rich dude" who sees a $500 red-light fine as pocket change can't simply run red lights for as long as they see fit. If the monetary fine is not itself a deterrent, the demerit points then become the deterrent.

Then what is the point of the fine at all, if the demerit points turn out to be the real deterrent? And if the cash is also supposed to be a real deterrent, why not make it so for said rich dude?

The cynical viewpoint says the fine is there for 'revenue generation' purposes.

You're right, of course. But a progressive income-based fine would generate even more revenue and wouldn't disproportionately affect those with lower income.

The natural conclusion for that is that all punishment be that way. Or maybe we should give out fines based on expected life-time income. So if you're young, you'll get a huge fine.

Or maybe we could also hand down prison sentences like that. If you're young and steal a candy bar we can send you to jail proportionally longer than if you're old.

Of course none of this makes sense, and fines based on your income is a terrible, awful idea.

Why is it an awful idea?

Fines are meant as a deterrent. Flat fines are either devastating for poor people or meaningless to rich people, or both.

Fines that scale with income or wealth allow reasonable deterrents to rich and poor people alike.

> Fines are meant as a deterrent.

No they're not. They're meant as a price. If they were meant as an unconditional deterrent then the penalty for everything should be death plus all of your assets.

That would maximize deterrence (and require us to be a lot more careful about what we pass laws against), but that's not what we're really after.

What we're trying to do is to keep the bad thing at a manageable level, which is exactly what proportionate penalties do, especially fines.

There are only a few crimes where the most important thing is deterrence, but those are the things we don't use fines for to begin with. For example, there is no fine for murder, the penalty is death, or life in prison, because we really are out to maximize deterrence there.

For everything else, if the fine is set appropriately then the cost to society of someone violating the law is less than the amount of the fine times the probability of being caught. If you then want to pay the price to do the thing, great -- we'll take your money and use it to save some lives somewhere else or do some other socially beneficial thing, and since you're paying more than the cost of the damage you're doing, everybody comes out ahead.

And if the fine isn't high enough to pay for the damage being done then it should be higher for everyone.

Why is it fair that rich people should be able to afford breaking the rules of our society but poor people not?

Why is it fair that rich people should be able to afford a Ferrari but poor people not? Why is it fair that rich people should be able to afford a house in the Hamptons but poor people not?

Because things cost money and if you want indulgences you have to pay for them.

Okay, so your stance is "indulgences are good."

How are they not good? The reason we have rules is that breaking the rules has costs to society. When the value of breaking the rule is worth more to you than the cost to society, it's a net beneficial transaction on both sides to let you do it in exchange for a premium.

The only way it's a problem is when the value is set wrong, so that you're only paying $100 but causing $1000 in damage. But that's not the fault of the premise of pricing damage, it's the general problem with corruption or government inefficiency. And the alternative is that the corruption causes you to be able to do $1000 in damage and pay nothing because the whole thing is swept under the table -- at least this way it's happening in the open and the public can evaluate whether the price is appropriate.

"The reason we have rules is that breaking the rules has costs to society. When the value of breaking the rule is worth more to you than the cost to society,"

What if the harm to society of a rich person breaking a law is more than the harm of a poor person breaking that law? The positive actions a person does may be magnified by wealth; why wouldn't the negative actions be?

You can read Martin Luther's writings on the matter, if you like.

That doesn’t seem like a problem to me.

"For example, there is no fine for murder, the penalty is death, or life in prison, because we really are out to maximize deterrence there."

It depends on the state. In Missouri, 2nd degree murder has a minimum of 10 years.

> Or maybe we could also hand down prison sentences like that. If you're young and steal a candy bar we can send you to jail proportionally longer than if you're old.

Don't we do that somewhat based on income, only it's reversed? If you're poor, you get a harsher sentence - we might as well make it official.

We're doing something like that in Germany where some fines are calculated by how much you're making in a day, so you'd get "30 days' worth".

Unfortunately, they aren't actually looking at your assets and income; they make a guess, and if it's too high you can provide proof that it's not accurate. It's usually far too low when you're in the top 5%.
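The German day-fine scheme described above amounts to: a count of day-units multiplied by an estimate of daily net income. A minimal sketch, where the clamping bounds are illustrative placeholders rather than the actual statutory limits:

```python
def day_fine(daily_net_income: float, day_units: int = 30,
             min_unit: float = 1.0, max_unit: float = 30_000.0) -> float:
    """Day-fine: a number of day-units, each valued at the offender's
    estimated daily net income, clamped to an assumed legal range."""
    unit = min(max(daily_net_income, min_unit), max_unit)
    return unit * day_units

# "30 days' worth" scales with income:
print(day_fine(50))     # 1500.0
print(day_fine(1_000))  # 30000.0
```

The commenter's complaint maps onto the `daily_net_income` input: if the court's estimate of that number is too low, the whole fine is proportionally too low, regardless of how the formula is structured.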

> Unfortunately, they aren't actually looking at your assets and income

Wisdom of the policy aside, why is it implemented in such a seemingly insane way? Why not just look at the equivalent of last year's W2? While not perfectly representative of steady-state income, it's what many other income-based systems do and you can easily have a petition process for claiming that that year's income is unrepresentative.

Mostly because of privacy laws. A court can't simply access tax documents, even if it might be helpful in this case. If they wanted to, they'd have to go the official route and the judge ruling on that would weigh the interest of the individual to keep his info private against the interest of the court to gain information. This obviously is easy when the documents are related to the crime, but as the crime/misdemeanor has already been decided upon and only the sentencing is to be done, privacy is usually given more weight.

At least for a time, in the late 80's, in England some traffic fines were determined by the offender's income.

Force a 1:3 stock split on all non-government-owned shares, across all ownership classes, with 1 of the 3 split shares going to the government every time a big publicly traded company ignores or breaks rules on a large scale.

Publicly traded companies are supposed to be responsive to shareholder concerns, and shareholders would be terrified of penalties denominated in stock. Pittance dollar fines are unlikely to ever work because finding a "just right" size is too difficult.
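The proposed 1:3 split with one new share going to the government means private holders keep 2/3 of their stake per penalty, and that dilution compounds across repeat offenses. A quick sketch of the arithmetic:

```python
def ownership_after_penalties(initial_fraction: float, penalties: int) -> float:
    """Each penalty splits every non-government share 1:3, with one of the
    three going to the government, so private holders retain 2/3 per event."""
    return initial_fraction * (2 / 3) ** penalties

# Private ownership after 0..3 penalties, starting from 100%:
for n in range(4):
    print(n, round(ownership_after_penalties(1.0, n), 4))
```

Three penalties already transfer a controlling majority (over 70%) of the company to the government, which is why shareholders would treat this very differently from a cash fine.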

This would just make the government invested in the status quo just like investors, or all the sharks would get into office to vacuum up the companies for power.

Instead of "the government"; have them reward stock to the people whose privacy they've violated?

So, just steal 1/3 of the company? Seems a little extreme! Also I don't think Americans would go for it - they are very against nationalisation of anything (too "communist").

Saying that $5 billion is "chump change" is silly in any context. On a related note, I keep hearing newscasters and podcasters bemoaning the fact that Facebook is thriving, and it disturbs me to no end that these people are publicly and loudly advocating for an important and unique company to be killed off. I'm not sure about the psychology or the motives behind it.

What's clear is that they are blaming Facebook for political outcomes that diverge from the mainstream, and are conflating that with all that is negative and upsetting in the zeitgeist.

As with doctors, journalism has an aura around it of being populated solely by noble, honest, and impartial folk. Unlike with doctors, there's far less buy-in to this mythos from people not working in the industry, but it still exists: think about how many people you know whose "beliefs" consist largely of uncritical repetition of takes from the handful of publications they have chosen to trust deeply.

In reality, any industry populated by humans (perhaps excluding those with extreme barriers to entry) is going to have people on most points of the spectrums of stupidity/intelligence and honesty/dishonesty. There's no bar that means you have to be particularly smart or intellectually honest to be a journalist.

The upshot of this is that when you see issues that systemically[1] affect an entire industry, you can expect to see this bias to a pretty blatant degree. A banal example is that of Google Reader: its loss was perhaps worth bemoaning, but the sustained hysteria from the media was insane (hilariously, this was around the same time they were pronouncing Google+ as dead, despite it having a multiple of in-stream DAU compared to Google Reader).

So, to finally get to my point: the upheavals that the media industry has been going through over the past decade are (fairly) blamed on tech and (perhaps less fairly) personified by large tech companies. That doesn't mean dismissing any criticism of tech, as there are plenty of valid ones, but reading coverage critically is always valuable, and particularly so on topics where they have an incentive (emotional or otherwise) to dissemble.

[1] or even statistically: the claims of both pro-Democrat and pro-corporate bias in large media are IMO exaggerated but are also likely true to at least some degree: they're an unavoidable consequence of the industry being heavily skewed towards people/entities identifying as Democrats working for/being large corporations.

By creating competition.

And that can be done easily. The reason it's impossible to compete against Facebook is the network effect, so kill the network effect. If you have a site with more than, let's say, 100 million active users, you are required to license free user-generated content (such as posts) under a copyleft license, unless the user opts in to a restrictive license on a per-piece basis. All content, whether opted in or out, must be treated the same by the site: no incentivizing people into opting in by effectively punishing those who don't.

And furthermore require that the site provides both an API for accessing all published content, as well as remotely publishing by users. You've now not only 'punished' facebook, but really fixed a huge chunk of the entire tech industry's problems by killing the network effect. Now competitors can create their own alternatives that users from e.g. Facebook can use and even keep and interact with all of their friends on. With no network effect, it's now simply a competition to see who can provide the best user experience.

FB has offered such an API for almost a decade already.

You add a 0 to the fine then move on to the fantastically profitable companies who are doing exponentially worse things than Facebook (Exxon Mobil, Shell, BP et al).

That’s still not enough. Require FB to adhere to the same regulations as other media: the rules that require political ads to disclose who paid for them. Maybe there need to be clearing houses for ad campaigns. Impose regulations that require governmental vetting/oversight. Tax data use. Novel regulations to bring FB to parity with the rest of media. Maybe put a tax on the time and data people share with FB services. Force disclosure of the per-hour/view/use, per-person revenue of their services.

You realize that FB already does all these things, right?


Oh.. So you want to go back to the old scapegoats..

"Scapegoats" is a fascinating description for "the people who literally knew about world-wrecking climate change in the 1970s and proceeded to tell no one and embark on disinformation campaigns to try to keep that genie in the bottle."

We could debate this a long time. I work for an oil company, so I suppose I will be biased, and that makes things difficult. But I will leave you with this: are you sure every oil company did this? And by "this" I mean being convinced that the science in the 1970s was certain and unambiguous as to the impending destruction of the planet, and deciding to hide that information. Also, keep in mind that whenever they did break the law (as it existed; yes, I know of regulatory capture) they were punished. We can quibble on the magnitude of the fines, but then again that is indeed what the current article is about, hence my comment about the equivalence of the scapegoats. Keep in mind, I am not a climate change denier. I am just saying that it took the acceleration of the effects to clarify the cause and effect to the point where it is unambiguous, and that was not the case back in 1970.

Now contrast the messy science of climate change with the very controllable, A/B-testable situation of privacy violations, echo-chamber amplification, and large-scale social engineering happening with Twitter, Facebook, etc. I would argue that they know exactly what they are doing and choose to continue only because... profit.

> Are you sure every oil company did this?

Fines? Come on. Were there justice, we'd have people shot dead in the square and buried unmourned in unmarked graves for what they did to us as a society and as a race. Continuing to assess fines even at the magnitude of their betrayal of society and of the human race would be letting them off light.

Because here's the thing: if they weren't sure in the seventies they were damned well sure by the eighties and they're damned well sure now and they still fight to put profits over the future of the human race. They are the counterpoint to the ghastly unfettered capitalism that some people still think is a good idea and they're going to end billions of humans because of it.

"Not all oil companies"? Get out with that. There is no exclusion from this betrayal of the human race and there is no oil company in existence today not blighting us by their continued operation. If some hypothetical oil company doesn't want to be strung up? I dunno. How about 100% renewable energy by 2035--which is probably still too late, because we are screaming over the cliff, but hey!--and externally-validated commitments to that, with failure to meet those commitments resulting in revocation of their corporate charter, forcible seizure of all assets, and jail sentences long enough to see the sea rise they caused for their executive tier? That seems equitable to me.

Sigh. A violated society should act, but we are passive.

Isn't this just a rehash of


I wish that one day real journalists will look into how the "anti-Facebook" media blitz happened. I've never used Facebook and I hate the idea of Facebook, but the politically motivated attacks that seem to stem from the highest levels of government (across the Atlantic as well) and from the media need real investigating.

At this point, it isn't objective reporting; it's ideological advocacy grounded in political motivation. From Zuckerberg and Sandberg being accused of being antisemites (when they themselves are Jews) to Facebook being blamed for everything under the sun, one has to wonder what the hell is going on. There definitely is a story here, but it isn't Facebook; it's the media. I doubt Mao's propaganda against the bourgeois was as relentless.

Yeah, an investigation is needed: what is this privacy thing they are talking about? And why would politicians care about that? Which company is lobbying them to care about their voters' rights?

These news organizations and politicians care so much about privacy that they all have Facebook accounts? I've never had a Facebook account and never will. I care about privacy. I don't think they do. So that leaves me wondering what they really care about.

Also, Facebook and news companies are both in the same business: the ad-selling business, which is the cause of the privacy issue.

But that's not my concern. My concern is the obvious politically motivated move against one company and one industry by politicians, government agencies, and news companies. When people start calling Zuckerberg and Sandberg (who was hailed as a female hero by the media for many years, until recently) antisemites, you have to start asking questions. No?

Don't you think it is odd that news companies (who are supposed to be objective and report the news) are advocating for fines? Shouldn't a news company simply report the fine rather than demanding more fines? Where's the objectivity? Shouldn't journalists report what is happening rather than advocating for something to happen?

I'm for reporting on facebook and privacy. It's an important issue for sure. What I also want is reporting on the reporting itself because that seems to be an even bigger story. I may not be the sharpest tool in the shed, but even I can see that there is more than reporting and news involved here.

And it's hard to take the news outlets' concerns about privacy seriously when they are all demanding preferential treatment on Facebook's platform. Why doesn't the NYTimes close its Facebook account? Why don't they pull their headlines from Facebook, rather than demanding that Facebook prominently display their headlines and give them top billing? When their words and actions don't align, it makes me wonder. I'm always wondering.

Add 1000ms to all of their requests
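Taken literally, the latency "punishment" is just a wrapper that stalls every request before handling it. A toy sketch (handler and names hypothetical, with a short delay for illustration):

```python
import time

def with_penalty_delay(handler, delay_seconds: float = 1.0):
    """Wrap a request handler so every request is stalled by a fixed delay."""
    def delayed(request):
        time.sleep(delay_seconds)  # the imposed latency penalty
        return handler(request)
    return delayed

# Example: a trivial handler wrapped with a 10 ms penalty
slow_handler = with_penalty_delay(lambda req: f"response to {req}", 0.01)
```

In practice this kind of penalty would be imposed at the network level (e.g., traffic shaping at ISPs), not inside Facebook's own code, which is part of why it is only a quip.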

> How Do You Punish Facebook?

You make Facebook and all social media content fall under the telecommunications act. This would mean that FB would lose its network effect (other parties would be able to provide services that interact with users on FB, just like telephone users can call each other even when they are on different networks).

Antitrust has had solutions to this forever. When a business misbehaves, the government settles by entering into a consent decree that involves government lawyers crawling inside the company and laying their eggs. Then they live in there for a number of years. They get lots of information about what the company is doing, as it's happening, and can put a stop to stupid nonsense before it happens instead of trying to punish it after the fact. They can also require the company to do things to mitigate the damage, like not collecting or retaining information unnecessarily.

In general, do things that aren't cost effective or efficient for companies that haven't misbehaved but make a very effective combination of punishment and reform.

But that's assuming the goal is actually reform rather than just lining government coffers.

Add another zero? Or maybe two?

I mean, here's the thing. If you spit on the sidewalk or look funny at a cop, you can spend months in jail. Facebook is doing some bad stuff to a whole lot of people, all the time, flouting the law, without remorse and repeatedly. There is absolutely no reason why considerations like "we must make sure Facebook survives as a going, profitable entity" should enter into punishment considerations, any more than "we should make sure this guy's life isn't harmed" enters into our punishment considerations for some guy walking down the street with two joints in his pocket.

Governments are not really interested in “punishing” FB as long as they can regularly “milk” it for fines.

Cost of doing business for large international corps.

Take the 5 billion and give it to a decentralized open-source FB-clone project, plus about 100 million for a global advertising campaign.

Social networks with a lot more capital have tried and failed. It's the network effect and the inertia of migrating that keep people on the network, not the lack of an "FB clone".

The other social networks with that amount of capital were trying to be Facebook, i.e. another closed network that they can profit from. Facebook just under a different name has no relative user advantage and no reason for users to switch.

An actual decentralized network has different requirements and can thereby attract users by having different characteristic advantages.

For example, one of the large and growing problems with centralized networks is that everybody leans on them to do censorship, and they have insufficient stake in protecting individual speakers, even against unreasonable or blatantly fraudulent censorship requests.

If you make it trivially easy and inexpensive for anyone to self-host then the person making the censorship decision is the person whose content it is to begin with, which brings back the proper alignment of incentives regarding cost of removing content vs. liability for hosting it.

And that's just one advantage. Another is you control your data instead of Facebook and it's only you and your own friends who have access to it, not some corporation using it to run adversarial machine learning against you. Another is that your data is local and is therefore faster and available in places with poor internet. Another is that you have access to it yourself and can make modifications to both code and data, and then you get improvements from everyone instead of being restricted to a single company with a conflict of interest.

These kinds of things add up, but first you need somebody to spend the resources to develop it into something that non-experts can easily use and understand. Which is already happening, slowly, but would happen much faster with those kinds of resources.

How do you eat an elephant? One bite at a time. Someone please serve them a $€£5bn penalty. And actually collect the FULL amount. 6 months later, on the next ...... they make, drop another $€£10bn on their heads. And yes you guessed right, collect it. Then 6 months later on the next ..... they make, 'award' them with one more $€£10bn penalty. By that time they will be bleeding so much cash and users that they will either get their shit straight or they will go out of business.

Unfortunately I don't see any other way for those thugs (FB) to come clean.

Simple. Big federal prison terms for the CEOs and top executives.

I liked the idea of making them delete their data and start over under strict supervision. They've proven themselves incompetent at handling personal data, so they need to be supervised when they do. It will never happen, but one can dream...

More realistically, the board and executives should be personally liable for a fraction of their wealth. Fine Zuckerberg and Sandberg 10% of their hoards, and things will change.

They're starting from a false premise. The idea isn't to 'punish' Facebook, the people fining them really couldn't care less.

What they want is to line their own pockets, they do that by fining them in a way the stupid uneducated public supports.

The real way to 'punish' Facebook would be to hold Zuck directly accountable. i.e he personally faces jail time unless he does x,y and z etc.

What about forcing Facebook to shut down and redirect all traffic to a page detailing what Facebook has done to earn itself this position? Throw in some awareness of social media addiction and how it can negatively impact lives.

A short ban would be a great way! They say that they make free speech available for everyone, but in turn they use that free speech to make a tonne of money. Also turning us against one another by letting fascists reach everyone easily!

Well, an easy way is for all the big US media publications to move their HQs to the EU, then make Facebook pay every time their links are shared. Sounds like a shrewd decision they could make today.

Jail Zuckerberg. The buck stops with him. He should be personally liable.

Facebook didn't happen to society. The internet happened to society.

The only punishment that will work is one where people are held responsible and criminal charges are brought; otherwise, it's just money.

Not to be crass... but how about multiplying the fine by 10 if one believes it's not punitive enough?

By breaking it up and regulating it

Easy, put executives in jail.

Raise their tax rate and keep raising it.

Blackhole their ad servers.

Add a zero?

corporate death penalty

$50B fine.

Nothing changes until people start to go to jail.
