Facebook paid $4.5K for disclosure of my user account exploit (yvoschaap.com)
117 points by yvoschaap2 on May 31, 2013 | 123 comments



   Churchill: "Madam, would you sleep with me for five million pounds?"
 
   Socialite: "My goodness, Mr. Churchill... Well, I suppose... 
               we would have to discuss terms, of course... "

   Churchill: "Would you sleep with me for five pounds?"

   Socialite: "Mr. Churchill, what kind of woman do you
               think I am?!" 

   Churchill: "Madam, we've already established that. 
               Now we are haggling about the price”
You're either a black/grey hat or a white hat. Either you're a white hat and believe selling to malicious hackers is fundamentally wrong and you wouldn't do it at any price, or you're a black hat waiting for the right price.

The purpose of reward schemes is to reward white hats, not to compete with the bad guys for the black hat discoveries.


> You're either a black/grey hat or a white hat. Either you're a white hat and believe selling to malicious hackers is fundamentally wrong and you wouldn't do it at any price, or you're a black hat waiting for the right price.

> The purpose of reward schemes is to reward white hats, not to compete with the bad guys for the black hat discoveries.

How many other industries are there where individuals with valuable skills routinely volunteer to help multi-billion dollar corporations despite no guarantee of reward?

These people are doing work that typically warrants a six-figure salary or several hundred dollars per hour, and they're doing it almost entirely because it's the right thing to do. And Facebook should reward them well enough that they'll continue doing it, not only because it's good for the security of Facebook and its users but because it's the right thing to do.

I don't know how much Facebook has given to the other 65 people who disclosed exploits this year, but it will be innocent users who suffer most if they all share Yvo's sentiment.


These people are doing work that typically warrants a six-figure salary or several hundred dollars per hour

He spent a day finding the exploit, and got $4500. That scales quite well if he can keep it up.

At the point where exploits become harder to find, Facebook can decide whether it's important to keep searching as hard, and raise or lower the price as they see fit. This is the market in action.

He might have made more on the black market, but why it would be worth more is important. On the black market, the transaction comes with legal risk. Risk increases payout (by reducing supply).


There would be very little legal risk associated with selling this on the black market. The problem would be finding the buyer. The prosecution would have to prove that the seller knew for a fact that he was selling the vulnerability to a known criminal.


> Either you're a white hat and believe selling to malicious hackers is fundamentally wrong and you wouldn't do it at any price, or you're a black hat waiting for the right price.

That sentence is such a false dichotomy it's hard for me to take you seriously.

Is it evil to want to feel appreciated for your work? The message Facebook is sending is that they honestly don't care if you find huge, potentially costly security holes in their software and go out of your way to let them know.

edit: It seems from reading other comments $4500 is actually quite reasonable. I was basing my comment on the author saying it was a "paltry fee".


Facebook is doing the right thing here. Very few companies have a responsible disclosure policy, much less a reward system.

You take a huge risk even notifying companies of a security flaw you found, since that usually implies you were doing unauthorized penetration testing and they'd have a case against you under the oh-so-wonderful CFAA. Or they'll just ignore you completely and never patch the flaw.


Huh? They cared enough to send him $4500 as a thank-you. Their obligation in this case was zero.


I'm sorry, but this is naive. And your anecdote argues against the point you are making below it. Most people do have a price, even "good" people. And good people like the OP are more likely to stay good when they feel their work is appreciated. I think the price FB paid him is almost like a slap in the face, and could have the effect of antagonizing otherwise helpful people.


$4500 for a website auth bypass is not a slap in the face.


Yes, it is. Knowing this exploit was worth many orders of magnitude more to facebook. You can't think of the money as an absolute amount and say, "hey, that sounds pretty good." It has a market value, both to fb and to blackhats, and that market value is far higher than $4500.


Have you sold a lot of vulnerabilities, then? Can I ask if you're speaking from experience here?


One other thing to consider here is that this $4,500 doesn't come packaged with a felony.


Selling vulns/exploits may be distasteful, but it's not illegal.


If you sell an exploit to someone who then uses it for illegal purposes you could be prosecuted for your involvement in that crime.


Only if they can prove that you knew exactly who you were selling to and for what purpose. This is a pretty high standard to meet...


If you say so. I think you're probably wrong about this.


Also, being listed on FB's responsible disclosure page isn't worthless.


The black hats exist either way because there is a lucrative market. If there is no incentive for white hats, they just won't play the game. It's not that they'll "turn into black hats", but the effect for Facebook and their users is pretty much the same: the only people looking for exploits are black hats.

If they value responsible disclosure, they should pay enough to make it worth the effort. They need not compete with black market prices.


> If there is no incentive for white hats, they just won't play the game.

There are more incentives than money. Some people like the thrill, some like the "thanks!" from the person they helped, some just like fucking around with web security.


Of course, your whole philosophy assumes everyone looks at the world in terms of Good Guys and Bad Guys, and has the same idea about who they are that you do.


Rather, his philosophy assumes that if you are a white hat you cannot be concerned about a reward. In reality, white hats simply have enough moral compass not to place the exploit into the wrong hands, which doesn't mean they don't care about a reward. By this rationale lawyers, for instance, should not be setting a price on their time when they believe defending a particular case would be doing a good deed.


These companies don't have to compete with the bad guys, but smart companies would be competitive with each other for the finite amount of time that hackers have.


Less money = less incentive = fewer disclosures = less secure.

Facebook is abusing the good will of white hats by offering such trivial sums, and they're reducing the security of their platform in the process. They have how many $100k+ engineers who couldn't find this? And how much does the average security breach cost per record, $100-$200? This exploit alone could have exposed them to millions in losses at that cost.
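
(A rough, hypothetical illustration of that per-record arithmetic - the $100-$200 range comes from the comment above, while the record count is purely an assumed example, not a Facebook figure:)

    # Hypothetical back-of-envelope: breach exposure at an assumed per-record cost.
    cost_per_record_low, cost_per_record_high = 100, 200   # assumed $/record breach cost range
    records_exposed = 50_000                                # hypothetical number of affected accounts

    low = cost_per_record_low * records_exposed
    high = cost_per_record_high * records_exposed
    print(f"Potential exposure: ${low:,} - ${high:,}")
    # Potential exposure: $5,000,000 - $10,000,000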

This is what turns white hats into black hats, and I wouldn't blame the guy for selling his next exploit rather than disclosing it. A famous guy once said "we create our own demons". And then the guy in Iron Man 3 said it. And now I'm saying it.


"Trivial sums"? This is squarely in line with what Google pays for vulnerabilities. Who is paying drastically more for website flaws?

And, because you think the thank-you Facebook offered was too low, you wouldn't blame him for selling vulnerabilities to criminals? Really? Selling vulnerabilities to criminals is itself a crime.


> Who is paying drastically more for website flaws?

Black hat markets, presumably. At least that is the point being made by commenters here. Granted, selling the vulnerability is illegal and immoral, but that doesn't stop it from happening. The 'market rate' for vulnerabilities seems to be higher than what Facebook and Google are paying out.


Why do you "presume" that? Not all vulnerabilities are equally valuable, and the value for a vulnerability is not as straightforward as people here seem to think it is. Or at least, I don't think it is.


I use the word "presume" because I don't frequent black hat markets and I have no personal experience with current pricing. The general agreement I'm seeing in the comments (and anecdotes gathered elsewhere) is that exploits and vulnerabilities command a higher price when sold to black hats rather than responsibly disclosed through a bounty system. (Isn't this what the grandparent and article are implying?)

This makes sense economically to me. In order for it to be worthwhile for a vulnerability discoverer to sell the exploit, the reward should overcome the expected cost. In this case, the cost is roughly the probability of getting caught multiplied by the severity of the punishment.
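
(A minimal sketch of that expected-value comparison - every number here is a made-up assumption, included only to illustrate the reasoning:)

    # "Selling only pays if the reward overcomes the expected cost" - all inputs hypothetical.
    black_market_price = 20_000   # assumed black-market offer
    bounty = 4_500                # the responsible-disclosure reward
    p_caught = 0.10               # assumed probability of being caught and prosecuted
    penalty = 250_000             # assumed cost of a conviction (fines, lost income, etc.)

    expected_cost = p_caught * penalty                      # 25,000
    net_black_market = black_market_price - expected_cost   # negative here: selling doesn't pay
    print(f"Net expected value of selling: ${net_black_market:,.0f} vs. bounty ${bounty:,}")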


> Granted, selling the vulnerability is illegal

Really? That's exactly what you're doing with Facebook - selling a vulnerability to them, which they then pay you for. So, disclosing to some third party ought also to be fine. The morality or otherwise is up to you though, I guess...

EDIT - I just read @tptacek's reply below. I guess that selling to known criminals, with the knowledge they would use the exploit to commit a crime, _is_ going to be illegal most places.


How is it a crime to sell a security flaw? It is just knowledge that you found yourself (as opposed to knowledge you were told under an NDA, in which case it might be).


It's not a crime to sell a security flaw! It is, on the other hand, probably a crime to abet computer fraud, which is what you'll have done if you accept money from someone in return for an exploit you had that they subsequently use to break into Facebook.


That could be stretched into making it a crime to sell maps (planning a getaway), kitchen knives (stabbing) or gasoline (arson).


It would fall under 'conspiracy' in most jurisdictions.


Selling vulnerabilities to criminals is itself a crime.

'daeken seems to disagree with you[0]. Is he correct, or is the "to [proven] criminals" in your statement important?

0. https://news.ycombinator.com/item?id=5799382


Simply selling vulnerabilities isn't criminal (it's a bit of a grey area, but if I didn't have ethical issues with the practice, it's so far onto the "safe" edge of the spectrum that I'd be fine with assuming the risk.)

Selling vulnerabilities to people you know to be criminals, or to people a prosecutor can convince a jury a reasonable person would have known to be criminal, probably is a crime.


Joe Sullivan, Director of Security at Facebook, said publicly during the SF New Tech Security event this Wednesday that Facebook purchased the Java 0Day used in their training exercise[1]. I guarantee that 0Day cost more than $5000.

[1]http://arstechnica.com/security/2013/02/at-facebook-zero-day...


It may have been, it may not have been (we don't know the terms), but it was a clientside driveby RCE, not a web app bug.


The "to proven criminals" is key. Many people in the industry make a lot of money selling to governments and private companies like Vupen.


And by many people, you include yourself, right?

Long before you disclosed the vulnerability in Onity hotel locks to the public, the startup you had co-founded "licensed" the same flaw to Lockmasters Security Institute, a company that trains law enforcement agencies, special ops, and intelligence agencies in covert entry techniques.

You gave LSI's government customers a pretty big head start before you bothered to disclose that flaw to the general public.


Yes, I do include myself in that.


Exactly what was the point of this comment, Chris? He wasn't making judgements. He was explaining something that he has personal experience with.


If researchers in this community are going to sell security vulnerabilities to the government, I think that fact should be well known.

daeken's work on hotel locks got a lot of press, but the fact that he had two years earlier sold that info to a company for "law enforcement purposes" hasn't gotten nearly the press attention it deserves.

Martin Muench and Chaouki Bekrar have openly embraced what they do. As much as I dislike the path they follow, I have to at least respect them for being up front about the business they're in.

If you're going to help governments covertly break into people's homes, computers, and smartphones, you should wear it with pride.


What exactly does this have to do with the matter at hand? Why did you choose this particular thread to make a point about Cody? Cody was contributing his insight about vulnerability markets, which is something he knows a little about (unlike most thread participants). You seem to have chosen it to make a political point at his expense. That's not neighborly and it's not germane to the issue at hand.

With your response you make the problem even worse, by pointing out other people who made decisions that you, Chris Soghoian, don't approve of but who you "at least respect".

I also think it's a little laughable to suggest that Cody in any way enabled the USG to break into hotel rooms, as if that was a capability they were just champing at the bit to buy from someone like Cody rather than something they've been able to trivially accomplish for the last 200 years.

I've worked with Cody in the past, consider him a friend (despite his different stance on vulnerability markets), and have a problem with comments that chime in on threads for the sole purpose of trying to take him down a peg.


Cody, on his own blog, described the sale of the vulnerability as follows:

In 2010, we (the startup I was running with friends at the time, UPM) decided to license the opening technology to a locksmithing company for law enforcement purposes.

If it is "laughable to suggest that Cody in any way enabled the USG to break into hotel rooms", then why would he describe the sale as "for law enforcement purposes"?


I stand by everything I said in my previous comment while noting that you didn't respond to the main point of that comment or the one that preceded it.


What an absolutely retarded argument.

As someone mentioned above, apparently this is the same amount of money that Google offers for security exploits. How is it any different?

Also, how much money do you expect? You should expect NO money for doing the right thing. Rewards like these are simply gestures of goodwill, and you should thank Facebook for even offering something like that.

Giving such exploits to the black market is WRONG. It's morally unjust. Money shouldn't change the situation any more.


On the other hand, I feel significantly more secure on facebook than on paypal.


I think many people reading this get a sort of uncomfortable feeling thinking about 'white hat' security researchers receiving bounties for disclosure -- it sounds a bit like extortion, especially when people talk about "wondering how much more you can get elsewhere."

But at the same time, the reality is that we're in somewhat of a security crisis. Businesses responsible for the security and privacy of our personal information and identities are clearly not in fact capable of protecting those on their own. (In part because it is a very hard task). They need help. And they're not going to get enough help purely from unpaid volunteers.

In a more reasonable world, the government would have armies of 'white hat' hackers trying to find security holes (they surely do) and then _telling the affected about them_ (they definitely don't; they keep them instead for their own use), because that would result in increased security for us all. Isn't increasing our security, in theory, the mission of police agencies?


Unfortunately or fortunately, I think much of the world would disagree with you and argue that such behaviour would be 'meddling' and gross overstepping by the government, right now.

Perhaps this sort of governmental 'help' would be possible in the future, but right now it seems that such behaviour would almost certainly result in cries that we had entered into a police state. :/

Edit: he notes that when he says 'we', you and I are almost certainly not in the same nation state. :P shrug


The U.S. government (and several others) is surely already aggressively doing vulnerability discovery. [1]

They just keep the exploits to themselves, to use for their own spying -- instead of sharing them with the public so they can be patched, to increase security for everyone.

Which course of action is more of a 'police state'?

[1]: “You’re commuting to where the information is stored and extracting the information from the adversaries’ network. We are the best at doing it. Period.” -- Michael Hayden, who headed the NSA, and later the CIA, under Bush.

http://www.businessweek.com/articles/2013-05-23/how-the-u-do...


I don't see it as much different from the fire chief telling you how many people can occupy your building, the FAA inspecting aircraft maintenance schedules, the FDA inspecting a slaughterhouse, the Fed keeping track of a bank's balance sheets, or thousands of other areas where private companies' interests are not aligned with the public's.


There are and will be growing pains as this becomes reality... for example, one might imagine unintended consequences such as:

" you get fined for hosting ogv videos because they're deemed unsafe and the FCC decides you should be fined per video that is not in an MPEG format... after all, these unsafe videos hurt real people "

So, yeah, the government could be a good vehicle for helping to secure our internet, but it could also have the unintended consequence of preventing growth and innovation under the guise of "security", e.g. "fear".


...intrigued by the bold sentence on Facebook's security researcher page, "There is no maximum reward", I went out and started giving Facebook's code another peek.

I'm not surprised that you feel this wasn't enough since it appears the reward is your motivation for finding an exploit in the first place. The reward shouldn't be viewed as payment for services rendered, but rather as a gesture of good will for performing your duty of responsible disclosure.

Without the disclosure of whitehat hackers, like I did, these exploits can also become available to dubious parties who could wreak (digital) havoc.

If you are truly a white hat then you aren't motivated by money. You are principally motivated by wanting to make the world a better, safer place. Since you're well-versed in computer security, that motivation will propel you to apply that knowledge to the web (or really to any vulnerability) when the opportunity presents itself.

If the reward is your motivation then at best this makes you a grey hat. You did the right thing, but for the wrong reasons. Note that for the following I'm not saying you personally will necessarily do this; but it would be easy from that position to start treating the group of people who you're willing to disclose a vulnerability to as a marketplace. Facebook will pay $4,500? Fair enough. L33tBotNetHaxz0r will pay $20,000? Done.

If you're someone who possesses the skill to discover security vulnerabilities, I think it's important to think through what your real motivations are. Is it the rush of capturing the flag? Is it the money? Is it to help people? Your actions can affect many, so act wisely.

Edit: I should clarify that I mean that money can't be the principal motivation for a white hat hacker. I believe it's fine as a secondary motivation. I give an example in another comment below.


> performing your duty of responsible disclosure

He has no duty. Facebook engineers do, but it was him who found the bug. My understanding of the article is that he doesn't care how much money he gets from Facebook.

The 4.5K is representative of how much they care about customer privacy. That, I think, is his only point.


He has no duty.

If he is going to use the white hat label, which he does, then he does in fact have that duty. If he chooses to hide the vulnerability or sell the knowledge and/or exploit to another party, that's his choice of course, but then he cannot use the white hat label.

The 4.5K is representative of how much they care about customer privacy. That's what, I think, his only point is.

I understand that's the point he's trying to make, but I respectfully disagree since as previously stated I view the reward as a gesture of good will and not as a payment for services rendered.


Responsible disclosure isn't a requirement for white-hat status, white-hat status just means you don't do any harm. Full disclosure can also be white-hat as can sitting on the bug.


white-hat status just means you don't do any harm

This definition falls short. Consider, for example, the hacker who spends time searching for a vulnerability with the intent to do harm, but fails to find one. The hacker has done no harm, so by your definition he or she is a white hat, despite the fact that they would do harm given the opportunity. For that reason, a person's motivations need to be taken into account in order to make a proper assessment.


His only duty was to report the bug. He didn't have to look for it. That's what I meant.


> If you are truly a white hat then you aren't motivated by money.

False.

> You are principally motivated by wanting to make the world a better, safer place.

Probably true.

That's not contradictory. And neither is wanting to disclose a vulnerability responsibly, wanting Facebook to be secure against the vulnerability you just found (even only because it affects you, your friends, family, people you care about) and at the same time feeling that $4,500 is a bit of a meagre bounty for a find of some particular severity.

You can argue about whether $4,500 is, or is not, reasonable. But as you may notice, it's not like he held the bug "hostage" threatening to sell it to the black market otherwise, or even hinting at perhaps doing that the next time.

He can be wanting to make the world a better place (or Facebook a safer site), and still be dissatisfied with the reward he got.

Even rationally: for instance, one could reason that even though they could sell the bug for $20k on the black market, this could cause damage and harm to innocent people on a scale (and with a certainty, because they won't buy the bug to just sit on it) that isn't worth the extra money he could gain (and for people with a decent moral compass that threshold is passed easily without even having to consider the risks of dealing with the black market etc). Then, having come to this conclusion, one could say "Well, I did the right thing, and I am disappointed that Facebook isn't offering more incentive for others to do the same".

Also, the incentive is not just for convincing people to give the bug to the good guys instead of the bad guys. It's clear from the article that he spent a deliberate and non-trivial amount of effort on finding this bug, he didn't just stumble across it. This Facebook bug was found because they offered a mystery prize, prompting him to search for a bug and dig deep, then he got disappointed that the prize was less than he anticipated. It's not so much a question of "will he sell it to the bad guys next", rather it's "will he bother going through that much trouble again to help fix a flaw in Facebook, knowing what the payoff will be?"

That said, my question: is $4,500 really too little? How many hours did this cost him (times a rate appropriate to his specialist knowledge, of course), so how much should it have been for doing the right thing? 2x, 3x this? I don't think matching it straight up to black market prices makes a lot of sense; there are too many external factors that make it a very different deal.


> is $4,500 really too little? How much hours did this cost him (times a price appropriate to his specialist knowledge, of course), so how much should it have been, for doing the right thing? 2x, 3x this?

I don't think that's the right way to look at it—it's not a straight manual labor thing ($/hour * hours = $). The question is, how much money is it worth to Facebook to fix a potentially embarrassing vulnerability. For a complete profile exposure type bug, I would expect that's more than $4.5K.


I should have said "principally motivated" instead of just "motivated" in my original statement about white hats. I'll add that as a correction to the bottom of my original post. The main motivation of a white hat can't be money, but sure money can come into play secondarily. To further demonstrate:

As a hypothetical example, let's say there are two parties to which it would be equally responsible to disclose a vulnerability (perhaps they jointly developed the product or service), and as a condition of payment both separately require that the vulnerability not be disclosed elsewhere, theoretically including to the other party (hopefully that wouldn't be the case in reality, but again, this is hypothetical). Now let's say party one offers $4,500 and party two offers $20,000. The choice of either party is ethical on the grounds that both have been defined as equally responsible recipients of the disclosure. Therefore a determining factor is still needed, in which case the amount of money offered is a perfectly reasonable criterion.

So in short, I agree with you on that. I may not have been completely clear in my original post.


>If you are truly a white hat then you aren't motivated by money

Says who? People who would never commit a criminal cyber-act are still allowed to care about money.


I never said they weren't allowed to care about money in general; only that it shouldn't be their primary motivation for disclosing a security vulnerability. Even if it is, I'm not saying that makes them a bad person either; only that they can't use the white hat label as the author did in my second quote.


Much of the conversation here centers around the value of reporting to Facebook vs. selling to black hat. This is the wrong paradigm to view this issue through.

Taking the view that selling to blackhats is ALWAYS wrong, it may still make sense for Facebook to pay significantly more to find vulnerabilities in their system. A less vulnerable system is one with a competitive advantage, and I think Facebook is missing an opportunity to tout their security credentials.

Let's take a back of the envelope calculation. Say instead of $4,500, they paid each of the 66 people who submitted a vulnerability $50,000. And since we're not halfway through 2013 yet, let's assume that in total 150 people will submit valid security holes to FB this year. That's $7.5 million paid out.

Now, once word of a $50k payout gets out, say 10x the number of people try to find vulnerabilities, and the success rate increases linearly. So Facebook pays $75 million a year.

What are the benefits of this program? I'd say you get a few major benefits vs. the current situation:

1. You will definitely convert some blackhats away from exploiting FB data in exchange for $50k legally obtained.

2. You convert a lot of people currently looking for security exploits in Google, Amazon, etc. to searching for FB vulnerabilities.

3. As a result you have a much more secure platform.

4. You can leverage these payments through media and PR to legitimately show that you care about security.

5. You combat competitors by touting a more secure platform.

$75 million is not small change when you look at FB's operating income, but it's not going to break the bank either.

The point is that it may well be a rational decision on FB's part to offer significantly more and it has nothing to do with the black hat market value of the exploit.
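
(A quick script version of that back-of-envelope, using only the assumptions stated above - linear scaling, $50k per payout, 150 baseline reports:)

    # Back-of-envelope from the comment above; every input is an assumption, not a real figure.
    payout = 50_000                 # proposed per-report bounty
    baseline_reports = 150          # valid reports assumed for 2013 at current participation
    participation_multiplier = 10   # assumed increase in researchers once the $50k figure spreads

    baseline_cost = payout * baseline_reports                 # $7,500,000
    scaled_cost = baseline_cost * participation_multiplier    # $75,000,000, assuming linear success
    print(f"Baseline: ${baseline_cost:,}  Scaled: ${scaled_cost:,}")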


This assumes a linear relationship between (Bounty for vulnerabilities) :: (Identified outstanding vulnerabilities)

Presumably, one would see diminishing returns in the ability to find "low hanging fruit" exploits, and thus the economics at a $50k payout would be even more attractive for Facebook.


It's a market. As the vulnerabilities become harder to find, Facebook will have to increase the payout to continue to find takers, if they view it as a worthwhile investment (they may decide that since vulnerabilities are harder to find the program has less merit, which I would disagree with).


If it's true that FB execs and employees use the platform for sensitive business communications then maybe $50K isn't so much?


I agree :) ... last I checked weren't they doing about $22 million per day in ad revenue? Roughly 3.5 days of that covers the whole $75 million scenario.


I posted a more detailed description of the exploit: http://www.reddit.com/r/netsec/comments/1fe9mj/facebook_pays...


$4500 sounds about right for this vulnerability.


Sounds pretty generous, actually. They could have given nothing and he would have simply written a slightly more angry blog post; like they really care.


Since you're a security pro, why do you think this is the right amount? Because of the ease/difficulty in exploiting it?

How much damage could have been done to FB's credibility if this guy had been malicious?

I suppose it's damned if you do, damned if you don't.


It's just about what Google pays for auth bypass on sensitive servers.

"How much damage could have been done" doesn't really enter into the calculation.

I'm not seeing how anyone's damned here.


Genuine question: how much would you have charged if Facebook had hired you to perform a pentest or a vuln assessment?


The same as we charge everyone else? In fact, big companies often pay less, not more, since they can buy in bulk.


In what sense? I'm sure you could find people who would pay $20,000 to view anyone's private photos.


If you say so. Where would you find them? There appears to be a functioning market for drive-by clientside RCE. The Interwebs seem to have interpreted this as evidence that there is a market for all kinds of vulnerabilities. I am skeptical that this is the case, and, unlike the RCE markets, I have no evidence to suggest that the website auth bypass markets are real.

I suppose the ability to generate unbounded Facebook likes or Twitter followers is monetizable.


I have no evidence to suggest that the website auth bypass markets are real.

You could definitely contact your local firm of private investigators, insinuate that "I can access anyone's FB account," and make a tidy sum from doing just that.

Playing devil's advocate, admittedly, but there is still a very active market for quite petty-yet-nasty stuff, e.g. land developer wanting to access email accounts of local residents protesting against the plans.

Actually, you want weird? Thanks to my latest "seeking work" post on HN (and "Hacker" appearing in the page's title), I had someone ask me to hack the local authority computers responsible for registering peoples' deaths. This was, apparently, in order to give his funeral director company a leading edge.

Does Matasano never get cranks asking for stuff of a similar nature?


There's a difference between a functioning market for something and your ability to one-off 1:1 contracts for things. Hyperbolic comparison: you can also pay to have someone killed, but random people on the Internet don't have access to a market for putting out hits.


Just to clarify, "drive-by clientside RCE" means remote execution on the browser just by visiting a website? Like one of the recent Java 0-days?


Yes.


You can't just view anyone's private photos. They have to have been logged into facebook (have a session) and then go to your domain with the exploit.


Most people are logged into facebook.

And you could just send them an email, carefully tailored to target them, with a link to your exploit site.


$4.5k is decent. Were you expecting to get rich? There are contests where you can rake in much larger amounts, and you could make a killing forming a penetration team if money is the issue.


Yes, $4.5k is decent. But put that in perspective of the type of data and content Facebook exposed for its billion users.

And personally, when Facebook says "no maximum reward" (https://www.facebook.com/whitehat) I was expecting something different for a complete account-access exploit.


The $4.5k is intended as a "thank you" card, not a paycheck.


Perhaps they didn't think it could be weaponized, so to speak. Accessing one account at a time isn't as big a deal as systematically accessing many accounts. Even if your exploit could do so, they may have safeguards against such robotic access.


$4.5k is surely a small amount of money if they had employed someone to do this kind of work.

However, since the work was done for free, the author had sunk the time and risked getting $0 for it. Even if the author is inclined to go black hat, there's a lot more (likely illegal) work to be done for the author to extract value from it.

As such, $4.5k is a good deal for the author - Facebook offered it knowing that, and the author accepted it knowing that.

I don't think this remotely reflects how much Facebook truly values privacy (although that admittedly may not be much).

Facebook's only potential loss by making this offer is that it may make it slightly less likely for talented people to work externally on white hat exploits.


Hi - I built Facebook's Bug Bounty program with a few other FB folks. There's a couple things I want to add to the conversation about how we look at rewards.

(Also, in 2009 it was just myself and a couple of others running our disclosure program. It wasn't even bounties at that point - we'd get you a shirt. You can pretty much just blame me for that.)

1. We don't compete with the bug market, so our rewards will not look like market prices. It's true that "Bad Guys" would pay enormous amounts for a bug. They also pay a premium for the criminal risk being taken, and for the opportunity to exploit it which will theoretically make them a lot of money. However, we're good guys and we don't plan on profiting from bugs.

2. You, the researcher, are safe to post and talk about the vulnerability you found when Facebook is held to the disclosure policy. If your bug is extra-awesome, we'll sometimes send a bunch of reader traffic your way from our bug bounty page. This has proven to be worth a lot to researchers. Several of our bounty hunters have started companies, gotten jobs, or become internet famous from this program, and they value this more than any bounty.

3. We are pretty lenient on what qualifies as a bug, which means we have a higher volume of payments to researchers than you might expect. If a researcher showed amazing skill in finding something that didn't actually turn out to be a bug, we'll probably reward them anyway because we want them to keep trying. We are pretty lenient on duplicates as well. If we see that someone truly discovered a bug independently (and also showed significant skill discovering it) then they'll probably get a reward too. The theory here is that we want more responsible disclosures instead of pissed off researchers.

Overall I don't want to argue with the amount we rewarded here, but to show that we're doing a lot of stuff that benefits a lot of researchers. We were one of the first companies to launch a bounty program, and most of the researchers you have listed would probably say they think we're doing pretty well. Not too many companies have a bug bounty program, and I'm really proud of ours! :)


What's the point/argument of this article? That 4.5k is not enough money?

(serious question, not hating)


My personal experience. While also pointing out the high number (66) of exploits that have been found in the past 5 months. And wondering why Facebook is so dependent on these external security disclosures (and pays so little).


Ok thanks.


The answer is in the last two paragraphs of the article, e.g:

> "A nice day's pay, but a paltry fee for pointing out a gaping hole in the security of a social network holding the personal data of over a billion people"


Well, to be honest I didn't know what 'paltry' means. I got that it was negative, but didn't look it up. Now this paragraph is clearer to me.


The other market would pay orders of magnitude more for such an exploit.


Do you know that or are you just assuming that there's an effective black market for all kinds of vulnerabilities, and not just drive-by clientsides?


My bad, I didn't read further. I assumed this was server-side. After reading, 4.5k sounds right from Facebook, and while I'd imagine the other market price to be higher, I don't think it'd be above 3x, much less 10x, without something special (e.g. high-profile user data) accompanying it.


It is serverside (most web app vulnerabilities are). I'm suggesting serversides are worth much less than clientsides.


What I meant is that the user needed to load a flash payload and be logged in properly. The data harvesting happens client-side. The vulnerability itself is server-side, yes, but computers are faster at copying data than engineers are at figuring out what's going wrong. The data you could potentially harvest with an exploit like this, given good planning and enough time to affect a large number of people, is definitely worth quite a bit of money. This vulnerability could even have helped to make a very convincing phishing attack, which, again, properly executed, leads to very valuable data.

It's not remote execution, but I still think it's valuable.


Stealing a car requires orders of magnitude less money than buying it. Does this mean it is the right way to get a car?


Atoms weigh more than electrons.


Pricing 101: Goal: Get as much as you can from the thing that you're selling.

Purchase Negotiation 101: Goal: Pay as little as possible for the thing you're buying.

To me what's funny about this is that if Facebook didn't pay the author at all, we probably wouldn't be reading this blog post right now.


These bounties should be thought of more as an "award" than a "reward". In this case, the author has received a great honor for his cleverness.


$4.5k for spec work isn't that bad. They could've paid $0.

With that said, I keep my facebook account only for testing applications.


But perhaps now there's a signalling problem. If you find an exploit of a similar nature you can expect a mid-four figure sum from FB. Now if you're feeling less than white-hatty you can wonder how much more you can get elsewhere.

It's easy to say that FB could have paid $0 but if people knew that, it would make "responsible disclosure" even less likely to happen.


"Responsible disclosure" shouldn't be up to a financial incentive. It's up to the law and basic ethics.

The real choice is between a 4.5k bonus or a subpoena. If you want to try the other path, go ahead....


Actually it's a legal gray area. The prosecution would have to prove that you knew you were selling to known criminals. That means you would have to sell to someone who literally told you what they were going to use it for.


Interesting. I think we can agree the alternative is the wrong choice, though.

Shouldn't have assumed the law was sound in this area... Given that it's not usually sound in any..


I've been paid as part of their white hat program, and $4.5k doesn't seem that high compared to what I was paid for my 'exploit'. I picked up $1k for finding a bug in their event invitations: a user could add friends to a private event when using a mobile phone.

Given that his exploit was getting actual private data from people, I'd have thought he'd have been paid more. That's not to say $4.5k isn't a lot of money; it's a very nice reward.


I think many of the comments are just saying 'do not work for free; if you work for free, do not expect big rewards'. I have to agree with that. So, OP, don't take it personally. All good work on your part, but don't expect too much for free work. That's the way the world rolls.


What is the actual black market value of an exploit like this? Is it in the realm of $4,500?


I think it's tricky to assess. There's a lot of support for the idea of high black-market prices for drive-by clientside remote code execution. Those vulnerabilities have a long half-life (because of the latency of patching) and maximal value to attackers (collect zombies, snarf payment card information, &c). Neither is true of website vulnerabilities.

For a vulnerability that can be instantly eradicated, everywhere, as soon as the target finds out it exists? There might not be much of a market at all.

(I don't know).


I think if you market this as a way into any private account (think celebrities, politicians, etc.), you could fetch quite a price if you wait for the right buyer.

Think about what shady gossip newspapers pay paparazzi for breaking laws to get intimate pictures. In the right hands this exploit could be worth tons.


If you say so. I'm sure somebody will buy anything. But the idea that there's an easily-tapped market for Facebook vulnerabilities, similar to the one for browser vulnerabilities? Like I said, I don't know. But I'm skeptical.


Not really a "if you say so" case. There's been many celebrity photo sales exceeding $100K. Selling the fruits of the hack could easily net someone a lot of money, especially if they rarely used it and thus went undetected.

But that assumes the hacker is going to be his own fence. An arms dealer doesn't sell a gun on the basis of "yeah but if you jack 10 nice cars you'll make over $25K, so the gun is at least worth $4K".

Nonetheless, it just _feels_ that $4500 is a small amount, coming from such a large company. What is the downside of FB offering a higher reward, like, say, $20K? Or would the same argument apply, $20K is nothing compared to the millions tabloids would pay for a photo?


It's not a small amount relative to bounties from other big sites. These bounties are not representative of what companies like Facebook and Google spend on software security; they're thanking people who would be doing this kind of inspection anyways.

It's also not the case that this vulnerability == celebrity photos.


How would you sell the exploit? Your buyer would want to verify it works before buying, but then you've already given it away. It's not like you could enforce a contract on this.


You don't sell the exploit, you sell the hack. You could sell someone private access to all facebook data of one person. Proving you have the ability is easy, you can just show one image that is supposed to be private, and promise the rest.


Create a private account, and get the other person to run the exploit?


Here are some estimates, based mostly on OS rather than site: http://www.forbes.com/sites/andygreenberg/2012/03/23/shoppin...


I'm not sure about the law in these situations. What happens if a hacker (right now, let's not label them black or white hat just yet) decides to tell Facebook that they know of an exploit which can gain access to anyone's account given an email address? This is a monumental bug, obviously. Is the hacker allowed to negotiate a price and then withhold the info if his demands are not met? Or is this against the law?

I would guess that as long as they don't use the exploit or sell it to some 3rd party (nefarious or otherwise) then they should be safe, right?


Nothing happens. He has the option of doing that. He can probably even use it as a negotiating tactic; without "force, violence, or fear", extortion probably doesn't come into the picture†, and my (IANAL) understanding is that courts read the "fear" thing narrowly, like, "fear or force or violence".

What he probably can't do is threaten to sell the vulnerability to "black hats"; you can't threaten to commit a crime unless you're paid a protection fee.

Source: model jury instructions


It's very simple. If you had a certain price in mind, negotiate the price first, before delivering the service.


IANAL, but saying "pay me because I have a 0day exploit against your business" is often construed as extortion. You can end up in jail for that. Consult your lawyer before negotiating like this.



