Churchill: "Madam, would you sleep with me for five million pounds?"
Socialite: "My goodness, Mr. Churchill... Well, I suppose...
we would have to discuss terms, of course... "
Churchill: "Would you sleep with me for five pounds?"
Socialite: "Mr. Churchill, what kind of woman do you
think I am?!"
Churchill: "Madam, we've already established that.
Now we are haggling about the price."
The purpose of reward schemes is to reward white hats, not to compete with the bad guys for the black hat discoveries.
> The purpose of reward schemes is to reward white hats, not to compete with the bad guys for the black hat discoveries.
How many other industries are there where individuals with valuable skills routinely volunteer to help multi-billion dollar corporations despite no guarantee of reward?
These people are doing work that typically warrants a six-figure salary or several hundred dollars per hour, and they're doing it almost entirely because it's the right thing to do. And Facebook should reward them well enough that they'll continue doing it, not only because it's good for the security of Facebook and its users but because it's the right thing to do.
I don't know how much Facebook has given to the other 65 people who disclosed exploits this year, but it will be innocent users who suffer most if they all share Yvo's sentiment.
He spent a day finding the exploit, and got $4500. That scales quite well if he can keep it up.
At the point the exploits are harder to find, Facebook can make a decision as to whether it's important to keep searching as hard, and raise or lower the price as they see fit. This is the market in action.
He might have made more on the black market, but why it would be worth more is important. On the black market, the transaction comes with legal risk. Risk increases payout (by reducing supply).
That sentence is such a false dichotomy it's hard for me to take you seriously.
Is it evil to want to feel appreciated for your work? The message Facebook is sending is that they honestly don't care if you find huge, potentially costly security holes in their software and go out of your way to let them know.
edit: It seems from reading other comments $4500 is actually quite reasonable. I was basing my comment on the author saying it was a "paltry fee".
You take a huge risk even notifying companies of a security flaw you found, since that usually implies you were doing unauthorized penetration testing and they'd have a case against you under the oh-so-wonderful CFAA. Or they'll just ignore you completely and never patch the flaw.
If they value responsible disclosure, they should pay enough to make it worth the effort. They need not compete with black market prices.
There are more incentives than money. Some people like the thrill, some like the "thanks!" from the person they helped, some just like fucking around with web security.
Facebook is abusing the good will of white hats by offering such trivial sums, and they're reducing the security of their platform in the process. They have how many $100k+ engineers who couldn't find this? And how much does the average security breach cost per record, $100-$200? This exploit alone could have exposed them to millions in losses at that cost.
This is what turns white hats into black hats, and I wouldn't blame the guy for selling his next exploit rather than disclosing it. A famous guy once said "we create our own demons". And then the guy in Iron Man 3 said it. And now I'm saying it.
And, because you think the thank-you Facebook offered was too low, you wouldn't blame him for selling vulnerabilities to criminals? Really? Selling vulnerabilities to criminals is itself a crime.
Black hat markets, presumably. At least that is the point being made by commenters here. Granted, selling the vulnerability is illegal and immoral, but that doesn't stop it from happening. The 'market rate' for vulnerabilities seems to be higher than what Facebook and Google are paying out.
This makes sense economically to me. In order for it to be worthwhile for a vulnerability discoverer to sell the exploit, the reward should overcome the cost. In this case, the cost is the probability of getting caught multiplied by the severity of the punishment.
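That break-even condition can be sketched in a few lines. This is a minimal illustration of the comment's reasoning; the probability and dollar figures below are invented assumptions, not real market data.

```python
# Illustrative expected-value model for the comment above.
# All numeric inputs are hypothetical assumptions for the sketch.

def expected_cost_of_selling(p_caught: float, punishment_cost: float) -> float:
    """Expected cost = probability of getting caught * severity (in dollars)."""
    return p_caught * punishment_cost

def black_market_worth_it(offer: float, legit_reward: float,
                          p_caught: float, punishment_cost: float) -> bool:
    """Selling only 'pays' if the offer beats the legit bounty
    plus the expected cost of punishment."""
    return offer > legit_reward + expected_cost_of_selling(p_caught, punishment_cost)

# Hypothetical figures: 5% chance of prosecution, $500k in fines/legal costs.
print(black_market_worth_it(offer=20_000, legit_reward=4_500,
                            p_caught=0.05, punishment_cost=500_000))  # False
```

Under these made-up numbers the $25k expected legal cost alone swamps a $20k black-market offer, which is the supply-reducing risk premium other commenters describe.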
Really? That's exactly what you're doing with Facebook - selling a vulnerability to them, which they then pay you for. So, disclosing to some third party ought also to be fine. The morality or otherwise is up to you though, I guess...
EDIT - I just read @tptacek's reply below. I guess that selling to known criminals, with the knowledge they would use the exploit to commit a crime, _is_ going to be illegal most places.
'daeken seems to disagree with you. Is he correct, or is the "to [proven] criminals" in your statement important?
Selling vulnerabilities to people you know to be criminals, or to people a prosecutor can convince a jury a reasonable person would have known to be criminal, probably is a crime.
Long before you disclosed the vulnerability in Onity hotel locks to the public, the startup you had co-founded "licensed" the same flaw to Lockmasters Security Institute, a company that trains law enforcement agencies, special ops, and intelligence agencies in covert entry techniques.
You gave LSI's government customers a pretty big head start before you bothered to disclose that flaw to the general public.
daeken's work on hotel locks got a lot of press, but the fact that he had two years earlier sold that info to a company for "law enforcement purposes" hasn't gotten nearly the press attention it deserves.
Martin Muench and Chaouki Bekrar have openly embraced what they do. As much as I dislike the path they follow, I have to at least respect them for being up front about the business they're in.
If you're going to help governments covertly break into people's homes, computers, and smartphones, you should wear it with pride.
With your response you make the problem even worse, by pointing out other people who made decisions that you, Chris Soghoian, don't approve of but who you "at least respect".
I also think it's a little laughable to suggest that Cody in any way enabled the USG to break into hotel rooms, as if that was a capability they were just champing at the bit to buy from someone like Cody rather than something they've been able to trivially accomplish for the last 200 years.
I've worked with Cody in the past, consider him a friend (despite his different stance on the vulnerability market), and have a problem with comments that chime in on threads for the sole purpose of trying to take him down a peg.
In 2010, we (the startup I was running with friends at the time, UPM) decided to license the opening technology to a locksmithing company for law enforcement purposes.
If it is "laughable to suggest that Cody in any way enabled the USG to break into hotel rooms", then why would he describe the sale as "for law enforcement purposes"?
As someone mentioned above, apparently this is the same amount of money that Google offers for security exploits. How is it any different?
Also, how much money do you expect? You should expect NO money for doing the right thing. Rewards like these are simply gestures of goodwill, and you should thank Facebook for even offering something like that.
Giving such exploits to the black market is WRONG. It's morally unjust, and money doesn't change that.
But at the same time, the reality is that we're in somewhat of a security crisis. Businesses responsible for the security and privacy of our personal information and identities are clearly not in fact capable of protecting those on their own. (In part because it is a very hard task). They need help. And they're not going to get enough help purely from unpaid volunteers.
In a more reasonable world, the government would have armies of 'white hat' hackers trying to find security holes (they surely do), and then _telling the affected parties about them_ (they definitely don't; they keep them instead for their own use). Because that would result in increased security for us all, isn't that in theory the mission of police agencies, increasing our security?
Perhaps this sort of governmental 'help' would be possible in the future, but right now it seems that such behaviour would almost certainly result in cries that we had entered into a police state. :/
Edit: note that when I say 'we', you and I are almost certainly not in the same nation state. :P shrug
They just keep the exploits to themselves, to use for their own spying -- instead of sharing them with the public so they can be patched, to increase security for everyone.
Which course of action is more of a 'police state'?
"You're commuting to where the information is stored and extracting the information from the adversaries' network. We are the best at doing it. Period." -- Michael Hayden, who headed the NSA, and later the CIA, under Bush.
You get fined for hosting .ogv videos because they're deemed unsafe, and the FCC decides you should be fined per video that isn't in an MPEG format... after all, these unsafe videos hurt real people.
So, yeah, the government could be a good vehicle for helping to secure our internet, but it could also have the unintended consequence of preventing growth and innovation under the guise of "security", e.g. "fear".
I'm not surprised that you feel this wasn't enough since it appears the reward is your motivation for finding an exploit in the first place. The reward shouldn't be viewed as payment for services rendered, but rather as a gesture of good will for performing your duty of responsible disclosure.
Without disclosure by white-hat hackers, as in my case, these exploits can also become available to dubious parties who could wreak (digital) havoc.
If you are truly a white hat then you aren't motivated by money. You are principally motivated by wanting to make the world a better, safer place. Since you're well-versed in computer security, that motivation will propel you to apply that knowledge to the web (or really to any vulnerability) when the opportunity presents itself.
If the reward is your motivation then at best this makes you a grey hat. You did the right thing, but for the wrong reasons. Note that for the following I'm not saying you personally will necessarily do this; but it would be easy from that position to start treating the group of people who you're willing to disclose a vulnerability to as a marketplace. Facebook will pay $4,500? Fair enough. L33tBotNetHaxz0r will pay $20,000? Done.
If you're someone who possesses the skill to discover security vulnerabilities, I think it's important to think through what your real motivations are. Is it for the rush capturing the flag? Is it for the money? Is it to help people? Your actions can affect many, so act wisely.
Edit: I should clarify that I mean that money can't be the principal motivation for a white hat hacker. I believe it's fine as a secondary motivation. I give an example in another comment below.
He has no duty. Facebook engineers do, but it was him who found the bug. My understanding of the article is that he doesn't care how much money he gets from Facebook.
The $4.5K is representative of how much they care about customer privacy. That, I think, is his only point.
If he is going to use the white hat label, which he does, then he does in fact have that duty. If he chooses to hide the vulnerability or sell the knowledge and/or exploit to another party, that's his choice of course, but then he cannot use the white hat label.
I understand that's the point he's trying to make, but I respectfully disagree since as previously stated I view the reward as a gesture of good will and not as a payment for services rendered.
This definition falls short. Consider, for example, a hacker who spends time searching for a vulnerability with the intent to do harm but fails to find one. The hacker has done no harm, so by your definition he or she is a white hat, despite the fact that they would do harm given the opportunity. For that reason, a person's motivations need to be taken into account in order to provide a proper assessment.
> You are principally motivated by wanting to make the world a better, safer place.
That's not contradictory. And neither is wanting to disclose a vulnerability responsibly, wanting Facebook to be secure against the vulnerability you just found (even only because it affects you, your friends, family, people you care about) and at the same time feeling that $4,500 is a bit of a meagre bounty for a find of some particular severity.
You can argue about whether $4,500 is, or is not, reasonable. But as you may notice, it's not like he held the bug "hostage" threatening to sell it to the black market otherwise, or even hinting at perhaps doing that the next time.
He can be wanting to make the world a better place (or Facebook a safer site), and still be dissatisfied with the reward he got.
Even rationally: for instance, one could reason that even though they could sell the bug for $20k on the black market, this could cause damage and harm to innocent people on a scale (and with a certainty, because they won't buy the bug to just sit on it) that isn't worth the extra money he could gain (and for people with a decent moral compass that threshold is passed easily without even having to consider the risks of dealing with the black market etc). Then, having come to this conclusion, one could say "Well, I did the right thing, and I am disappointed that Facebook isn't offering more incentive for others to do the same".
Also, the incentive is not just for convincing people to give the bug to the good guys instead of the bad guys. It's clear from the article that he spent a deliberate and non-trivial amount of effort on finding this bug, he didn't just stumble across it. This Facebook bug was found because they offered a mystery prize, prompting him to search for a bug and dig deep, then he got disappointed that the prize was less than he anticipated. It's not so much a question of "will he sell it to the bad guys next", rather it's "will he bother going through that much trouble again to help fix a flaw in Facebook, knowing what the payoff will be?"
That said, my question: is $4,500 really too little? How many hours did this cost him (times a rate appropriate to his specialist knowledge, of course), so how much should it have been, for doing the right thing? 2x, 3x this? I don't think straight-up matching it to black market prices makes a lot of sense; there are too many external factors that make it a very different deal.
I don't think that's the right way to look at it; it's not a straight manual-labor calculation ($/hour * hours = $). The question is: how much money is it worth to Facebook to fix a potentially embarrassing vulnerability? For a complete profile exposure type bug, I would expect that's more than $4.5K.
As a hypothetical example, let's say there are two parties in which it would be equally responsible to disclose a vulnerability to (perhaps they jointly developed the product or service) and as a condition of payment both separately require that the vulnerability not be disclosed elsewhere, theoretically including the other party (hopefully that wouldn't be the case in reality, but again, this is hypothetical). Now let's say party one offers $4,500 and party two offers $20,000. The choice of either party is ethical on the grounds that both have been defined as equally responsible recipients of the disclosure. Therefore a determining factor is still needed, in which case, the amount of money offered is a perfectly reasonable criterion.
So in short, I agree with you on that. I may not have been completely clear in my original post.
Says who? People who would never commit a criminal cyber-act are still allowed to care about money.
Taking the view that selling to blackhats is ALWAYS wrong, it may still make sense for Facebook to pay significantly more to find vulnerabilities in their system. A less vulnerable system is one with a competitive advantage, and I think Facebook is missing an opportunity to tout their security credentials.
Let's take a back-of-the-envelope calculation. Say instead of $4,500, they paid each of the 66 people who submitted a vulnerability $50,000. And since we're not halfway through 2013 yet, let's assume that in total 150 people will submit valid security holes to FB this year. That's $7.5 million paid out.
Now, once word of a $50k payout gets out, say 10x the number of people try to find vulnerabilities, and the success rate increases linearly. So Facebook pays $75 million a year.
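The back-of-the-envelope numbers above check out; here is the arithmetic spelled out, using the commenter's own hypothetical participation figures.

```python
# Reproduce the comment's hypothetical bounty budget.
bounty = 50_000              # proposed payout per valid vulnerability

# Scenario 1: ~150 valid submissions at the new rate this year.
submissions = 150
print(bounty * submissions)        # 7500000  -> $7.5M/year

# Scenario 2: word gets out, 10x the researchers try,
# and the success rate scales linearly with participation.
print(bounty * submissions * 10)   # 75000000 -> $75M/year
```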
What are the benefits of this program? I'd say you get a few major benefits vs. the current situation:
1. You will definitely convert some blackhats away from exploiting FB data in exchange for $50k legally obtained
2. You convert a lot of people currently looking for security exploits in Google, Amazon, etc... to searching for FB vulnerabilities.
3. As a result you have a much more secure platform.
4. You can leverage these payments through media and PR to legitimately show that you care about security.
5. You combat competitors by touting a more secure platform.
$75 million is not small change when you look at FB's operating income, but it's not going to break the bank either.
The point is that it may well be a rational decision on FB's part to offer significantly more and it has nothing to do with the black hat market value of the exploit.
Presumably, one would see diminishing returns in the ability to find "low hanging fruit" exploits, and thus the economics at a $50k payout would be even more attractive for Facebook.
How much damage could have been done to FB's credibility if this guy had been malicious?
I suppose it's damned if you do, damned if you don't.
"How much damage could have been done" doesn't really enter into the calculation.
I'm not seeing how anyone's damned here.
I suppose the ability to generate unbounded Facebook likes or Twitter followers is monetizable.
You could definitely contact your local firm of private investigators, insinuate that "I can access anyone's FB account," and make a tidy sum from doing just that.
Playing devil's advocate, admittedly, but there is still a very active market for quite petty-yet-nasty stuff, e.g. land developer wanting to access email accounts of local residents protesting against the plans.
Actually, you want weird? Thanks to my latest "seeking work" post on HN (and "Hacker" appearing in the page's title), I had someone ask me to hack the local authority computers responsible for registering peoples' deaths. This was, apparently, in order to give his funeral director company a leading edge.
Does Matasano never get cranks asking for stuff of a similar nature?
And you could just send them an email, carefully tailored to target them, with a link to your exploit site.
And personally, if Facebook says "no maximum reward" (https://www.facebook.com/whitehat), I was expecting something different for a complete account access exploit.
However, since the work was done for free, the author had sunk the time and risked getting $0 for it. Even if the author is inclined to go black hat, there's a lot more (likely illegal) work to be done for the author to extract value from it.
As such, $4.5k is a good deal for the author - Facebook offered it knowing that, and the author accepted it knowing that.
I don't think this remotely reflects Facebook's true value of privacy (although this admittedly may not be high).
Facebook's only potential loss by making this offer is that it may make it slightly less likely for talented people to work externally on white hat exploits.
(Also, in 2009 it was just myself and a couple others running our disclosure program. It wasn't even bounties at that point. We'll get you a shirt, you can pretty much just blame me for that.)
1. We don't compete with the bug market, so our rewards will not look like market prices. It's true that "Bad Guys" would pay enormous amounts for a bug. They also pay a premium for the criminal risk being taken, and for the opportunity to exploit it which will theoretically make them a lot of money. However, we're good guys and we don't plan on profiting from bugs.
2. You, the researcher, are safe to post and talk about the vulnerability you found when Facebook is held to the disclosure policy. If your bug is extra-awesome, we'll sometimes send a bunch of reader traffic your way from our bug bounty page. This has proven to be worth a lot to researchers. Several of our bounty hunters have started companies, gotten jobs, or become internet famous from this program, and value this more than any bounty.
3. We are pretty lenient on what qualifies as a bug, which means we have a higher volume of payments to researchers than you might expect. If a researcher showed amazing skill in finding something that didn't actually turn out to be a bug, we'll probably reward them anyway because we want them to keep trying. We are pretty lenient on duplicates as well. If we see that someone truly discovered a bug independently (and also showed significant skill discovering it) then they'll probably get a reward too. The theory here is that we want more responsible disclosures instead of pissed off researchers.
Overall I don't want to argue with the amount we rewarded here, but show that we're doing a lot of stuff that's benefiting a lot of researchers. We're one of the first companies to launch a bounty program, and most of the researchers you have listed would probably say they think we're doing pretty well. Not too many companies have a bug bounty program, and I'm really proud of ours! :)
(serious question, not hating)
> "A nice day's pay, but a paltry fee for pointing out a gaping hole in the security of a social network holding the personal data of over a billion people"
It's not remote execution, but I still think it's valuable.
Purchase Negotiation 101:
Goal: Pay as little as possible for the thing you're buying.
To me what's funny about this is that if Facebook didn't pay the author at all, we probably wouldn't be reading this blog post right now.
With that said, I keep my Facebook account only for testing applications.
It's easy to say that FB could have paid $0 but if people knew that, it would make "responsible disclosure" even less likely to happen.
The real choice is between a 4.5k bonus or a subpoena. If you want to try the other path, go ahead....
Shouldn't have assumed the law was sound in this area... given that it's not usually sound in any.
Given that his exploit was retrieving actual private data from people, I'd have thought he'd have been paid more. That's not to say $4.5k isn't a lot of money; it's a very nice reward.
For a vulnerability that can be instantly eradicated, everywhere, as soon as the target finds out it exists? There might not be much of a market at all.
(I don't know).
Think about what shady gossip newspapers pay paparazzi for breaking laws to get intimate pictures. In the right hands this exploit could be worth tons.
But that assumes the hacker is going to be his own fence. An arms dealer doesn't sell a gun on the basis of "yeah but if you jack 10 nice cars you'll make over $25K, so the gun is at least worth $4K".
Nonetheless, it just _feels_ that $4500 is a small amount, coming from such a large company. What is the downside of FB offering a higher reward, like, say, $20K? Or would the same argument apply, $20K is nothing compared to the millions tabloids would pay for a photo?
It's also not the case that this vulnerability == celebrity photos.
I would guess as long as they don't use the exploit or sell it to some 3rd party (nefarious or otherwise) then they should be safe, right?
What he probably can't do is threaten to sell the vulnerability to "black hats"; you can't threaten to commit a crime unless you're paid a protection fee.†
† Source: model jury instructions