$500,000 - Messaging Apps RCE + LPE (SMS/MMS, iMessage, Telegram, WhatsApp, Signal, Facebook, Viber, WeChat).
Which I guess covers chat apps with > 250M MAU (except Telegram, which is probably included because of its high usage rate in Arab countries). Looking at their programme, researchers would probably approach the original vendor first; but we all know who pays hard cash for 0-days ;)
Original source:
For example, Viber is very popular in the Middle East, Russia, Africa, India, parts of Asia, Ukraine, Balkans, some of the 'Stans.
So why is buying/selling 0 days OK?
Both are selling information that can be misused in the wrong hands: the kind of information you don't want a "broker" to know about, and the kind you don't want a broker to find clients for.
And if there's only one potential buyer (the target), they're basically blackmailing 0-day targets: "become a client or... who knows what will happen... maybe the NSA or hackers will buy it?"
I doubt that. If I had non-public information that could materially impact the price of a company's stock, I would not expect to get away with selling that information to anonymous buyers via some shady broker.
But if someone were to trade on the knowledge that a supplier had delivered faulty airbags to an auto manufacturer without telling that manufacturer, I doubt this would be looked at kindly by the courts.
But I agree that the metaphor is not ideal. Selling 0day exploits to that broker is much worse. Any seller would have to have a reasonable expectation of aiding organised crime or terrorism.
It's an amazing business idea!
- people with insider info can use that info to get rich but without insider trading
- targets of insider info can be coerced into buying the data to prevent selling to third parties
- and of course we can trust the broker to never sell the data to third parties
I'm being a bit sarcastic of course.
>Say a company buys and sells insider trading information... Would it be legal? So why is buying/selling 0 days OK?
Insider trading is a Victorian-esque and arbitrary law. The same way sports ban PEDs under the guise of "fairness" (when in reality everyone uses steroids and PEDs, and nearly everyone does insider trading), these outdated laws do nothing but put the SEC on moral high ground and bolster the tax bucket.
>Both are selling information that will be used wrongly in the wrong hands, the kind of information you don't want a "broker" to know about, the kind of information you don't want a broker to find clients for.
May be used wrongly. Zerodium is a grey-hat distributor. They can sell to the companies themselves, domestic and foreign governments, or terrorists. The first three can be used for good (and it could be argued so could the terrorists, if their ideologies converge with some oppressed opinions among a populace). It's not 100%, but it's likely that some "good" comes out of it.
>And if there's only one potential buyer (the target), they're basically black mailing 0 days targets: "become a client or... who knows what will happen... maybe the NSA or hackers will buy it?"
In a perfect world, you shouldn't have to have a coal tax and companies would care about the environment, but we don't live in that world. We live in a world where the only thing that matters is asset value, and good security is rarely seen as increasing asset value (only as preventing asset depreciation from scandals).
I know this is a popular belief, and there are certainly some outright violations in the industry, plus a whole host of activities that carefully toe the line, but by and large it's just not true. Most hedge funds and active managers underperform their benchmarks. I've spent my whole career in hedge funds and seen far more clueless money-losing portfolio managers than mischievous insider traders...
Brokers like Zerodium will inflate the price of 0-days, for sure. Maybe this will be an incentive for 0-day researchers to dig deeper. At least they'll be richer. But maybe it will cause more harm than good: it's a marketplace for exploits, and Zerodium will look like a 0-day Walmart to hackers.
"They can sell to the companies themselves, domestic and foreign governments, or terrorists": I consider the only ethical option is the first one.
If you're really concerned about security for consumers, can you tell me how you're trusting "domestic and foreign governments, or terrorists" to protect your security when they're buying exploits?
They're buying exploits, holes in security: they don't buy you more security, it's the exact opposite.
I'm sure companies are much more concerned about security for their customers than any other actor. Why would a third party want access to exploits if not to defeat the security put in place by the company?
0-day research is essential to plug holes, not to punch them. Only the target can plug holes; the other "customers" only want to punch them.
"In a sense, Zerodium is a cyber arms dealer. It pays hackers to learn about their tactics, then packages that and sells it to elite subscribers.
For $500,000 or more a year, governments could buy a road map for hacking Android phones to spy on people. Companies could learn about a special hacking tactic before it's used on their own Windows computers -- or quietly use it themselves for corporate espionage."
So they are selling to governments (which make laws and make this legal).
On another note, I think it's silly that people try to exploit apps on phones when it's well known how to access any device via the SIM card and/or LTE chips >.> You can just take screenshots of conversations, no need for decryption. >.> Guess they like flushing money down the toilet lol.
If you are in a position to successfully backdoor (as in get your code reviewed, accepted, promoted, and deployed) a product big enough to qualify for a bug bounty of even $100,000, you are already being paid well enough for the risk not to be worthwhile.
Also, such prices are a good indicator of app security: you'd only want apps whose exploits cost north of "pocket money for a state actor", so IMO > 100 mn?!
I'm not saying you can ban market activity perfectly, but legalization certainly makes things easier, lowers prices, and increases activity.
> Also such prices are a good indicator of app security
Are they? Surely the price reflects the value of the exploit?
This would, to a much greater extent, reflect the user base of the app (high variation) rather than its relative security (lower variation; though the two would have an interactive effect). The larger the user base of the app the better, though wealthy and insecure users (e.g. rich retirees) would have more value to criminals, whereas politically engaged young people would have much more value to governments and spies. I would think the price would be a pretty noisy/poor indicator of the security of the app, relative to the user base.
I partially agree that the fear of being caught is a disincentive. However, higher prices make fewer people buy but more people sell. In a way, illegal drugs, with their high potential profits (due to higher risk), attract more dealers (and fewer consumers). This again puts pressure on the risk premium, so consumers benefit from more supply. So high prices are not really a solution IMO.
> price reflects the value of the exploit
I think it reflects both the reach (~ number of users, weighted by their wealth and gullibility) and the "safety" (an open-source app audited by some well-known company vs. some prototype closed-source app). Since the reach can be reasonably estimated, the safety can IMO also be estimated (as price / estimated reach). I'm pretty sure we shouldn't try to agree on a clear metric, but I personally would find that info useful when choosing e.g. a messenger app or some piece of hardware (say, a router).
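The price / estimated reach idea above can be sketched in a few lines. This is a hypothetical illustration only: the function name, the wealth/gullibility weight, and all the numbers are made up, not real bounty or user figures.

```python
def security_score(bounty_usd, estimated_users, weight=1.0):
    """Naive security proxy: exploit price per (weighted) user.

    `weight` stands in for the wealth/gullibility factor mentioned
    above; 1.0 means an average user base. Higher scores suggest the
    exploit is expensive relative to its reach, i.e. harder to find.
    """
    reach = estimated_users * weight
    return bounty_usd / reach

# Made-up example apps: (bounty in USD, estimated users, weight)
apps = {
    "BigChatApp": (500_000, 250_000_000, 1.0),
    "NicheApp": (100_000, 10_000_000, 1.0),
}

for name, (bounty, users, weight) in apps.items():
    print(f"{name}: {security_score(bounty, users, weight):.4f} USD/user")
```

On these invented numbers the niche app scores higher per user despite the smaller bounty, which is the commenter's point: raw price alone mostly tracks reach, so you have to divide it out before comparing.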
Additionally publicity generates incentive to fix the problem. More apps/OSs/libraries will try harder to be secure. Apps could start wearing high exploit bounties as badges of honor.
Much like how ransomware has likely done more to increase security and change user behavior than an infinite amount of suggested security training. Some users even *gasp* ask how to protect against ransomware, and as a side benefit actually protect against mistakes, dying disks, and other flavors of malware at the same time.
Seems better to trot this kind of stuff out in the open than to bury your head in the sand and try to hide security problems from the public.
Governments on the other hand would pay lots of money to increase their mass surveillance capabilities. Signal users are disproportionately young, sophisticated, and politically engaged.
Given that Signal's budget is raised from donations and grants, and is much more fixed than an open market to undermine it, how would such a market incentivize them to increase funding on security? It's already their top priority.
Are you suggesting that the developers would put in a vulnerability on purpose, in order to sell it and collect the payoff?
Because, short of that, I can't see how exploit trading incentivises weakening of systems. It just incentivises people to find weaknesses.
The extra thing that a free market does is incentivises people to find weaknesses and sell them so that they can be maliciously exploited. When vulnerabilities are exploited instead of patched, secure systems are by definition weaker.
Chrome RCE+SBX on Linux: $80k
If it's not a secret, how much did they pay to you? Seems like this could be considered, at least, a medium severity, and the top bounty they gave so far is only $500. If you don't wanna disclose that, that's perfectly understandable. I'm just curious.
It sounds like it's a privacy leak, but not an RCE/SBX? Especially as it's saying that the sandboxed version of Tor isn't affected.
Bit sad that you can't click the revisions on https://trac.torproject.org/projects/tor/ticket/23044#no2 to see the diff.
I'm thinking it's a combination of being the default, Firefox just having enabled the sandbox on Linux, and it being the base of the Tor Browser (which means Firefox exploits can sometimes be used against Tor too).
Sure, their rate is $1 million an hour :).
I could see the government jumping in and pressing charges, but the worst thing WhatsApp can do is piss them off.
In other words, this can be seen as part of the free market incentivizing people and companies to find and patch exploits, or for programmers to just write safer code in general.
If an exploit is found, company A can play it for what it's worth, either accusing B of negligent behaviour, sloppy coding, defunct bug award program, etc., or exploiting the bug (via anonymously hired help) for a while, possibly in such a way that company B's customers notice the issue before company B does.
If an exploit is not found company A can still accuse company B of negligent behaviour if company B hasn't matched $X in their bug awards.
If company B has done everything right company A can pull the offer, and try to convince a reporter that this indicates an exploit was found and is being used without anyone at company B noticing. Dumber stories certainly have surfaced, and the damage could be significant.
You're right, it's a human problem, and the problem is a lack of respect for each other. It won't be fixed by any software patches, for sure =]