Looks like Tor hidden services are now broken to me...
 What's to stop Facebook from brute forcing a key for any of the existing hidden services?
[edit2] If Facebook can brute force keys like this, so can the NSA and GCHQ. Tor hidden services are officially broken.
[edit3] A colleague of mine suggested that this might simply be Facebook's way of making it public knowledge that Tor hidden services can no longer be relied upon.
[edit4] Facebook are saying (on the Tor Talk list) that they generated a load of keys starting "facebook" and then just picked the one which looked most memorable, and were extremely lucky to get such a good one:
Meanwhile, whilst I applaud Facebook going above and beyond here, this doesn't set a good precedent.
Firstly, onion services are very slow. There is no need to pay this cost for a service whose ownership is not actually hidden. If the Tor project made it easier to reliably identify traffic from Tor exit nodes, Facebook could apply whatever rules they wanted to Tor traffic without needing to slow things down for everyone.
Secondly, by doing this, there's now a risk that other firms who want to be on the cutting edge of privacy will try to copycat this approach, even though it makes no sense and is very complex and expensive to set up. Worse, users might think it's some kind of "gold standard".
Thirdly, it doesn't actually solve any of the reasons why Tor traffic is routinely discriminated against and harassed: Tor is effectively a "bulletproof ISP" that shields a lot of abuse and hacking. Merely making a Tor hidden service specifically for Facebook doesn't solve that, at all.
That may be the worst unexpected consequence. Once onion services become more mainstream, I fear FB's example - no matter how well intentioned - will turn into an engineering nightmare. After all, after one of the best known online brands does something clever, you can expect copycats to follow.
First, we'll get clueless PHB types demanding long vanity names.
Second, some services will happen upon neat onion addresses and ride the wave. Their very existence will act as a goal post for the others.
Third, a vocal segment of users becomes accustomed to seeing "perfect" vanity names. After all, such a name means that the entity behind it has enough resources to actually get a proper vanity name.
Amidst all that, somewhere between stages #2 and #3, we will see (horribly misguided) vanity onion service name markets. A bit like domain name squatting from the late '90s, but with far worse consequences: at least with domain names, the only thing transferred was control over the DNS entry. Because onion service names are directly mapped to the private keys, selling a $VANITY.onion address is the same as selling a copy of the private key.
Caveat emptor, indeed.
The list of "you have to" involved in making its use meaningful is pretty long, and "normal" users aren't good at caring about details like that.
Please nobody construe that as a dismissal of the needs of those users, that isn't what I'm getting at.
Right now? No. But in a near future? Possibly yes, if Firefox decides to include a built-in Tor client. 
Even if the article (and the Slashdot thread I lifted it from) were vapourware right now, the idea has clearly been floated enough to make it an attractive option.
How do you differentiate between this and account takeovers, though? If someone constantly logs in from California, and then someone else gets their Gmail password and logs in through Tor, can you know it's actually a compromised account and not the registrant just deciding to use Tor?
Still, you have to admit it kind of defeats the whole purpose of Tor if a user must use their real IP and/or give up their mobile phone number in order to use it. I agree with the restrictions, but it kind of makes me wonder why a legitimate Tor user would bother to use Gmail in this case.
But yes it's a very specialised use case. Most people won't care. Fully anonymous accounts will require something like Bitcoin proof of sacrifice or multi jurisdictional identity escrow. But the problem for those is that there's far more demand from abusive users than legitimate users, so it's hard to make a business case for providing such support.
Ranting, but to this day I still can't use Google search reliably over Tor. Meanwhile, the spammers and adversarial people who scour Google's results for whatever purpose have almost certainly hacked the captcha; otherwise they would stop their grokking of the SERPs and rethink their retrieval methods, like getting a channel bank of dial-up modems and redialing once they hit a captcha. On certain Tor exits there is no captcha, so I assume they're probably new nodes or light-traffic nodes. The captcha block page even says this: your computer is sending us a lot of queries. Here's one such unblocked node: it's in Ukraine, ~10 Mbps, with an uptime of 31 days. It's about middle of the road in terms of bandwidth and has a somewhat short uptime.
In my spare time, I run an exit node, but what I'm really doing is using my switch's mirror port on my WAN connection to capture 100% of all Tor traffic for SEO purposes, and to find out what and how people are hacking, etc.
It's ironic that Google fights for the right to index everyone's content [see the Germany news, et al.], but the reverse? Oh no, no way! The SEOs will figure out our secret sauce and use Google to steal the best-ranking content. And yes, I realize Google's guidelines are for the good of the web; I'm just ranting.
As long as you don't have to use it, I don't see why offering users a choice is a problem.
>If the Tor project made it easier to reliably identify traffic from Tor exit nodes, Facebook could apply whatever rules they wanted to Tor traffic without needing to slow things down for everyone.
ExoneraTor does this quite well in my opinion, and I don't follow how/why you think this will slow things down for everyone. Surely you're not referring to the entire Tor network?
The main value in using this, in my opinion, is reducing the potential attack surface associated with MITM attacks--including CDNs--after your traffic exits Tor. Attacks on Facebook users involving Akamai have been documented by NSA, for example; Facebook is a PRISM partner, but this would arguably still stack the deck in favor of "going through the front door" to access Facebook user data.
Tor's current support for detecting usage is patchy. You can't query a random third-party website on every login for a system like Facebook, so you need a list of IPs that can be refreshed quickly. But such lists tend to be incomplete or out of date; e.g., the "exit" flag doesn't mean what you'd intuitively expect, so it's sometimes possible for Tor traffic to turn up from an IP that is not identified as an exit.
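The IP-list approach can be sketched as below. The endpoint is the Tor Project's bulk exit list; the refresh strategy and parsing details are my assumptions, and (per the caveat above) absence from the list does not prove an IP is not a Tor exit:

```python
import urllib.request

# Assumed endpoint: the Tor Project's published bulk exit list
# (plain text, one IP per line, '#' for comment lines).
EXIT_LIST_URL = "https://check.torproject.org/torbulkexitlist"

def parse_exit_list(text: str) -> set:
    """Parse the plain-text exit list into a set of IP strings."""
    return {line.strip() for line in text.splitlines()
            if line.strip() and not line.startswith("#")}

def fetch_exit_ips(url: str = EXIT_LIST_URL) -> set:
    """Download and parse the current exit list; refresh this often."""
    with urllib.request.urlopen(url) as resp:
        return parse_exit_list(resp.read().decode())

def is_tor_exit(ip: str, exits: set) -> bool:
    # Note: a miss here is not proof the IP isn't a Tor exit,
    # since the list can lag behind the live consensus.
    return ip in exits
```

A login system would cache the fetched set and re-fetch it on a short timer rather than hitting the URL per request.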
Re: MITM security. Even if Facebook got lucky here, we're talking about an 80-bit identifier, and brute forcing these has been demonstrated before, I believe. I'm not sure this is much of an upgrade over just a regular SSL CA + HSTS pinning.
(edit: last paragraph)
My take on the post was that it was presented as an option, and for users taking the time to access a site via Tor, speed may not be the only (or even primary) consideration. I say that as someone who does ~95% of my browsing--both work and personal--via Tor.
> But such lists tend to be incomplete or out of date; e.g., the "exit" flag doesn't mean what you'd intuitively expect, so it's sometimes possible for Tor traffic to turn up from an IP that is not identified as an exit.
Doesn't the onion address solve this problem?
> I'm not sure this is much of an upgrade over just regular SSL CA + HSTS pinning.
Depends on your threat model, but I think it's a useful option and congratulate the Facebook team for offering it to users. I'd love to see Google, Twitter, and others start to compete on the extent to which they support TBB users.
This seems to me like a situation where the illusion of anonymity might be worse than the reality of non-anonymity. Granted, it lets you bypass censorship. Granted, if you are very careful and, say, only use your Tor Browser to connect to Facebook and nothing else whatsoever... maybe. I just don't think I could trust myself to do it properly. And while I'm no Edward Snowden, I'm also not dumb.
DuckDuckGo is being irrational? Keybase? Riseup? Sorry, but I don't see how knowing the owner of a service can immediately disqualify the service from having the privilege of a hidden service. The only thing limiting Facebook is the stress that may be placed on a single onion by so many connections at once. It should not cost much in actual resources to run a single gateway, especially when Facebook is a very successful business and can afford to invest in something that may expand their audience even more.
I'm not saying anything for or against Facebook's ethics in general; I am a little hesitant given their history to trust them at all, but this is admittedly a huge step in shedding light on what Tor actually is. It's not just for drugs or child pornography; it can actually be used by the everyday Joe wanting to check on his friends and see if there's a party nearby. What people seem to miss is that Tor provides IP anonymity. Yes, encouraging users to be more careful about their browsing habits is a good thing, but if people want to give out their personal information, in the end that cannot be stopped. If the website owner wants to disclose his or her identity, that cannot be stopped. Tor provides anonymity of IP addresses and nothing else.
There just isn't any really solid technical reason to do this. The closest I saw was something like "newspaper dropbox wants to force people to use Tor and an onion address is the easiest way to do that", but again, they could just identify traffic from exit nodes and block anything that isn't coming from there, if there were better tools for it.
If you use Tor, what's the point in providing a phone number? All that does is give Google data that can be subpoenaed by the feds. :(
They are therefore trying to brute force the first 11 characters of the address. The author of scallion estimates one can achieve 520 MH/s with an AMD Radeon HD 5770 GPU, which retails for $190. They then give the formula for calculating the time (in seconds) needed to have a 50% chance of finding a matching URL:
2^(5*length - 1) / hashspeed
With a length of 11 and a hash speed of 520 MH/s, that works out to about a year with one GPU.
The total cost of the hardware needed for a 50% chance of finding the vanity address "facebookwww?.onion" within a week would therefore be around $11,000.
Imho, this is well within the realms of possibility for a company as large as Facebook and does not suggest a weakness in the .onion scheme.
Dustcore is very correct; correcting for my initial mistake, it would take 1.1 million years on a single GPU using scallion. Finding that sort of result within a month would require $2.6 billion worth of GPUs. Now, I know Facebook is known for spending billions on questionable purchases, but this would be a bit extreme even for them. How the hell have they managed this?
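For what it's worth, both figures fall out of the formula given earlier; the 1.1-million-year estimate works out to matching 15 characters rather than 11 (my inference from the arithmetic, using the hash rate and GPU price quoted above):

```python
def seconds_to_50pct(length: int, hash_rate: float) -> float:
    # Base32 gives 5 bits per character; halving the 2^(5*length)
    # keyspace gives a 50% chance of a hit.
    return 2 ** (5 * length - 1) / hash_rate

HASH_RATE = 520e6             # H/s, scallion on one Radeon HD 5770
YEAR = 365.25 * 24 * 3600

years_11 = seconds_to_50pct(11, HASH_RATE) / YEAR   # ~1.1 years
years_15 = seconds_to_50pct(15, HASH_RATE) / YEAR   # ~1.15 million years

# GPUs needed for a 50% chance within one month at 15 characters,
# at $190 per card:
gpus = seconds_to_50pct(15, HASH_RATE) / (YEAR / 12)
cost = gpus * 190             # ~$2.6 billion
```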
Now, the other observation, more important to me, is that the TLS certificate is reportedly worth close to nothing and gives false security: an HNer claims to have gotten a valid cert issued for this very same Facebook .onion address already: https://news.ycombinator.com/item?id=8539066 -- if I understand correctly, cert issuers seem to happily accept any .onion URLs in the "alternative addresses" of SSL certs without any verification. Can anybody else confirm or deny?
Which is why using TLS on top of the .onion address is brilliant: even if the secret key for the .onion address is compromised, the TLS certificate (which is rotated more often) will keep the connection safe. The worst that could happen would be someone hijacking the .onion address, but that would lead only to a DoS instead of the compromise that would happen without the redundant TLS layer.
And the certificate also helps validate that the .onion address is really from facebook: as someone observed elsewhere in this discussion, the certificate is also valid for the non-.onion addresses, so just examining its alternate names extension is enough to prove that the certificate owner could also get a valid certificate for www.facebook.com (meaning the certificate owner is very probably facebook itself).
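That corroboration check can be sketched as a toy function (the SAN list below is illustrative, not copied from Facebook's real certificate, and this sketch ignores wildcard matching):

```python
def onion_cert_corroborated(san_names: list, onion: str, clearnet: str) -> bool:
    """True if a single certificate vouches for both the .onion address and
    the operator's well-known clearnet name via its subjectAltName entries."""
    return onion in san_names and clearnet in san_names

# Illustrative SAN list for a cert covering both worlds:
sans = ["www.facebook.com", "facebookcorewwwi.onion", "*.facebook.com"]
onion_cert_corroborated(sans, "facebookcorewwwi.onion", "www.facebook.com")  # True
```

The point is exactly the one above: whoever controls this certificate could also get one for www.facebook.com, which a random squatter presumably could not.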
So someone bruteforcing the .onion key could easily get their own valid SSL cert and have full access to the plaintext for anyone browsing the .onion site over SSL.
The security of Facebook over an onion is now protected only by the hash power required to brute force the vanity address, instead of the integrity of the SSL CA system or the power required to factor an SSL key. Even the requirement to spoof DNS or perform an actual man-in-the-middle-of-the-wire hijack has vanished.
Did you ask NSA for a full 16-character bruteforce? :)
Benchmarking Shallot on an Intel 3350P @ 3.10 GHz:
  time ./shallot ^a      ->   0.09 sec user
  time ./shallot ^aa     ->   0.12 sec user
  time ./shallot ^aaa    ->   0.12 sec user
  time ./shallot ^aaaa   ->   0.47 sec user
  time ./shallot ^aaaaa  ->   5.92 sec user
  time ./shallot ^aaaaaa -> 118 sec user
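Each additional base32 character multiplies the keyspace (and so the expected time) by 2^5 = 32, so those timings extrapolate roughly like this (a back-of-the-envelope sketch anchored on the 6-character measurement above):

```python
# Measured above: a 6-character prefix (^aaaaaa) took ~118 s on this CPU.
BASE_CHARS, BASE_SECONDS = 6, 118.0

def estimated_seconds(chars: int) -> float:
    # Every extra base32 character multiplies the expected search time by 32.
    return BASE_SECONDS * 32 ** (chars - BASE_CHARS)

estimated_seconds(8) / 86400    # "facebook" (8 chars): roughly 1.4 days
estimated_seconds(16) / 86400   # all 16 chars: astronomically long
```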
Who knows how much they spent trying to brute force that onion address.
EDIT: Ok looks like they went the backronym route
The .onion URL is created by hashing the public key (and possibly more information), and then it is stored in Tor's database of hidden service descriptors, as noted by this. This would indicate to me that if there were a hash conflict, such as the NSA trying to take over FB's .onion URL, the database of hidden service descriptors would reject the duplicate insertion.
IIRC it's an 80-bit truncated SHA-1, so it's not even close to feasible unless there's a substantial preimage attack against the function (and none are known). It's clearly feasible to find something close enough to fool the human eye for a phishing/spoofing attack, but that's hardly a problem exclusive to Tor.
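For reference, that derivation is simple to write down: the (v2) onion address is the base32 encoding of the first 80 bits of the SHA-1 of the DER-encoded public key. A sketch with a placeholder byte string standing in for a real key:

```python
import base64
import hashlib

def onion_address(der_public_key: bytes) -> str:
    # v2 onion addresses: first 10 bytes (80 bits) of SHA-1(DER pubkey),
    # base32-encoded and lowercased, plus the ".onion" suffix.
    digest = hashlib.sha1(der_public_key).digest()
    return base64.b32encode(digest[:10]).decode().lower() + ".onion"

# Placeholder blob, NOT a real key -- just to show the shape of the output:
onion_address(b"not a real key, just a placeholder blob")
```

Since 10 bytes is exactly 16 base32 characters, every v2 address is 16 characters before ".onion", which is why vanity searching is a matter of grinding keys until the hash prefix matches.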