You then have the remote SSH server available locally, with no public interfaces, no public ports, and an additional layer of confidentiality and authentication.
Awesome, this is exactly what I had in mind for a stealthy way to manage infrastructure.
Thanks for sharing a blueprint as I'd only theorycrafted it and now I have a path to experiment with.
Yes, this is exactly the same way I've been using hidden services for some time. It started as a "stealth gateway" for some of my personal services (e.g. VPN, file server, etc), and grew into something like what you're describing for managing infrastructure. I'm glad to see others doing the same thing. Thanks for sharing!
From an IT security assessment standpoint, what would those attack vectors be, and what are their associated risk and impact factors?
One would naturally be the onion address. If they could break the 1024-bit RSA key, they could hijack the name. Risk of this happening: very low. Impact: massive, especially outside the realm of Tor.
Any additional risks you were thinking about? Backdoors in the Tor project? Hardware malware specifically designed for Tor (rather than being general-purpose)?
I believe the main concern would be vulnerabilities in the Tor software itself.
That said, I don't think there's any reason to believe Tor is any more likely to be vulnerable than any other software in your stack. The project is open source, and is very much written with a focus on security and privacy.
This sounds cool, but it's needlessly complicated. Why not simply configure SSH to use multiple forms of authentication [1]: password + public key auth + 2FA (Duo Security, Google Authenticator, Authy, etc.). That's all you need to achieve a very secure state.
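For reference, the layered-auth setup described above can be sketched in sshd_config; the PAM setup backing the 2FA factor (Duo, pam_google_authenticator, etc.) is assumed and not shown here:

```
# /etc/ssh/sshd_config: require BOTH a public key and a keyboard-interactive
# factor; the latter is handled by PAM, which can chain password + OTP
AuthenticationMethods publickey,keyboard-interactive
ChallengeResponseAuthentication yes
UsePAM yes
```

With this, a connecting client must first pass public-key auth and then satisfy whatever the PAM stack demands.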
Here's a question I have on this -- I've been eagerly awaiting the new functionality to have onion addresses where using them doesn't reveal their existence. So the address itself becomes a form of shared key.
But this opens another possibility of one-time addresses, and address scalability. My question is, does network cost increase with number of addresses? If peers on the network are using one-time addresses to form circuits, will that scale fine?
Basically I am envisioning two people communicating via dedicated addresses for their own use only. So a single key becomes a network channel to send data to a specific peer. That "peer" could actually be many different devices, but all ultimately connect to the same distributed application with a shared state.
So basically onion addresses are usernames which also let you pipe data to that user, right? How is this not the coolest thing ever?
The issue I've always had with onion addresses is that you can't remember them, which means you need to keep a list of bookmarks saved locally somewhere, which, if you're using Tor to avoid prosecution, is pretty incriminating. What's the solution?
edit: to add, I haven't yet completely bought into the new onion services. I like a lot of the security improvements, but it is really going to depend on how name resolution works and how authenticating endpoints will work for users. Running a pluggable name resolution system means we can try out different solutions and see which takes off organically and practically.
Yes indeed the naming issue is important and needs to be addressed. We are currently pretty busy with stuff so we haven't even started implementing the aforementioned proposal (or considered whether it's a good idea).
Keep one bookmark to a community directory? The plausible deniability is that you just visit it for catpicures1685isbis.onion, not for cocainehookerz3288uiop.onion.
In theory, you could achieve this by only storing the password DB in a cloud service you access through the Tor browser, but then you run into two other problems. First, how do you remember the address of the service you're using to store your password DB? If you store that address on your device, you've lost your plausible deniability. I guess you'd have to use a well-known clearnet site you access through Tor (Dropbox, Google Drive, etc).
That leaves another problem though. If you have a password manager installed on your PC, that implies the existence of a password database, which might also cause you to lose your plausible deniability.
Maybe if there was a profiles/password manager feature built into the Tor browser that lets you decrypt and load a profile from a service provider, then destroys all local evidence of that profile when you close the browser? Seems like a feature that could be useful in all browsers actually. A cross-browser password manager and bookmarks sync feature? That'd be really convenient.
This is a remarkably good idea. You can store an onion hash in a blockchain transaction and then use the signature of the key which sent that transaction to verify it.
I assume this is already done, and if not why not?
The practical implementation would be the bookmarking service would authenticate their links against this blockchain, and the user would either have an agent on their system to validate the claims, or another 3rd party who they trusted to validate the claims?
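A minimal sketch of the commit-and-verify idea in Python; the blockchain transaction and signature plumbing are omitted, and all names here are illustrative assumptions:

```python
import hashlib

def commit(onion_address: str) -> str:
    # the digest you would publish in a blockchain transaction; the
    # transaction's signature ties the claim to the publisher's key
    return hashlib.sha256(onion_address.encode()).hexdigest()

def verify(onion_address: str, published_digest: str) -> bool:
    # anyone can recompute the digest and compare it to the on-chain value
    return commit(onion_address) == published_digest

digest = commit("exampleservice.onion")
assert verify("exampleservice.onion", digest)
assert not verify("malicious-mirror.onion", digest)
```

A bookmarking service could publish such digests, and a local agent (or a trusted third party) could validate its links against the chain, as described above.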
> And finally from the casual user's PoV, the only thing that changes is that new onions are bigger, tastier and they now look like this: 7fa6xlti5joarlmkuhjaifa47ukgcwz6tfndgax45ocyn4rixm632jid.onion. For more information on the nitty-gritty details, please check out our technical specification.
It's a shame they don't have a description for technical users. I want more detail than "bigger, tastier, and looks like this", but less than 13,000 words of specification.
TL;DR of that change is that they've moved from a truncated SHA1 hash to SHA3/ed25519/curve25519. For more detail, here's their own summary from the technical specification:
Here is a list of improvements of this proposal over the legacy hidden services:
a) Better crypto (replaced SHA1/DH/RSA1024 with SHA3/ed25519/curve25519)
b) Improved directory protocol leaking less to directory servers.
c) Improved directory protocol with smaller surface for targeted attacks.
d) Better onion address security against impersonation.
e) More extensible introduction/rendezvous protocol.
f) Offline keys for onion services
g) Advanced client authorization
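For context on (a), the legacy v2 addresses being replaced were derived by truncating a SHA-1 hash of the service's RSA-1024 public key to 80 bits and base32-encoding it; roughly like this sketch (the key bytes here are a placeholder, not a real DER-encoded key):

```python
import base64
import hashlib

def v2_onion(pubkey_der: bytes) -> str:
    # legacy v2 address: base32 of the first 80 bits (10 bytes) of
    # SHA-1 over the service's DER-encoded RSA-1024 public key
    truncated = hashlib.sha1(pubkey_der).digest()[:10]
    return base64.b32encode(truncated).decode().lower() + ".onion"

addr = v2_onion(b"placeholder key bytes, not a real key")
assert len(addr) == len("xxxxxxxxxxxxxxxx.onion")  # 16 base32 chars
```

The v3 scheme instead embeds the full ed25519 public key in the (much longer) address, which is why impersonation resistance no longer depends on a truncated hash.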
The problem with a lot of these open source projects is that you're usually so exhausted trying to get the release out that you don't have much energy left for explaining it! (And these types of things require developer involvement.)
But agreed. This is basically offering a paragraph long abstract or read the massive white paper, without much in between.
Being undiscoverable is a big help. For our ancillary services, we've taken to using Tor2web with auth-mode HS servers, because some random domain is less likely to look interesting for people to poke at. Publishing a .onion, especially with some general-purpose software hosted on it, screams that there is something of interest there.
Gotta be the change you want to see. None of my hidden service .onion sites are anything "of interest". They're just electronics and radio hobby stuff like the web has always had.
Put everything on Tor as a hidden service and eventually the stigma will go away.
This is on the front page now right next to a story about how hostile and ad-ridden most commercial websites have become. Thank you for helping keep the original spirit of the web alive on onion space.
The keypair will be generated and you'll find the onion address in /usr/local/etc/tor/hidden_service/hostname, and it'll forward all traffic to 127.0.0.1:8080.
You can run as many hidden services as you like on the same tor instance, with different onion addresses, and forwarding to different places.
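A torrc along those lines would produce the layout described above (the first service matches the path and port in the comment; the second is an assumed example of an additional service on the same instance):

```
# first hidden service: onion traffic on port 80 forwarded to a local web app;
# the onion address lands in /usr/local/etc/tor/hidden_service/hostname
HiddenServiceDir /usr/local/etc/tor/hidden_service/
HiddenServicePort 80 127.0.0.1:8080

# a second, independent service with its own onion address
HiddenServiceDir /usr/local/etc/tor/hidden_service_ssh/
HiddenServicePort 22 127.0.0.1:22
```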
How have you found long-term maintenance of running these services? I.e., running into random breakage and spending time debugging the system vs. an nginx box or something... Is it a set-it-and-forget-it type of system?
I've always wanted to set up onion addresses but I'm always wary to open up a new bag of worms for my personal projects.
I used to use Tor to bypass censorship on pr0n in my country and ended up trying to run a hidden service for fun. My biggest peeve with it was the domain name. I mean, sure, I understand why it isn't human readable, but then there are so many ways to counter that. We've got the blockchain and we have the IPFS way to handle these things too.
I'm hoping at some point blockchain DNS systems are adopted by mainstream (or, in the case of Tor, niche) vendors. It would make it so much easier to name onions.
In Qatar, which has a national firewall, I was able to use Tor to browse torrent sites, and then use a regular bittorrent client to fetch the contents of the magnet links (encrypted connections only). That worked great for the few years that I was there. The Qatari internet was a fair bit faster than what I am used to from Canada, about 30-50 MB/s at max torrent speed.
I have mixed feelings about Tor. As a proxy to hide your IP address it makes perfect sense to me.
But what's the end result of hidden services?
I want to be anonymous sometimes but I can't think of a time when I want the host of a service I use to be anonymous.
In most situations their identity is actually important to me. I want to know the source of news, to trust that I'm sending a message to the right person, to trust that I'm not relying on a site run by some kid in her parents basement.
And I know that legitimate sites can get certs for onions these days (like Facebook)... But doesn't that defeat the original purpose of running a hidden service, if the purpose is to hide the owner?
Tor hides the identity of the service provider to third-party eavesdroppers. You might know someone and have gotten the onion address of a file upload server from them, but not want the government or your ISP to know who you're talking to.
It also hides a service provider's physical location (well, their IP address, and hence their location in the network graph) even from a user who knows their identity. You might know that I am an investigative journalist you're trying to send information to - or, given the origin of Tor, maybe you're a spy and you know I'm your handler - but I don't want to give away my location.
There are also a lot of use cases where the user doesn't care that much about knowing who the service provider is, and the service provider cares a lot about hiding their identity (enough so that they would not provide the service if they could not be anonymous).
For the example of, say, political commentary - often what you care about there is less the real-world identity of a person, and more their persistent identity and reputation. On the other side of the equation, though, in some environments people might not feel safe expressing their opinions at all if those opinions could be traced back to them.
To add to that, one useful application for hidden services is to enable SSH login on machines that are behind some impenetrable NATs/firewalls that you can't open up for inbound connections. Have a machine behind mobile, NAT only internet? Set up a tor hidden service and log in without any problems!
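A sketch of that setup; the host alias, paths, and onion address are placeholders, and the ProxyCommand assumes OpenBSD netcat (whose -X 5 -x flags select a SOCKS5 proxy):

```
# torrc on the NAT'd machine:
#   HiddenServiceDir /var/lib/tor/ssh/
#   HiddenServicePort 22 127.0.0.1:22

# ~/.ssh/config on the client, tunnelling through the local Tor SOCKS port:
Host natbox
    HostName youronionaddress.onion
    ProxyCommand nc -X 5 -x 127.0.0.1:9050 %h %p
```

Then `ssh natbox` works without the machine ever accepting an inbound connection.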
I see this as a valid point. You don't need Tor for that, any local app could proxy a connection that way. But, yes, it is a useful side effect of Tor.
First of all, actually, no, a "local app" can't do that. The problem is that the machine doesn't have a globally reachable address, and that's not something that you can solve by changing the software on the machine, you need some external service that provides you with a globally reachable address and a way to forward connection requests for that address to your machine. That is a service that the Tor network provides.
Obviously, Tor is not the only solution to this problem, but it is one that has the nice property that you don't need to register any accounts, you don't need to pay for anything, the availability of the Tor network is pretty good ...
Also I am not so sure I would call it a side effect, for two reasons:
1. Any application that grows the anonymity set of Tor is useful for the goals of the Tor network. Even if your SSH session does not have any use for the anonymity that Tor provides, it still is good for the Tor network that you add cover traffic to the network for those who need it.
2. Decentralization has a lot of overlap with anonymity, and as such I would consider this use actually to be well within the use cases that Tor is intended for: ISPs build networks that increasingly make it difficult to use your own devices for anything more than consuming content delivered from the network, thus contributing to the growth of the kind of centralized services that don't provide any anonymity at all. Using Tor to access your own machines, and thus enabling you to build infrastructure that is not inspected by third parties, is very much aligned with the goals of the Tor project.
I care more about security and privacy than anonymity.
IPFS or Keybase seem like better approaches towards those goals. And, yes, Tor makes sense if you need to hide your IP address. But beyond that I don't see much further use.
I don't think many people care about anonymity as a primary value. Anonymity is simply one tool to achieve privacy and security against certain types of risks/attacks.
There is another purpose: when you access a hidden service, you are not using an (unknown, untrusted) exit node; instead, your data remains entirely encrypted until it reaches the owner of the address.
Reputation is a typical force in society. Businesses might want to do malicious things but the fear of destroying their identity helps to keep them in line.
Personally I trust something more when someone puts their reputation on the line for it.
Reputation is also a force in onion land. There are hidden services that are trusted and those that aren't.
You see it with the drug marketplaces - users self-organize a distributed reputation system that tells other users which sites can be trusted with bitcoin and drug transactions and which cannot.
It's a remarkably well-functioning and resilient system - and it's hosted across reddit, forums, blogs, etc.
My company (details in profile) relies on Tor. We help people, but governments would like to shut us down. We actually will expose everything via a non .onion, then proxy it to our HS (SSL term done on the hidden service). We'll expose via .onion as well and get a cert. We are hoping with HS v3 the rules will relax. For now, Digicert told us they were not going to issue any more. Maybe if we push.
You can have cryptographically anchored pseudonymous trust, which is what Tor does. In many ways it’s stronger than the kinds of trust you run into normally.
You still have to get a correct onion address from a reputable source otherwise how do you know that you're going to the right onion and not a malicious mirror or something?
So somehow there is a chain of trust, not too unlike the traditional PKI model.
Couldn't we put the onion address on the site, signed by the site/publishers key? Ignoring key exchange methods for a moment, once you have a key for "Bob the Blogger" you can verify Bob says the address is X and your url is X, so this is Bob's site.
Well, yes, obviously, in order to know something you have to know it!?
What I don't understand is what any of that has to do with any sort of "correctness"?!
How do you find out the correct name of James Miller? How do you find out the correct domain of google.com? That just seems like a set of nonsensical questions to me.
No one is arguing that onion links are impervious to being lied about. But domain names don't have that property either. In fact, it's somewhat easier to compromise many domain names.
With the new protocol you could use it to protect IoT services, whose identity you would naturally know but no one else would. A security camera, for example.
Another use case would be access to a server's IPMI/online KVM switch, which has a tendency to not get updated regularly.
In general, any situation where you previously had to go through a third-party hub could be replaced with a hidden service for an end-to-end solution, as long as latency isn't a major concern.
Does it fix the LONG-standing issue that Dr. Krawetz keeps discussing on his blog that makes it trivial to DoS an onion site? Description in the section about 'Eddie'.
He didn't discover an issue - he just FUD'd long enough that a few people believed he did.
That he believes there is something "suspicious" about Tor nodes because their IPs don't map to any country or AS name in the free distribution of the GeoIP database built into Tor is just smh bad.
Why he also believes bots accessing his HS have anything to do with Tor exit nodes is also beyond me... but he provides no evidence for any of that either.
No, the new design does nothing against such DoS attacks. Please see Tor tickets #16052 and #16059 for some plausible ways forward. We would need more help from the community to design and develop solutions to handle the DoS threat.
I've tried to communicate with the hackerfactor person about this and figure out ways forward, but I received unproductive responses which did not make me want to proceed in discussion.
We need the community to help us understand how they best want to manage the DoS risks and build sustainable proposals to fight them.
You start a container that runs just tor with a config and can read the routing endpoints from your config, or link to localhost:2375
HiddenServicePort <onion port> <host>:<port>
You setup HiddenServiceAuthorizeClient with stealth auth type and a list of authorized clients.
You can lock your firewall rules down as the hidden service only requires outbound to HTTPS.
On the client end you setup regular Tor with HidServAuth <onion address> <auth-cookie>
With stealth auth, other Tor users won't see the service and port published without the auth cookie.
You can then use socat to bind the remote hidden service port to a local address and port:
socat TCP4-LISTEN:2023,bind=127.0.0.1,fork SOCKS4A:127.0.0.1:onionaddr.onion:23,socksport=9050
You then have the remote SSH server available locally, with no public interfaces, no public ports, and an additional layer of confidentiality and authentication.
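Pulling the steps above together into one config sketch (the client name, paths, and onion/port choices are placeholder assumptions; this uses the v2 stealth-auth directives described in the walkthrough):

```
# --- server torrc ---
HiddenServiceDir /var/lib/tor/ssh_hs/
HiddenServicePort 23 127.0.0.1:22
HiddenServiceAuthorizeClient stealth alice

# --- client torrc (cookie comes from the server's hostname file) ---
HidServAuth onionaddr.onion <auth-cookie>
```

After the socat command binds the onion service to 127.0.0.1:2023, the login itself is just `ssh -p 2023 user@127.0.0.1`.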