
There's virtually (heh) no way to host a web page without involving another company's infrastructure (AWS, your ISP, etc.), and any number of intermediate nodes can log that packets were forwarded from A to C.

If it's important to you, then the best you can do is find a company you trust not to have a profit motive for using this data, and always use HTTPS.

AWS is big enough that there's probably very little profit in analysing CloudFront logs, but then again I don't know that.




What if I host my server at a data center? Does the data center's network hardware log connections? I know that technically it can, since it only has to watch for TCP connection starts (and maybe collect statistics for each connection), but are they really doing that?
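For what it's worth, the "watch for TCP connection starts" part is mechanically trivial. Here's a hedged Python sketch (field offsets per RFC 793; all names, ports, and addresses are illustrative, not from any real device) of how logging gear could turn initial SYN packets into NetFlow-style records:

```python
import struct

def is_connection_start(tcp_header: bytes) -> bool:
    """True if this TCP header is a client's initial SYN (SYN set, ACK clear)."""
    # Flags live in byte 13 of the 20-byte fixed TCP header.
    flags = tcp_header[13]
    return bool(flags & 0x02) and not (flags & 0x10)

def flow_record(src_ip, dst_ip, tcp_header):
    """Build a minimal NetFlow-style record when a new connection begins."""
    if not is_connection_start(tcp_header):
        return None
    # Source and destination ports are the first two 16-bit fields.
    src_port, dst_port = struct.unpack("!HH", tcp_header[:4])
    return {"src": (src_ip, src_port), "dst": (dst_ip, dst_port)}

# A bare SYN from ephemeral port 54321 to port 443 (illustrative):
syn_header = struct.pack("!HHIIBBHHH", 54321, 443, 0, 0, 5 << 4, 0x02, 65535, 0, 0)
print(flow_record("10.0.0.5", "203.0.113.7", syn_header))
# → {'src': ('10.0.0.5', 54321), 'dst': ('203.0.113.7', 443)}
```

Note that even with HTTPS, everything in this record (who talked to whom, and when) is visible to anyone on the path.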


I'm not saying whether they are or not; I'm arguing from an absolutist point of view. They're probably not, but you could make just as strong an argument that GCP and AWS are probably not, because at the end of the day we just don't know.

My point being, at some point in the chain you have to trust that a corporation is not doing something evil. It's all a calculated trade-off at the end of the day.


The question is: in which respects do you actually have to trust the intermediary's behaviour, with no way to verify or enforce it? And how costly would it be to avoid needing that trust at all?


My opinion, as I stated before, is to follow the money trail.

I'd trust Digital Ocean or Rackspace over Azure, for example, because Microsoft does get some money from advertising/tracking.

If you're terminating TLS directly in 'your' cloud VM in Azure/GCP/AWS, I think that's an acceptable boundary as I don't believe these companies would risk accessing customer VMs (since it's not worth the backlash they'd receive in return from being discovered).
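To make that boundary concrete, terminating TLS "in your VM" just means the private key lives on the instance and the provider's network only ever carries ciphertext. A minimal nginx sketch (domain and paths illustrative):

```
# nginx: terminate TLS inside the VM itself, so the cloud provider's
# network and any load balancer in front only see ciphertext.
server {
    listen 443 ssl;
    server_name example.com;

    # Private key never leaves the instance (paths illustrative).
    ssl_certificate     /etc/letsencrypt/live/example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;

    location / {
        proxy_pass http://127.0.0.1:8080;
    }
}
```

Contrast this with letting a managed load balancer or CDN hold the certificate, where the provider necessarily sees plaintext.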


If a website was on TOR, it would be virtually untrackable by those other companies, right?

Technically the website is still using the infrastructure, but they wouldn't have details about the end user.

Can we get more "normal" websites on TOR please?


I think you're mixing two different aspects of the Tor project.

On the server side, onion services (running a tor daemon that reverse-proxies requests to your HTTP/mail server) provide end-to-end encryption and secure name resolution (transport security).
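Setting one up really is just a reverse proxy plus two torrc lines; a minimal sketch (directory and port are illustrative):

```
# torrc: publish a local web server as an onion service.
# Tor generates the keys and .onion hostname under this directory.
HiddenServiceDir /var/lib/tor/my_site/
HiddenServicePort 80 127.0.0.1:8080
```

After restarting tor, the service's address appears in `/var/lib/tor/my_site/hostname`, and name resolution is cryptographically tied to that key rather than to DNS.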

On the client side, the Tor Browser is a custom Firefox edition that aims to make user fingerprinting impossible among Tor Browser users. From a server's perspective, they all look the same; they also come from the same pool of IP addresses (the Tor exit nodes) when the service is not an onion.

Enabling an onion service does not prevent third-party tracking, although it raises the bar. Third-party scripts/cookies can still do plenty of tracking. That's why those (as well as web fonts and SVG) are disabled in Tor Browser's Safer security setting, and JavaScript is completely disabled in the Safest mode.

So fingerprinting protection really is taken care of on the client side. Neither onion routing nor onion services will help you evade tracking if you use a classic browser.


That's true, but we're not living in a world where everyone uses Tor. I think it'd be pretty cool to see though.


Yeah, pretty cool, if you're into cypherpunky solutions to an extent where you don't mind the accompanying dystopia.

I get why some users need Tor, many of them for agreeable reasons, and that they would benefit from more people using Tor (and having more nodes available). But I think making it the default way of using the internet for everyone would be stupid. If the average connection goes through n hops, the entire infrastructure would have to be multiplied by that factor. Just the environmental impact (in terms of increased energy consumption and hardware production) would be significant. And when Tor is the only effective option you have for escaping unwanted tracking, you're disadvantaged as soon as you want to use a service that doesn't work well with the increased latency.

There has to be a better way. Not that I could say what exactly that would be, but what would be really cool to see is a world in which everyone has worry-free internet access. Multiplying the cost doesn't seem like the best approach.


There is a difference between there being "virtually" no difference and there being no difference. Allowing a third party to handle TLS for you (as CloudFront/Cloudflare and other providers often do) undermines the point of encryption. For a static site that makes little difference, but encryption herd immunity is still compromised.

Proper encryption is not just for the important stuff; it's for everything, so that when we do send important stuff it is not seen.


No, no: it was just a "technical" security question, not personally important at all. Thanks for the reply.




