Popular dark-web hosting provider hacked, 6,500 sites down (zdnet.com)
229 points by wglb 87 days ago | 152 comments



I am confused: is this seriously saying that over 30% of "hidden" services were being hosted on the server of one guy named Daniel?... that in a world where the entire point is that you don't know where anything is hosted and you are using tons of indirection through Tor to ensure there is no obvious place to hook all of the traffic or even see packets for timing attacks, it turns out there was a one in three chance that the traffic was being hosted by this one guy named Daniel?


No. That would be like someone taking GeoCities down in 1998 and claiming you took down 30% of "The Internet." You would have gotten a bunch of stuff, but obviously not 30%.

At best they took down 30% of hidden web services with published addresses at aggregator sites like Hidden Wiki.

It's not that you don't know where it's hosted; it's that you don't know who is using it or where they are. That includes publishers with hosted content, even from the host itself. 5Eyes couldn't just drop a tap in front of Daniel's Hosting and see anything useful, just intermediate nodes with no idea what was on the other end (barring a correlation attack where they own both the first and last nodes in the circuit).


Your second paragraph is not, in my understanding, the goal of a hidden service: that is merely the goal of Tor itself and would apply to a non-hidden service being accessed via Tor. The goal of the hidden services feature is to allow hosts to have the same level of anonymity as users, making it nearly impossible to shut them down or know where to tap their traffic (for timing attacks).


Onion services can have various goals. Hiding the server is a very common one, but it's not always the case.

For example, Facebook runs an onion service; they don't need to hide the service itself. So they configure their Tor instance with no anonymity on the service side (HiddenServiceSingleHopMode 1) and get better performance.

Such non-anonymous onion services can have many goals, for example (see the torrc sketch below):

* Reducing load on Tor exit nodes

* Providing users a secure, authenticated connection without depending on the CA system (assuming you got the URL through a secure channel the first time, you know only the key holder can provide service on that host)

* NAT traversal for services that otherwise have no need for anonymity
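
A minimal torrc sketch of such a setup (paths hypothetical; tor requires the non-anonymous and single-hop options to be set together, and the instance can't double as a client, hence SOCKSPort 0):

  SOCKSPort 0
  HiddenServiceNonAnonymousMode 1
  HiddenServiceSingleHopMode 1
  HiddenServiceDir /var/lib/tor/my_service
  HiddenServicePort 443 127.0.0.1:8443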


> At best they took down 30% of hidden web services with published addresses at aggregator sites like Hidden Wiki

30% is wrong, but there are other ways these metrics [0] are extrapolated [1]. You can see a dip in that first chart; I didn't check the dates to see if it's related.

0 - https://metrics.torproject.org/hidserv-dir-onions-seen.html

1 - https://blog.torproject.org/some-statistics-about-onions


Don't confuse or mix up .onion domains or hidden services with websites running on a hidden service. If a hidden service exists, it doesn't have to be a website!

Here is your data about .onion websites: https://www.reddit.com/r/onions/comments/9yfwfb/my_personal_...

About 7000 .onion sites worked before, about 3100 after...


In 1998, I think 60% of search results were for GeoCities....


That 30% is clearly a gross estimate, and certainly a wrong one with no data to back it up. Still, even though it is definitely counterproductive to host on a widely known host, it does not expose any users, and that is the main point, not really protecting the host.


Here is the data: https://www.reddit.com/r/onions/comments/9yfwfb/my_personal_...

About 7000 .onion sites worked before, about 3100 after...


Exactly; all they have is some questionable data about onion links. Onion links are one way of using the deep web, one of many.


Genuinely curious: if you don't protect the host, how can you protect the users? The host has the power to change any content, including BTC addresses, text, ...


Not if the content is cryptographically signed by the author, so at least some content can be served safely. I'm not sure if it's possible to have a dynamically generated website that encrypts or signs all content without having the signing key accessible to the host, though.
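
As an illustration, a minimal Python sketch with the pyca/cryptography package (the content here is hypothetical; the point is that the signing key never touches the host):

  from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

  # The author signs content offline; only signed blobs are uploaded to the host.
  author_key = Ed25519PrivateKey.generate()
  content = b"post body, BTC donation address, etc."
  signature = author_key.sign(content)

  # Readers verify against the author's published public key; a tampering
  # host can't forge a valid signature for modified content.
  author_key.public_key().verify(signature, content)  # raises InvalidSignature on tampering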


In this case: protecting the users == hiding their origin


The hoster certainly has the power to scam people, but he doesn't have the power to deanonymise them.


He does, if he drops phone-home malware. As the FBI has done, at least twice.


That in turn only works where the users haven't taken additional steps beyond Tor. Tor was never meant to solve every problem.


Doesn't surprise me that the initial layer of obscurity provided by Tor gave people a false sense of security.


If you want to serve some content that's illegal and someone a) offers you hosting, and b) doesn't know who you are, then you absolutely host there because it's zero risk to you.


If you saw the state of hidden web, you would agree.

There are very few hidden services and most of them are... questionable.

The article cites child porn. That's about right.


Either the HN link has changed or the article has been edited, but the word "porn" is not found in the article.


Weird, you're right.

It definitely referenced it when I first read the article.

I wonder what the motivation would be for removing that reference.

EDIT: Note that I am not user "runn1ng"; I am corroborating what they referenced about the contents of the article

EDIT2: They even mentioned three-or-so specific forums or websites, or something.


They changed the link. It was pointing to another one that cited this one as the source and added -even more- speculation on top of it.


Here you go https://nakedsecurity.sophos.com/2018/11/21/hacker-erases-65...

This adds speculation and sensationalism on top of an ill-informed article. They even described it as a 0-day vulnerability, when that is obviously not the case, as explained by the article itself. I can't simply trust the quality of the information in there.


Man, why do people insist on using others to host their Tor hidden services? It seems like the last thing you'd want or need to do. It's super simple to set up a hidden service from your home computer and host it yourself. I've been putting all my clear-web sites on Tor as well for years. Lots of bot traffic, but never any problems, and plenty of real traffic too.


The same reason people host their emails, websites, photos somewhere else or host their infrastructure in the cloud instead of colocating. It’s not hard but just another thing you have to care for when your real focus should be on what you provide not how you provide it.


If secrecy/anonymity is important enough to put something on the dark web, using the same assumptions as you might on the public internet is silly.


People are silly. We look for silver bullets all the time. Political figureheads who look and sound good while lacking any depth, moustache-twirling villains to blame the bad times on, silly rituals which start because they accidentally coincided with good times, investment bubbles, and hubris followed by victim blaming.

There’s nothing which would cause an otherwise competent business leader to even realise they don’t understand the limits of any given security system, never mind knowing who to ask for advice.


I mean you should have some focus on how you provide your service. By all means, take advantage of services that make your life easier, but you should at least know how they work on a high level should they ever go down and you need to take more ownership of what you provide.


If you have something that you want to be hidden you most likely don't want it to be found in your apartment.


Maybe because they lack the expertise to do it right?

Especially, they arguably lack the expertise to do remote hosting right. Also, as spurgu says, because they don't want stuff hosted locally. And if they thought it through, they wouldn't want the traffic back to their location.

Given all that, they arguably figure that these shared-hosting sites must know what they're doing.


How do you anonymously host a service from your home computer? I'm new to all this.

I also checked out your site. I hope to be at your level of expertise one day.


Onion (née hidden) services are easily hosted from any computer, even behind NAT or a restrictive ISP, due to how they publish their descriptors and rely on reachable relays for rendezvous. It's easy with the Tor executable and a torrc file; there are plenty of guides out there.

You may be new to all this, but for others wanting to do this programmatically, there is Stem for Python, and I've written a library for Go [0] (see the Stem sketch below). It's such an easy self-hosting NAT traversal technique that I'm surprised it's not used more often in situations not requiring great bandwidth/latency (e.g. p2p chat).

0 - https://github.com/cretz/bine
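
A minimal sketch with Stem, assuming a local tor daemon with "ControlPort 9051" and cookie authentication enabled in its torrc:

  from stem.control import Controller

  with Controller.from_port(port=9051) as controller:
      controller.authenticate()
      # Expose a local web server on 127.0.0.1:8080 as port 80
      # of a freshly generated onion address.
      response = controller.create_ephemeral_hidden_service(
          {80: 8080}, await_publication=True)
      print('Available at %s.onion' % response.service_id)
      input('Press enter to tear the service down...')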


Inexpertly hosting onion services on your own premises, particularly if you're doing so out of a real fear of getting found out, is not recommended.

Talk to the Dread Pirate Roberts next time he's in the neighbourhood.


OnionShare is a good choice.[0]

However, Tor is vulnerable to traffic analysis. And when you're running a server, adversaries can easily modulate/fingerprint the traffic, which facilitates traffic analysis. If you can see the signal and have taps on major ASes, you can drill down to the server.

0) https://onionshare.org/


A way I would pinpoint a hidden server would be to monitor different data center regions / networks for outages or congestion, while at the same time looking at the hidden server's ping reply. The response times alone can reveal a lot of information. Once I have the data center, I'm sure the data center admins can see which servers are using Tor, and help with pinpointing it further. No need for taps.


Yeah, that too :(

One can route Tor traffic for .onion servers through VPNs, or even through nested VPN chains. That makes it a little harder, because the hosting provider can't easily tell that it's Tor traffic. Also, one can run a private obfsproxy, which isn't listed or indexed by Tor.


"Tor Hidden Services" are services hosted on the tor network and inaccessible outside it.

Hosts are assigned a dns name <id>.onion so clients can connect to that service.


OK, I'm being picky, I know. But the .onion hostname has no relation to DNS. It's just the hash of the site's private key, truncated to 16 characters.


To be more picky, it's the SHA-1 hash of the public key, truncated to 10 bytes, then base 32'd which makes 16 chars. That's only for v2, v3 is a bit different.
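
In Python, roughly (a sketch; assumes you already have the service's DER-encoded RSA public key):

  import base64, hashlib

  def onion_v2_address(pubkey_der):
      digest = hashlib.sha1(pubkey_der).digest()  # SHA-1 of the public key
      truncated = digest[:10]                     # first 10 bytes (80 bits)
      # 80 bits base32-encode to exactly 16 chars, no padding
      return base64.b32encode(truncated).decode().lower() + ".onion"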


Damn, thanks :)

I'd forgotten that :(

And yes, v3 is a huge space. Many orders of magnitude larger than even the full IPv6 space, which is itself humongous.

Edit: Putting numbers on it. A v2 address is 16 base32 chars (80 bits), and a v3 address encodes a full 256-bit ed25519 key. It's like this, I think:

onion v2: ~1.21×10^24 [32^16 = 2^80]

IPv6 /48 host part: ~1.21×10^24 [2^80] (which is why OnionCat can map v2 addresses into its /48 prefix)

onion v3: ~1.16×10^77 [2^256]

all IPv6: ~3.40×10^38 [2^128]


For anyone curious, v3 is the 32-byte ed25519 pub key, then the first two bytes of the SHA3-256 hash of the key (w/ prepended fixed string and one byte appended), then one more const byte, and then all of those 35 bytes are base 32'd to make 56 chars. Some Go code to illustrate: https://github.com/cretz/bine/blob/f33268f0843a1b2b131a4cacf.... One benefit being that the entire pub key is right there in the hostname.
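
A rough Python equivalent (a sketch; the constants follow the v3 rend-spec checksum construction described above):

  import base64, hashlib

  VERSION = b"\x03"  # v3 onion address version byte

  def onion_v3_address(pubkey):
      assert len(pubkey) == 32  # ed25519 public key
      checksum = hashlib.sha3_256(
          b".onion checksum" + pubkey + VERSION).digest()[:2]
      raw = pubkey + checksum + VERSION  # 32 + 2 + 1 = 35 bytes
      return base64.b32encode(raw).decode().lower() + ".onion"  # 56 chars + suffix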


Absolutely fine to be!

I actually gave it a quick thought, since I was curious how the hostnames were assigned, but posted right before bed.


Hey :)

Many would love having DNS for .onion addresses. And there's been much talk of a .onion domain.


Is it not fairly trivial to link that to you? How can you make it anonymous?


In normal Tor usage, the client sends a request through three chosen hops, each of which only knows the previous and next hops, so the entry node doesn't know the destination and the exit node doesn't know where the request originated. But this only hides the client, because the client needs to know the server's address to direct the exit node where to send the request. So to hide the server, there is a symmetrical setup with three extra hops on the server's side, and a published "rendezvous" address in the middle. So the server connects to the rendezvous without revealing its real IP, and the client can direct requests to the rendezvous without knowing the server's IP.


^ is the most informative comment in the whole thread.

So by 'hosting' they mean being the rendezvous address?


If I'm reading this right, it actually hosted the websites but its IP address was hidden. I don't know if each site maybe got its own IP on the host, because after all I'd think it would be trivial for a customer to upload some code to unmask the host's actual IP. https://web.archive.org/web/20170830191551/https://hosting.d...


If the .onion host is competent, they use iptables to restrict output to the Tor process.
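
For example, assuming tor runs as the debian-tor user (Debian's default), something like:

  # allow loopback, since the onion service proxies to a local web server
  iptables -A OUTPUT -o lo -j ACCEPT
  # allow only the Tor daemon's own user to send packets out
  iptables -A OUTPUT -m owner --uid-owner debian-tor -j ACCEPT
  # drop everything else, so a compromised site can't phone home directly
  iptables -P OUTPUT DROP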


No, by "hosting" they mean shared hosting, with multiple .onion sites on one physical server.

Each of those .onion sites would have its own Tor entry guard relays, and would negotiate its own rendezvous points. An .onion service, just like a Tor user, selects a few entry guards that it uses consistently, and gradually replaces them with new ones over some weeks. But rendezvous points get picked fresh for each client-server connection.


Only if you're omniscient in the network. No single node knows what traffic it's carrying, or that the next node the traffic is meant to be sent to is the final one.


"Web services" are hosted on the internet and inaccessible outside it.

"Chat lines" are hosted on telephone networks and inaccessible outside it.


If you're hosting illegal content, traffic analysis could be used to catch you.


Only illegal content? Of course, the legality of the content can vary quite a lot; a Falun Gong [1] site would be 100% legal in Canada and 100% illegal in the PRC.

1: https://en.wikipedia.org/wiki/Falun_Gong#Persecution


Isn't it extremely obvious that getting someone else to hold the bag for you reduces your own risk?


Security has many different parts and objectives, and it depends on the exact use case. If someone is hosting information on a public website, they likely do not care about confidentiality, and they may already have a setup for backups. In that case, all they might want from a hosting service is uptime, and by using Tor they get it.

Naturally there are still risks with using a hosting service even when all information is intended to be public. The owner might change the content (integrity of the data), and it might be removed (denial of service), which is a trade-off for the convenience of not having to host it yourself and the uptime of 24/7 servers.


It’s not super simple to safely and securely host a hidden service. Not at all.


It's about as simple to host a website on Tor as it is on the clear web. As a benefit, you actually own your domain name instead of leasing it at the whim of some third party subject to other countries' laws and political/social pressures.

Plus it has built-in anti-DoS properties as a side effect, so there's no need to centralize through companies like Cloudflare.

There's huge benefits to hosting on tor even if it's just a regular website. None of my websites hosted on tor are illegal in my country.


Why not? Just keep a backup like any sane individual.


Yes, site owners should have maintained backups.

But even with backups, the .onion private key has been compromised, so you can't come back with the same .onion address.


If I were going to attempt such an .onion hosting setup, I'd use a couple of levels of isolation between users. Maybe set up several KVM domains to help limit damage from a compromise, and within each domain, put each website in a Docker container. Given a custom Docker-optimized kernel for the host, and XFS storage, I gather that it's possible to set hard limits on CPU, RAM and storage for containers.

Given that Docker containers rely on kernel namespaces and cgroups for isolation, they're not as secure as using full VMs. But they're far lighter, and much better isolated than these late .onions. Alternatively, one could maybe use FreeBSD jails with Docker.
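
Something like this hypothetical docker run invocation could enforce those per-site caps (--storage-opt size= requires the overlay2 driver on XFS mounted with pquota, matching the storage setup above; the image and network names are placeholders):

  docker run -d --name onion-site-1 \
    --cpus 0.5 \
    --memory 256m \
    --storage-opt size=2G \
    --network tor-internal \
    onion-site-image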

And about backups. I get his argument for not backing up. But maybe a setup with relatively fast rotation, and thorough deletion of old backups, would be secure enough. I'd use LUKS with dropbear for server FDE. That's still vulnerable, sure, but attackers would need to take some care while impounding the server. Also, I'd keep backups on another server, with only .onion-.onion connections (maybe OnionCat).


Containers are not at all a security thing. If you really need that caliber of separation, you are looking at VMs at the very least.


Really "not at all"? Isn't that more secure than chroot?

I do agree with you that VMs, at least, would be more secure. But hosting thousands of VMs takes substantial resources. And when there's free shared hosting available, few would understand enough to pay the premium.


What's stopping you from writing this then?


Writing? Do you mean implementing? Basically, the prospect of managing thousands of accounts doesn't appeal to me.


> Winzen said his priority was to do a full analysis of the log files

I wouldn't have thought a darkweb hosting service should have any logs?


It's a server just like any other server; but instead of serving content through Apache or nginx, you are more likely to serve it through thttpd or Savant.


Why do you say that? Genuinely curious.

Though I'm not familiar with thttpd or Savant, after briefly looking them over they appear to be HTTP servers just like Apache or nginx.

What would make them more appealing for a dark-web host? They don't seem to be particularly "dark-web-centric" from what I could read at face value, though most times dark-web stuff has tons of other info that's not found at face value...


"Though I'm not familiar with thttpd or savant, after briefly looking them over they appear to be http servers just like apache or nginx."

Not your parent, but it wouldn't surprise me to learn that "dark web sites" are using thttpd ... it's a very simple, lightweight, dependable web server. The major downside - the lack of SSL - is perhaps not an issue as you are running over an encrypted channel anyway.

If I just needed to throw something up - perhaps on a remote or throwaway host - thttpd would certainly be my first choice.


Also, thttpd[0] is fast, doesn't fork, and is resistant to DoS attacks. The downside is that it's no longer in many repositories, and it can be a pain to compile.

0) https://acme.com/software/thttpd/


It wouldn't surprise me if this was limited to system logs and simple request logs. All the source IPs coming through the Tor gateway are going to be from localhost. By default, Apache and nginx logs also include the user agent, but diligent dark-web operators disable that.
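
For instance, a minimal nginx sketch (paths hypothetical) that logs requests without client identifiers:

  # log only time, request line, status, and response size; no IP or user agent
  log_format anon '[$time_local] "$request" $status $body_bytes_sent';
  access_log /var/log/nginx/access.log anon;
  # or disable request logging entirely:
  # access_log off;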

System logins, auditd, supporting services logs, etc all may provide clues as to what happened.


I don't get Daniel's Hosting... Why did he offer free hosting? What was the business model? He said you are not allowed to host content that is illegal under German law; I wonder how he managed to maintain that state with 6.5k sites up. In the end, it would be quite easy to raid him over some CP. Quite a risk for what gain?


Because why not? Not everything is a business, and if you believe in Tor as a right to privacy in the digital age and wish to make it available to everyone, why not?

Censorship comes in many forms, and technological literacy in the form of running a LAMP stack and knowing how to configure Apache is slim in the world; PHP/MySQL, forget about it.

6.5K sites is not a lot of data, especially for interactive and probably non-rich dynamic pages. Consider it a GeoCities of the '90s.

What probably happened was a jail escape that enabled a live shell, and that person killed it. From who knows whom.

As for running a Tor site, the most suspicious thing about it is being a Tor exit node.


Not everything is a business, but many other hobbies have a far lower probability of putting you in jail.


"Backups? Forget it. This is the Dark Web. Winzen told ZDNet that there ain’t no such thing as backups on Daniel’s Hosting, by design:"

I'm wondering why encrypted backups were not an option for them?


For the same reason the NSA siphons up all data from the net, encrypted or not, and stores it: they know one day they'll be able to crack today's crypto. Being able to decrypt a backup 5 or 10 years from now will still provide a lot of useful data, whether it came from the darknet or someone's iCloud backup while it was in transit to Apple's servers. Encryption really only protects you from "today", not from what will happen in the future.

It's one reason I kind of chuckle to myself when people put such high regard into companies that proclaim to be more conscientious about protecting user data and what they do with it. It doesn't really matter if the data is still being collected and stored for later decryption. It's basically just a "delay" of when it can be read. The only thing stopping them, currently, is not having enough qubits in a quantum computer. Things are going to get strange. https://www.youtube.com/watch?v=vNV_3PkA9WM


Quantum cryptanalysis is no help with symmetric ciphers. All the data encrypted "at rest" via symmetric ciphers, which is what companies are usually bragging about, isn't affected.

Now, I'm sure there's a ton of data that the NSA will be able to break with quantum computing, because people often do transmit data by encrypting it with an asymmetric cipher, e.g. PGP. But... there's already work on quantum-proof asymmetric ciphers.

And TLS uses RSA, so maybe they can crack all the HTTPS traffic? Well, assuming today's algorithms are configured correctly, they employ perfect forward secrecy. That means the key exchange never transmits or stores the encryption key; rather, an ephemeral encryption key is derived independently by both sides. Of course, even with PFS they'd get all the metadata and certs if they cracked RSA.


Quantum cryptanalysis is no help. It doesn't exist, probably never will, and people should stop pretending it is inevitable just to crank up their citation counts.


Just XOR bit by bit against real random noise, embed it in a massive stream of other pure randomness, and remember the offset, if it's really that important...


Don't roll your own crypto. If that was much better encryption, everyone would be using it.


I think we all agree rolling your own crypto is dangerous, but what siliconunit described is just a one-time pad. Assuming your key data is truly random and unknown to your attacker, isn't this kind of the gold standard for uncrackable ciphered communication?
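
A minimal sketch of the idea in Python (the pad must be truly random, as long as the message, kept secret, and never reused):

  import secrets

  def otp_encrypt(plaintext):
      pad = secrets.token_bytes(len(plaintext))  # one-time pad, same length as message
      ciphertext = bytes(p ^ k for p, k in zip(plaintext, pad))
      return ciphertext, pad

  def otp_decrypt(ciphertext, pad):
      return bytes(c ^ k for c, k in zip(ciphertext, pad))  # XOR is its own inverse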


If interpreted that way, storing the noise and memorizing the offset, it amounts to having a privately stored one-time pad to be used as key... with a passphrase with as many bits in it as the offset. That’s probably not a lot of bits. You are better off storing the data, encrypted with a real pass phrase, wherever you would have stored the random noise stream that is needed to read the data in any case. Don’t roll your own crypto.


Using a one time pad isn't "rolling your own crypto".

It is also provably secure unlike every other cryptosystem.


Sure, except now you have an equal amount of data that needs to be stored somewhere. The pad is as long as the data. Either you have to encrypt that using some other means, or you do the weak "store offset" method.


In theory, but getting that stream of "real random" and, above all, distributing that one-time pad securely to the recipient (chicken, meet egg) so they can read the message makes this highly impractical.


You don't really need the noise stream, just an offline method for exchanging the pad :D Or use an unexpected channel, like radio, to send it.


I agree with not rolling your own crypto for your primary mechanism, but layering it (in the correct order!) is guaranteed to be at least as secure as your strongest layer.

That's what Google did with CECPQ1: they combined New Hope, a quantum-resistant algorithm, with traditional methods (X25519). That way, if New Hope is cracked, they are still using an industry standard you'd have to crack as well.


The only provably secure crypto is the one-time pad; the rest is all junk relying on computation time to stay "secure". And this one-time pad you can never send over a wire to anyone, because of the chicken-and-egg problem, so it is something you would need to exchange offline for decryption to be possible.

If you are not doing that, best stick with known-good encryption schemes.

That being said, you can implement a lot of encoding schemes, like you say, in ways that make it arbitrarily hard for people to decide what is junk and what is data, making it nearly impossible to crack, especially if you do that XOR with random values etc. If someone receives a data intercept, they'll have a hard time rebuilding the data from the junk and then decoding it.


The dark web is designed to be anonymous. Long term persistence of anything that might hold details about the users of the sites are taboo in the extreme.

Since this is a hosting provider for the dark web (and one that advertises that it doesn't do backups, logs, etc.), they don't know what the content of the sites or databases involves. Keeping backups of that content is an additional liability for their business.

Site owners for all hosting providers, including when you're running on clouds like AWS, should put thought into their backup strategies independently of guarantees from their provider.

For providers like AWS you will likely use their storage options for backups such as S3, but you are taking a risk especially if you're storing in the same region. Everything in this industry is a series of trade offs, you need to be aware of what trade offs you're making at any given time.


>Everything in this industry is a series of trade offs, you need to be aware of what trade offs you're making at any given time.

>loudest 'Amen' in recorded history emanates from my cubicle


If the user isn't the sole owner of the key, encryption is pretty much worthless for such services. At this point users can do the backups themselves.



Sanitize your inputs...


It wasn't just inputs; sanitizing isn't enough with weak types. Using == instead of ===, they could brute-force guess a session, and it took multiple exploits chained together to escalate until they got shell access.


Background info on the exploit used at: https://antichat.com/threads/463395/#post-4254681



The one the owner thinks was not used, as he only saw database access, not shell access, but which ZDNet mentioned as becoming well known around the time the database was emptied.


(nb: page is in Russian)



>a PHP zero-day vulnerability. Details about this unpatched vulnerability were known for about a month

I find this to be a very upsetting attempt at technical clickbaiting. Feels like a journalist trying to appeal to semitechnical readers with hackerman slang.

If it was known for a month, it's not a 0-day.


The article does say that it became widely known only the day before the attack.

Day zero starts counting from the time when either the developers/maintainers of the system or the general public are informed about the vulnerability. If the vulnerability has been discussed in some private forums, or exploited by the NSA for many years, it doesn't matter; it's still day zero until it goes public (or is privately disclosed to, in this case, the maintainers of PHP). And as the attack came something like ~24 hours after disclosure, it's close enough to call it a zero-day attack.


Surely all 0days are known about by someone for some period of time before they actually get used? Or do you think people discover these exploits and then use them on the same day?

My understanding was that 0day just meant that it was unpatched, not "publicly" known and being used for the first time "in the wild".


Huh, doesn't making it publicly known before patching it in PHP constitute a 0day? Or by unpatched did they mean patched upstream but not here?


This is one that people have been getting wrong for decades.

"A zero-day (also known as 0-day) vulnerability is a computer-software vulnerability that is unknown to those who would be interested in mitigating the vulnerability (including the vendor of the target software)" Patching and knowing about the vulnerability are different things.

And so, this is not a 0-day.


No idea why you are getting downvoted, since this is a perfectly acceptable explanation of a 0-day.


> there are no backups.

Interesting. If I were into this kind of activity I would love to have backups designed, implemented and maintained the "dark" way. It seems to me that by just running the services you are not 100% into this business. You'd have to come up with the whole "dark ops" approach.


What would you define as the "dark" way?

The "dark" way is to assure that any and all traffic and interactions are anonymous as possible. Keeping a record of communications/behaviors is the direct antithesis of this.

The "dark" way is to never backup


I always thought that communications/behaviors are not a problem as long as the parties remain anonymous or unidentifiable.


I guess that all of the private keys for the onion services will be gone too. Even if they aren't, they can't be used anymore anyway as they should be assumed to be compromised.

I hashed for a month to get a vanity one; others have gone longer. Did this hosting service allow custom private key uploads?


Also, if private keys were obtained, it's trivial to republish to the same onion address with any changes you want. This is that one scenario where an EV cert that can be revoked has value if you're a non-anonymous hoster. There are other alternative-factor identity verification techniques (e.g. DNS, Alt-Svc HTTP header) but the querying aspect reduces anonymity.
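
For illustration, such a response header would look something like this (the onion name here is a placeholder):

  Alt-Svc: h2="examplev3addressxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx.onion:443"; ma=86400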


With the DNS and Alt-Svc approaches, are you saying that the onion site's identity can be verified by visiting the non-onion version of the site, and then relying on the header/DNS record to take the session onto Tor?

Then when the onion service private key is breached, the site operator just changes the header and DNS record to a new, non-breached one?

As you say though, querying these does reduce anonymity.


Yup (well, automatically with Alt-Svc when using the Tor browser, some kind of manual TXT record I would guess with DNS, I don't know of anything standardized)


Is it wise to have a vanity private key in this circumstance?


I presume OP didn't mean he hashed until the key was his social security number and name.


No, but I assume they chose something that was meaningful to them. You really don't want to turn something that should be anonymous into something that could be traced back to you.


Yes, the first characters are just my name. The hidden service is for my public blog, so it's deliberately non-anonymous.


It will be interesting to see a followup in a few months, in particular, to see how many sites come back online.

My expectation is that a good number will look to an alternate service, though the ones who are willing to try again would be more likely to have recovery plans.


Would it be possible or feasible to compile a site into a single binary? Within the binary there would be, say, a 1 GB encrypted filesystem; an embedded SQLite could write data to it, and an embedded HTTP process would be able to serve and store static content.

On startup you'd enter a password and decrypt the baked-in filesystem.

I guess this would still be possible to break into while the app is running, since the decryption key would be held in RAM, but as soon as the server goes offline the data would be inaccessible.

Then you host this, say, in Russia or China, and you should be safe from US authorities.


"30% of the dark web was erased." More like "what was loosely estimated to be 30%"


And 30% of accessible addresses, not 30% of traffic.


You can't ever get an anonymous layer 1. To the point of other commenters: percentage figures are unreliable on Tor.


This isn’t true. Wireless mesh networks could make for an anonymous medium.


Well, hopefully the dark-web version of Google kept a nice cache of what was important.


I'm sure the NSA and China have a backup someone could use.


What's the dark-web version of Google?


s/hidden service/onion service

Pedantic, but it is the preferred nomenclature today.


So, lots of pages, on one hosting site.


I don't think so. More like one hosting platform, with many (.onion) domains hosted within it.


Give it about...a month and there will be 13,000 onion sites to take its place.


What do you mean? Can you elaborate?


This looks like a good old "kill the problem and frame him later" type of takedown. The owners (or some users) angered the wrong people, and now they are even painted as the biggest pedos in the whole wide world...


They probably were the biggest pedos in the world. That's the only sane reason to host a platform like this, if you have any sense of self-preservation.

Moreover, its predecessors, Freedom Hosting and Freedom Hosting II, were both the biggest purveyors of CP when they were taken down. If it walks like a duck...


Yeah, and the biggest users of cryptography are criminals. Only the guilty lock their car doors, right?


You and the other commenter are missing the point. I'm not stating either of those.

The motivation for hosting something like this is to hide what you're hosting in a bunch of noise. Legal or illegal. Your goal is to make it hard to gather metadata about the service. There are perfectly legitimate reasons to want that...

That said, it's a foregone conclusion that if you host a service like this, people who want to distribute child pornography are going to flock to it. If you don't police the content, that's even more true.

In fact, in the case of Freedom Hosting II, the host was allowing the pedos to circumvent the normal limitations of the platform, in theory because they were paying cold hard cash, as well as allowing scammers to operate on the platform, also for cold hard cash. It's been suggested that as much as 50% of the content hosted on Freedom Hosting II was child porn.

If you knowingly allow people to use your service to distribute child pornography or commit other crimes, you're every bit as culpable as your customers. This type of platform costs not-insignificant money to run -- it's a business. Nobody is hosting this many services out of a sense of altruism.

And honestly, in the case of CP, if you're hosting it, including running a service that facilitates the hosting of it, you deserve to serve consecutive life sentences in prison and whatever harm befalls those in prison who get found out as pedos.


They were not hosting any CP; that has not been proven. As it stands, this is a simple service takedown. Actually, the CP angle is speculation added by the sensationalist https://nakedsecurity.sophos.com website; the original source doesn't even mention it.

I mean, they even believe this was a zero-day attack; I can't simply trust the quality of any information there.

Edit: they updated the link, here's the old link - https://nakedsecurity.sophos.com/2018/11/21/hacker-erases-65...


Again, if it walks like a duck.

Freedom Hosting was the biggest service and it served CP. Freedom Hosting II was the biggest service and it served CP. Daniel's Hosting was the biggest service -- what do you think it served?

CP is the reason Operation DarkNet et al target these platforms for destruction.


Yeah, and if his skin is dark and he has a hoodie, he's probably a criminal. This "if I think they are guilty it means they are" will never work.

That's the problem with prejudices, the people who have them can never see beyond them.


Sorry, no matter how much you want to defend such behavior, having preconceptions about people who host platforms that are welcoming to pedophiles isn't the same thing as having preconceptions about people based on their race. The former is a prejudice against someone based on their chosen behavior and the latter based on inherent qualities they are born with.

Also, I wasn't aware that "people who engage in or support child exploitation" were a protected class.

You're still trying to put words in my mouth, but by all means keep defending the despicable.


They did not welcome, protect, support nor engage in child pornography.


Hosting that very sort of platform welcomes child pornography, as demonstrated by every such platform or service that's come before it, and I'm not talking just about the history of Tor hosting here. We used to see the same thing with free hosting, chatrooms, and shell accounts in the '90s.

It's already a matter of law (in most western countries) that hosts have some amount of responsibility for what users of their platforms distribute, and that's why other hosting services aren't structured this way.


[flagged]


The fact that you have to make a throwaway account to voice such an opinion makes it clear that you know that it is indefensible.

What are your real motives?

See, I don't have to make a throwaway account, and I'll tell you right here that in my youth I had two different pedos in my life attempting to groom me into being victimized. I guess the nature of predation made me a target because of my vulnerability at the time. Luckily I got through that part of my life unhurt, but when I think about it -- just barely.

All those people whose pictures are being looked at did _not_ get away unhurt and those that consume such material deserve to rot.


"Passive consumption" enables the industry that creates this content to exist in the first place. It is not victimless. This is not a difficult concept to understand.

That said, I agree with your point that these people need help and support and that blindly locking them away for looking at pictures on the internet is generally not a good use of resources.


Would you say there's a further point of distinction between someone who just looks and someone who shares it with others?

I'm specifically talking about the latter case, even though I personally see no such distinction.

I will concede your point that rehabilitation is a possibility, but a crime was committed and that help and support can begin in prison.


> that help and support can begin in prison

Could begin in prison. In most countries' prisons, including in the US and most of Europe, it can't.


> Passive consumption is not something I would blindly use as reason to wish someone harm.

Consumption tells the filthy creatures making the material that it is in demand, and they make more. How the hell are you defending pedophilia?!


Only the guilty have something to hide? Really?


Sophos blog spam for the actual ZDNet report from over the weekend: https://www.zdnet.com/article/popular-dark-web-hosting-provi...

Maybe a mod can replace the link with the actual source.



  PHP zero-day vulnerability.
Better off offline, honestly.


Why?


My bad. I signed up for my Free Dark Web Scan, and said my name was Philip'); DROP TABLE customers;--


I've done that to SMS spammers. I don't know if it worked, but I tried ten or so variations and stopped receiving those messages. I was getting 10 a day at the time. I doubt I actually did any damage, but I really hope I did.


Haha! That reminds me of when I once edited my web browser's user agent to something similar, but then I forgot about it. Later, when I was browsing, a lot of the websites just blank screened or returned 500 errors or displayed an SQL error. The other browser worked fine. I finally realized it was my user agent! So yes, a little SQL injection can come up in many unexpected situations as you may have observed :-)

Another time, I set my user agent to an XSS string, just because I was testing something, but forgot to set it back. A week later, I noticed that my colleague was baffled by a popup coming up each time he browsed the logs. Oops, that was my XSS user agent... I'm guessing a lot of places don't sanitize logs.

You'd be amazed how many places these things work :-)



