
Popular dark-web hosting provider hacked, 6,500 sites down - wglb
https://www.zdnet.com/article/popular-dark-web-hosting-provider-got-hacked-6500-sites-down/
======
saurik
I am confused: is this seriously saying that over 30% of "hidden" services
were being hosted on the server of one guy named Daniel?... that in a world
where the entire point is that you don't know where anything is hosted and you
are using tons of indirection through Tor to ensure there is no obvious place
to hook all of the traffic or even see packets for timing attacks, it turns
out there was a _one in three_ chance that the traffic was being hosted by
this one guy named Daniel?

~~~
syn0byte
No. That would be like someone taking GeoCities down in 1998 and claiming you
took down 30% of "The Internet". You would have gotten a bunch of stuff, but
obviously not 30%.

At _best_ they took down 30% of hidden web services with published addresses
at aggregator sites like Hidden Wiki.

It's not that you don't know where it's _hosted_, it's that you don't know who
is using it or where they are. That includes publishers with hosted content,
even from the host itself. 5Eyes couldn't just drop a tap in front of Daniel's
Host and see anything useful, just intermediate nodes with no idea what was on
the other end. (Barring a >51% attack where they own the first and last nodes
in the circuit.)

~~~
saurik
Your second paragraph is not, in my understanding, the goal of a hidden
service: that is merely the goal of Tor itself and would apply to a non-hidden
service being accessed via Tor. The goal of the hidden services feature is to
allow hosts to have the same level of anonymity as users, making it nearly
impossible to shut them down or know where to tap their traffic (for timing
attacks).

~~~
na412
Onion services can have various goals. Hiding the server is a very common one,
but it's not always the case.

For example, Facebook runs an onion service; they don't need to hide the
service itself. So they configure their Tor instance with no anonymity on the
service side (HiddenServiceSingleHopMode 1) and get better performance.

Such non-anonymous onion services can have many goals, for example:

* Reducing load on Tor exit nodes
* Providing users a secure, authenticated connection without depending on the
CA system (assuming you got the URL through a secure channel the first time,
you know only the key holder can provide service on that host)
* NAT traversal for services that otherwise have no need for anonymity
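
The single-hop setup described here is only a few lines of Tor configuration.
A sketch of a torrc for a non-anonymous onion service (option names are from
stock Tor; the directory path and port are illustrative):

```
SocksPort 0
HiddenServiceNonAnonymousMode 1
HiddenServiceSingleHopMode 1
HiddenServiceDir /var/lib/tor/my_service
HiddenServicePort 80 127.0.0.1:8080
```

Tor refuses to start in this mode unless client ports such as SocksPort are
disabled, since the daemon can no longer provide client-side anonymity.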

------
superkuh
Man, why do people insist on using others to host their tor hidden services?
It seems like the last thing you'd want or need to do. It's super simple to
set up a hidden service from your home computer and host. I've been putting
all my clear web sites on tor as well for years. Lots of bot traffic but never
any problems and plenty of real traffic too.

~~~
dewey
The same reason people host their emails, websites, and photos somewhere else,
or host their infrastructure in the cloud instead of colocating. It’s not hard,
but it's just another thing you have to care for when your real focus should be
on what you provide, not how you provide it.

~~~
taurath
If secrecy/anonymity is important enough to put something on the dark web,
using the same assumptions as you might on the public internet is silly.

~~~
ben_w
People _are_ silly. We look for silver bullets all the time. Political
figureheads who look and sound good while lacking any depth, moustache-
twirling villains to blame the bad times on, silly rituals which start because
they accidentally coincided with good times, investment bubbles, and hubris
followed by victim blaming.

There’s nothing which would cause an otherwise competent business leader to
even realise they don’t understand the limits of any given security system,
never mind knowing who to ask for advice.

------
mirimir
If I were going to attempt such an .onion hosting setup, I'd use a couple of
levels of isolation between users. Maybe set up several KVM domains, to help
limit damage from a compromise. And within each domain, put each website in a
Docker container. Given a custom Docker-optimized kernel for the host and XFS
storage, I gather that it's possible to set hard limits on CPU, RAM and
storage for containers.

Given that Docker containers rely on kernel namespaces and cgroups for
isolation, they're not as secure as using full VMs. But they're _far_ lighter,
and much better isolated than these late .onions. Alternatively, one could
maybe use FreeBSD jails with Docker.

And about backups. I get his argument for not backing up. But maybe a setup
with relatively fast rotation, and thorough deletion of old backups, would be
secure enough. I'd use LUKS with dropbear for server FDE. That's still
vulnerable, sure, but attackers would need to take some care while impounding
the server. Also, I'd keep backups on another server, with only .onion-.onion
connections (maybe OnionCat).

~~~
krageon
Containers are not at all a security thing. If you really need that caliber of
separation, you are looking at VMs at the very least.

~~~
mirimir
Really "not at all"? Isn't that more secure than chroot?

I do agree with you that VMs, at least, would be more secure. But hosting
thousands of VMs takes substantial resources. And when there's free shared
hosting available, few would understand enough to pay the premium.

------
GordonS
> Winzen said his priority was to do a full analysis of the log files

I wouldn't have thought a darkweb hosting service should have any logs?

~~~
httpsterio
It's a server just like any other server; instead of serving content through
Apache or nginx, you are more likely to serve it through thttpd or Savant.

~~~
greyfox
Why do you say that? Genuinely curious.

Though I'm not familiar with thttpd or savant, after briefly looking them over
they appear to be http servers just like apache or nginx.

What would make them more appealing for a dark web host? They don't seem to be
particularly "dark-web-centric" from what I could read at face value, though
most times dark web stuff has tons of other info that's not found 'at face
value'...

~~~
rsync
"Though I'm not familiar with thttpd or savant, after briefly looking them
over they appear to be http servers just like apache or nginx."

Not your parent, but it wouldn't surprise me to learn that "dark web sites"
are using thttpd ... it's a very simple, lightweight, dependable web server.
The major downside - the lack of SSL - is perhaps not an issue as you are
running over an encrypted channel anyway.

If I just needed to throw something up - perhaps on a remote or throwaway host
- thttpd would certainly be my first choice.

~~~
mirimir
Also, thttpd[0] is fast, doesn't fork, and is resistant to DoS attacks. The
downside is that it's no longer in many repositories, and it can be a pain to
compile.

0) [https://acme.com/software/thttpd/](https://acme.com/software/thttpd/)

------
adminu
I don't get Daniel's Hosting... Why did he offer free hosting? What was the
business model? He said you are not allowed to host illegal content according
to German law; I wonder how he managed to maintain that state with 6.5k pages
up. In the end it would be quite easy to raid him for some CP. Quite a risk
for what gain?

~~~
rootsudo
Because why not? Not everything is a business, and if you believe Tor is a
right to privacy in the digital age and you wish to make it available to
everyone, why not?

Censorship comes in many forms, and technological literacy in the form of
running a LAMP stack and knowing how to modify Apache is slim in the world;
PHP/MySQL, forget about it.

6.5K pages is not a lot of data, especially for interactive and probably non-
rich dynamic pages. Consider it a GeoCities of the '90s.

What probably happened was a jail escape that enabled a live shell, and that
person killed it. From who knows whom...

As for running a Tor site, the most suspicious thing about it is being a Tor
exit node.

~~~
adminu
Not everything is a business, but many other hobbies have a way lower
probability of putting you in jail.

------
john37386
"Backups? Forget it. This is the Dark Web. Winzen told ZDNet that there ain’t
no such thing as backups on Daniel’s Hosting, by design:"

I'm wondering why encrypted backup is not an option for them?

~~~
cronix
For the same reason the NSA siphons up all data from the net, encrypted or
not, and stores it. They know one day they'll be able to crack today's crypto.
Being able to decrypt a backup 5 or 10 years from now will still provide a lot
of useful data, whether it came from the darknet or someone's iCloud backup
while it was in transit to Apple's servers. Encryption really only protects you
from "today", not what will happen to it in the future. It's one reason I kind
of chuckle to myself when people put such high regard into companies that
proclaim to be more conscientious about protecting user data and what they do
with it. It doesn't really matter if the data is still being collected and
stored for later decryption. It's basically just a "delay" of when it can be
read. The only thing stopping them, currently, is not having enough qubits in a
quantum computer. Things are going to get strange.
[https://www.youtube.com/watch?v=vNV_3PkA9WM](https://www.youtube.com/watch?v=vNV_3PkA9WM)

~~~
siliconunit
Just XOR bit by bit against real random noise, embed it in a massive stream of
other pure random data, and remember the offset, if it's really that
important...

~~~
Eliezer
Don't roll your own crypto. If that was much better encryption, everyone would
be using it.

~~~
taylorfinley
I think we all agree rolling your own crypto is dangerous, but what
siliconunit described is just a one time pad. Assuming your key data is truly
random and unknown to your attacker, isn't this kind of the gold standard for
uncrackable cyphered communication?

~~~
Eliezer
If interpreted that way, storing the noise and memorizing the offset, it
amounts to having a privately stored one-time pad to be used as key... with a
passphrase with as many bits in it as the offset. That’s probably not a lot of
bits. You are better off storing the data, encrypted with a real pass phrase,
wherever you would have stored the random noise stream that is needed to read
the data in any case. Don’t roll your own crypto.

~~~
A2017U1
Using a one time pad isn't "rolling your own crypto".

It is also provably secure unlike every other cryptosystem.

~~~
function_seven
Sure, except now you have an equal amount of data that needs to be stored
somewhere. The pad is as long as the data. Either you have to encrypt _that_
using some other means, or you do the weak "store offset" method.
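
The trade-off in this subthread can be shown in a few lines of Python: the pad
must be at least as long as the message, and whoever holds the pad (or the
offset into a stored noise stream) recovers everything. A toy sketch, not
production crypto:

```python
import os

def otp_xor(data: bytes, pad: bytes) -> bytes:
    # One-time pad: XOR each byte with the corresponding pad byte.
    # The same function both encrypts and decrypts.
    assert len(pad) >= len(data), "pad must be at least as long as the data"
    return bytes(b ^ p for b, p in zip(data, pad))

message = b"backup blob"
pad = os.urandom(len(message))      # truly random, used exactly once
ciphertext = otp_xor(message, pad)

assert otp_xor(ciphertext, pad) == message  # pad holder recovers everything
assert len(pad) == len(message)             # the storage problem: pad size == data size
```

The second assertion is function_seven's point: you have merely traded the
problem of storing the data for the problem of storing an equally large pad.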

------
new_guy
[https://github.com/Bo0oM/PHP_imap_open_exploit/blob/master/e...](https://github.com/Bo0oM/PHP_imap_open_exploit/blob/master/exploit.php)

This seems to be the exploit they used?

~~~
busterarm
Sanitize your inputs...
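
The linked imap_open exploit works by smuggling an `-oProxyCommand` option
through the mailbox string into an underlying command invocation. A Python
sketch of the defensive idea: whitelist-validate untrusted values and pass them
as single argv entries, never into a shell string (the `ping_host` helper and
its regex are illustrative, not taken from the exploit writeup):

```python
import re
import subprocess

# Must start with an alphanumeric character, so a value like
# "-oProxyCommand=..." can never be mistaken for a command-line option.
HOSTNAME_RE = re.compile(r"^[A-Za-z0-9][A-Za-z0-9.-]{0,252}$")

def ping_host(user_supplied_host: str) -> str:
    # Whitelist-validate before the value goes anywhere near a command.
    if not HOSTNAME_RE.match(user_supplied_host):
        raise ValueError("invalid hostname")
    # Argument list with shell=False semantics: the host is a single argv
    # entry and cannot inject extra options or chained commands.
    result = subprocess.run(
        ["ping", "-c", "1", user_supplied_host],
        capture_output=True, text=True, timeout=5,
    )
    return result.stdout
```

The validation rejects the payload before any process is ever spawned, which
is the property the imap_open call was missing.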

~~~
segmondy
It wasn't just inputs, and sanitizing isn't enough: weak types too. Using ==
instead of ===, they could brute-force guess a session; it took multiple
exploits to escalate until they got shell access.
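
Python compares strictly, so to see the PHP `==` pitfall, here is a toy model
of PHP's pre-8 loose comparison, which coerces numeric-looking strings before
comparing. Two different strings of the "0e&lt;digits&gt;" magic-hash shape compare
equal because both parse as zero:

```python
def php_loose_eq(a: str, b: str) -> bool:
    # Toy model of PHP's pre-8 '==' applied to two numeric-looking
    # strings: both sides are coerced to numbers before comparing.
    def to_num(s):
        try:
            return float(s)
        except ValueError:
            return None
    na, nb = to_num(a), to_num(b)
    if na is not None and nb is not None:
        return na == nb
    return a == b

# Different strings, but both are "0 times 10^n", i.e. 0.0 after coercion:
assert php_loose_eq("0e1234", "0e5678")     # true under loose ==
assert not php_loose_eq("abc", "abd")       # non-numeric strings compare as text
```

If a session token or hash is checked with loose equality, an attacker only
has to find any value that coerces to the same number, which is a vastly
smaller search space than matching the exact string.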

------
whyleyc
Background info on the exploit used at:
[https://antichat.com/threads/463395/#post-4254681](https://antichat.com/threads/463395/#post-4254681)

~~~
pbhjpbhj
(nb: page is in Russian)

~~~
el_duderino
[https://translate.google.com/translate?hl=en&sl=ru&tl=en&u=h...](https://translate.google.com/translate?hl=en&sl=ru&tl=en&u=https%3A%2F%2Fantichat.com%2Fthreads%2F463395%2F%23post-4254681)

------
kryogen1c
>a PHP zero-day vulnerability. Details about this unpatched vulnerability were
known for about a month

I find this to be a very upsetting attempt at technical clickbaiting. Feels
like a journalist trying to appeal to semitechnical readers with hackerman
slang.

If it was known for a month, it's not an 0day.

~~~
kjeetgill
Huh, doesn't making it publicly known before patching it in PHP constitute a
0day? Or by unpatched did they mean patched upstream but not here?

~~~
busterarm
This is one that people have been getting wrong for decades.

"A zero-day (also known as 0-day) vulnerability is a computer-software
vulnerability that is unknown to those who would be interested in mitigating
the vulnerability (including the vendor of the target software)" Patching and
knowing about the vulnerability are different things.

And so, this is not 0day.

~~~
YouKnowBetter
No idea why you are getting downvoted, since this is a perfectly acceptable
explanation of a 0-day.

------
rodolphoarruda
> there are no backups.

Interesting. If I were into this kind of activity I would love to have backups
designed, implemented and maintained the "dark" way. It seems to me that by
just running the services you are not 100% into this business. You'd have to
come up with the whole "dark ops" approach.

~~~
S-E-P
What would you define as a "dark" way.

The "dark" way is to assure that any and all traffic and interactions are
anonymous as possible. Keeping a record of communications/behaviors is the
direct antithesis of this.

The "dark" way is to never backup

~~~
rodolphoarruda
I always thought that communications/behaviors are not a problem once parties
remain anonymous or unidentifiable.

------
jamieweb
I guess that all of the private keys for the onion services will be gone too.
Even if they aren't, they can't be used anymore anyway as they should be
assumed to be compromised.

I hashed for a month to get a vanity one, others have done longer. Did this
hosting service allow custom private key uploads?
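
A month of hashing for a vanity address is a plain brute-force search. A toy
Python sketch of the idea, modeled on the v2 scheme (base32 of the first 10
bytes of a SHA-1 over the public key), with random bytes standing in for real
key material; actual tools (e.g. eschalot for v2, mkp224o for v3) search real
keys:

```python
import base64
import hashlib
import os

def find_vanity(prefix: str, max_tries: int = 1_000_000):
    # Brute force: generate a random "key", derive the v2-style address
    # (base32 of the first 10 bytes of its SHA-1), and keep going until
    # the address starts with the wanted prefix.
    prefix = prefix.lower()
    for _ in range(max_tries):
        fake_key = os.urandom(32)  # stand-in for DER-encoded key material
        digest = hashlib.sha1(fake_key).digest()[:10]
        addr = base64.b32encode(digest).decode().lower()
        if addr.startswith(prefix):
            return addr + ".onion"
    return None

print(find_vanity("a"))
```

Each extra prefix character multiplies the expected work by 32 (the base32
alphabet size), which is why long vanity prefixes take weeks or months.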

~~~
kodablah
Also, if private keys were obtained, it's trivial to republish to the same
onion address with any changes you want. This is that one scenario where an EV
cert that can be revoked has value if you're a non-anonymous hoster. There are
other alternative-factor identity verification techniques (e.g. DNS, Alt-Svc
HTTP header) but the querying aspect reduces anonymity.

~~~
jamieweb
With the DNS and Alt-Svc, are you saying that the onion site identity can be
verified by visiting the non-onion version of the site, and then relying on
the header/DNS record to take the session onto Tor?

Then when the onion service private key is breached, the site operator just
changes the header and DNS record to a new, non-breached one?

As you say though, querying these does reduce anonymity.

~~~
kodablah
Yup (well, automatically with Alt-Svc when using the Tor Browser; with DNS, I
would guess some kind of manual TXT record, I don't know of anything
standardized)
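
The Alt-Svc approach amounts to one response header on the clearnet site,
advertising the onion service as an alternative origin. A minimal sketch (the
onion hostname is hypothetical):

```python
def alt_svc_header(onion_host: str, max_age: int = 86400) -> tuple:
    # Build the Alt-Svc response header a clearnet site would send to
    # advertise its onion mirror; a Tor-aware client can then move the
    # session onto Tor while keeping the original origin's identity.
    return ("Alt-Svc", f'h2="{onion_host}:443"; ma={max_age}')

name, value = alt_svc_header("examplevanityaddr.onion")  # hypothetical address
print(f"{name}: {value}")
```

If the onion key is ever compromised, the operator just starts advertising a
fresh onion address in this header, which is the rotation step described
above.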

------
ben509
It will be interesting to see a followup in a few months, in particular, to
see how many sites come back online.

My expectation is that a good number will look to an alternate service, though
the ones who are willing to try again would be more likely to have recovery
plans.

------
airnomad
Would it be possible or feasible to compile a site into a single binary?
Within the binary there would be, say, a 1GB encrypted file system; embedded
SQLite could write data to it, and an embedded HTTP process would be able to
serve and store static content.

On startup you'd enter password and decrypt a baked-in filesystem.

I guess this would still be possible to break into as long as the app is
working since decryption key would be saved in RAM but as soon as server goes
offline data would be inaccessible.

Then you host this, say, in Russia or China and you should be safe from USA
authorities.

------
dejaime
"30% of the dark web was erased." More like "what was loosely estimated to be
30%"

~~~
dmix
And 30% of accessible addresses not 30% of traffic.

------
xfitm3
You can't ever get anonymous layer 1. To the point of other commenters -
percentage figures are unreliable on tor.

~~~
doctorless
This isn’t true. Wireless mesh networks could make for an anonymous medium.

------
sbhn
Well, hopefully the dark web version of google kept a nice cache of what was
important

~~~
kondro
I'm sure the NSA and China have a backup someone could use.

------
kodablah
s/hidden service/onion service

Pedantic, but it is the preferred nomenclature today.

------
draugadrotten
So, lots of pages, on one hosting site.

~~~
FooHentai
I don't think so. More like one hosting platform, with many (.onion) domains
hosted within it.

------
Simulacra
Give it about...a month and there will be 13,000 onion sites to take its
place.

~~~
kodablah
What do you mean? Can you elaborate?

------
dejaime
This looks like a good old "kill the problem and frame him later" type of
takedown. The owners (or some users) angered the wrong people, and now they
are even the biggest pedos of the whole wide world...

~~~
busterarm
They probably were the biggest pedos in the world. That's the only sane reason
to host a platform like this, if you have any sense of self-preservation.

Moreover, its predecessors, Freedom Hosting and Freedom Hosting II, were both
the biggest purveyors of CP when they were taken down. If it walks like a
duck...

~~~
jorblumesea
Yeah, and the biggest users of cryptography are criminals. Only the guilty
lock their car doors, right?

~~~
busterarm
You and the other commenter are missing the point. I'm not stating either of
those.

The motivation for hosting something like this is to hide what you're hosting
in a bunch of noise. Legal or illegal. Your goal is to make it hard to gather
metadata about the service. There are perfectly legitimate reasons to want
that...

That said, it's a foregone conclusion that if you host a service like this,
people who want to distribute child pornography are going to flock to it. If
you don't police the content, that's even more true.

In fact, in the case of Freedom Hosting II, the host was allowing the pedos to
circumvent the normal limitations of the platform, in theory because they were
paying cold hard cash, as well as allowing scammers to operate on the
platform, also for cold hard cash. It's been suggested that as much as 50% of
the content hosted on Freedom Hosting II was child porn.

If you knowingly allow people to use your service to distribute child
pornography or commit other crimes, you're every bit as culpable as your
customers. This type of platform costs not-insignificant money to run -- it's
a business. Nobody is hosting this many services out of a sense of altruism.

And honestly, in the case of CP, if you're hosting it, including running a
service that facilitates the hosting of it, you deserve to serve consecutive
life sentences in prison and whatever harm befalls those in prison who get
found out as pedos.

~~~
dejaime
They were not hosting any CP; that is not proven. As it stands, this is a
simple service takedown. Actually, this is speculation added by the
sensationalist
[https://nakedsecurity.sophos.com](https://nakedsecurity.sophos.com) website;
the original source doesn't even mention it.

I mean, they even believe this is a zero day attack, I can't simply trust the
quality of any information there.

Edit: they updated the link, here's the old link -
[https://nakedsecurity.sophos.com/2018/11/21/hacker-
erases-65...](https://nakedsecurity.sophos.com/2018/11/21/hacker-
erases-6500-sites-from-the-dark-web)

~~~
busterarm
Again, if it walks like a duck.

Freedom Hosting was the biggest service and it served CP. Freedom Hosting II
was the biggest service and it served CP. Daniel's Hosting was the biggest
service -- what do you think it served?

CP is the reason Operation DarkNet et al target these platforms for
destruction.

~~~
dejaime
Yeah, and if his skin is dark and he has a hoodie, he's probably a criminal.
This "if I think they are guilty it means they are" will never work.

That's the problem with prejudices, the people who have them can never see
beyond them.

~~~
dejaime
They did not welcome, protect, support nor engage in child pornography.

~~~
busterarm
Hosting that very sort of platform welcomes child pornography, as is
demonstrated by every such sort of platform or service that's come before it,
and I'm not talking about just the history of Tor hosting here. We used to
find the same thing with free hosting, chatrooms and shell accounts in the
90s.

It's already a matter of law (in most western countries) that hosts have some
amount of responsibility for what users of their platforms distribute and
that's why other hosting services aren't structured this way.

------
ccnafr
Sophos blog spam for actual ZDNet report from over the weekend:
[https://www.zdnet.com/article/popular-dark-web-hosting-
provi...](https://www.zdnet.com/article/popular-dark-web-hosting-provider-got-
hacked-6500-sites-down/)

Maybe a mod can replace the link with the actual source.

~~~
sctb
Thanks! We've updated the link from
[https://nakedsecurity.sophos.com/2018/11/21/hacker-
erases-65...](https://nakedsecurity.sophos.com/2018/11/21/hacker-
erases-6500-sites-from-the-dark-web/).

------
qwerty456127
Why?

------
excalibur
My bad. I signed up for my Free Dark Web Scan, and said my name was Philip');
DROP TABLE customers;--

~~~
paulie_a
I've done that to sms spammers. I don't know if it worked but I tried ten or
so variations. I stopped receiving those messages. I was getting 10 a day at
the time. I doubt I actually did any damage but I really hope I did.

~~~
bouncycastle
Haha! That reminds me of when I once edited my web browser's user agent to
something similar, but then I forgot about it. Later, when I was browsing, a
lot of the websites just blank screened or returned 500 errors or displayed an
SQL error. The other browser worked fine. I finally realized it was my user
agent! So yes, a little SQL injection can come up in many unexpected
situations as you may have observed :-)

Another time, I set my user agent to an XSS string, just because I was testing
something, but forgot to set back. A week later, I noticed that my colleague
was baffled by a popup coming up each time he was browsing the logs. Oops,
that was my XSS user agent... I'm guessing a lot of places don't sanitize
logs.

You'd be amazed how many places these things work :-)
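
The `Philip'); DROP TABLE customers;--` name upthread is the classic
injection, and parameterized queries are the standard fix. A sketch with
Python's sqlite3 (table and payload chosen to match the joke):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (name TEXT)")

payload = "Philip'); DROP TABLE customers;--"

# Vulnerable pattern: interpolating the value into the SQL text puts the
# closing quote and the DROP statement straight into the query, e.g.
#   conn.executescript(f"INSERT INTO customers VALUES ('{payload}')")
# (sqlite3's plain execute() refuses multiple statements, but many other
# drivers and string-built queries do not.)

# Safe pattern: a bound parameter is only ever treated as data.
conn.execute("INSERT INTO customers (name) VALUES (?)", (payload,))

row = conn.execute("SELECT name FROM customers").fetchone()
assert row[0] == payload   # stored verbatim, nothing executed
```

The same rule is why an attacker-controlled user agent ending up in a
string-built logging query can blank-screen a site: anything that reaches SQL
(or HTML, per the XSS story) as text rather than as a bound value will
eventually be interpreted.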

