For those on the defending side, remember that the purpose of DNS is to be public. Relying on DNS information staying secret in order to protect a vulnerable application is a bad idea. A service with a SQL injection vulnerability is unlikely to stay hidden for long, and as the article shows, it is not hard to enumerate DNS records if you are under a targeted attack.
"... remember that the purpose of DNS is to be public."
DNS records, i.e. the contents of zone files, are supposed to be public information.
As such, ICANN requires gTLD registries to provide users with access to them.[FN1]
However, this obligation does not extend to ccTLDs, and most of them still pretend their zone files are private.
Obviously, as the blog post shows, users can still get a large quantity of the data freely from other sources.
Google search results for ccTLD zone files are dominated by individuals who have compiled this data and are selling it. IMO, this sale of public data is unnecessary, not to mention annoying.
Last year .se and .nu started making their zone files public.
They were also the first to try their hand at DNSSEC. As the blog post shows, DNSSEC (specifically its NSEC records) makes it easy to enumerate the contents of a zone.
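For example, a zone signed with plain NSEC (as opposed to NSEC3) can be walked by following each record's pointer to the next name in the zone. A minimal sketch, with example.com and its nameserver as placeholders:

    # Query a name that probably doesn't exist; the NSEC record in the
    # authority section reveals the two real names that bracket the gap.
    dig +dnssec A doesnotexist12345.example.com @ns1.example.com

    # Or walk the entire zone automatically with ldns-walk (from the
    # ldns package):
    ldns-walk @ns1.example.com example.com

NSEC3 hashes the names to make this harder, but the hashes can still be collected and brute-forced offline.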
FN1. "About Zone File Access
Registry operators must provide to ICANN bulk access to the zone files of the Generic Top Level Domain (gTLD) at least on a daily basis. For gTLDs, a zone file contains information about domain names that are active in that gTLD. In general, Internet users may be able to access and download zone file data at no cost for certain purposes.
This contractual obligation does not apply to any ccTLD (such as .us, .de, or .uk). If you have a complaint about gTLD zone file access, please submit a Zone File Access Complaint Form."
The CommonCrawl dataset is another good source of host and subdomain information. The index server is at http://index.commoncrawl.org, and there's a client referenced on that page.
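A hedged sketch of querying that index directly (the collection name CC-MAIN-2017-43 and example.com are placeholders; pick a current collection from the index page):

    # List captures for any host under example.com from one crawl.
    # Each JSON line includes the captured URL, so subdomains fall out.
    curl 'http://index.commoncrawl.org/CC-MAIN-2017-43-index?url=*.example.com&output=json'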
You can also look at the SSL certificate served for the domain and see whether other hostnames appear inside it (in the Subject Alternative Name list).
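A quick way to check from the command line, with example.com as a placeholder:

    # Fetch the served certificate and print its Subject Alternative Names.
    openssl s_client -connect example.com:443 -servername example.com </dev/null 2>/dev/null \
      | openssl x509 -noout -text | grep -A1 'Subject Alternative Name'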
And currently, if a website uses Let's Encrypt but would need wildcard coverage, a single certificate may list hundreds of domains. Fortunately, Let's Encrypt wildcard certificates arrive in January.
If the website is using Let's Encrypt, then any subdomains using it will be in Certificate Transparency logs anyway (until it supports wildcard certificates).
Yes, of course. The idea of certificate transparency logs is to be able to audit all of the certificates issued by a CA to detect mistakenly or maliciously issued certificates.
You can go ahead and search the logs yourself through Google.
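crt.sh is another interface to the same logs and is easy to query from the command line (%25 is a URL-encoded % wildcard; example.com is a placeholder):

    # All logged certificates covering example.com and its subdomains.
    curl 'https://crt.sh/?q=%25.example.com&output=json'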
Symantec presently give you the option to redact the labels on precertificates.
I think they are the only ones because RFC 6962 does not actually support redaction.
Symantec say they are only submitting these entries to the deneb.ws.symantec.com CT log, but it is possible to cross-post redacted log entries to other CT logs, which actually is happening and is kind of annoying.
I think the best way to protect your privacy currently is to purchase wildcards, though LE will support wildcards next year :) !
Kind of related: a couple of weeks ago I also wrote https://ausdomainledger.net/, which crawls CT logs and Common Crawl to enumerate domains in the .au zone, because the zone file is not accessible for that ccTLD.
Ah, thanks! Yeah, currently I'm just waiting for LE's wildcard support, but then it hit me that there might be others who don't even submit the 2nd-level domain to CT, which I'd prefer if possible.
Not logging certificates will eventually cause them not to be trusted in Chrome. The deadline was pushed back, but it hasn't gone away. Firefox doesn't have a firm intent to do the same, but it's developing all the technology. So this is probably not a good idea.
Even where a subscriber specifically didn't want their certs logged, and the CA doesn't currently log everything, if there's ever a problem the certs will get logged. This is because both Google and Mozilla will ask to see all the affected certificates, and logging them is the easiest way to satisfy this requirement. If a CA seems to be trying to dodge this, usually participants will take whatever form the CA did provide and upload it to the logs. Google's logs don't care who gives them a cert; they can independently determine that it's authentic, so they'll log it.
Yeah, but I feel there's at least like a 30%+ chance that their attempt to make this mandatory will stop in its tracks. I just don't see corporations being okay with it.
Mandatory logging is a necessary element of the CT design. What we have today is basically like the trial demo of the game. Full CT requires the completion of "gossip protocols", log auditor third parties which detect inconsistencies and a bunch of other stuff that exists today only in the laboratory, you haven't seen the half of it.
The option left open by the Chrome team was what's called redaction: the logged document would be partly redacted, for example by removing host names.
It's not impossible that redaction will happen and end up being accepted, but its biggest proponent was Symantec, who are exiting the business after destroying trust by not exercising proper oversight of their relationship with Korea's CrossCert. And frankly, the redaction proposed was getting ludicrous anyway. The trouble is that the people who most want this, as with brass-plate companies, aren't just looking to avoid excess publicity. What they want is to act entirely with impunity. And that's exactly what CT is there to prevent.
There's some work (in the CA/B?) to allow "redacting" certain information from CT logs. I don't recall all the details but I do remember reading an e-mail thread (maybe linked from HN?) about it in just the last week or so.
Why? If you are in a corporation and don't want to show your hosts in the CT logs, then don't use public CAs. In a company I don't see a problem with rolling out certificates from your own CA.
If you want external access you can still use your own CA; I think one can assume that an employee (with an employee device) has the CA certificate installed. If it's a public host, or you can't control the employee devices, then use a public cert, or live with the error message.
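A minimal openssl sketch of such an internal CA (all names are placeholders; a real deployment needs proper extensions, constraints, and key handling):

    # Create the private CA: a self-signed root valid for 10 years.
    openssl req -x509 -newkey rsa:4096 -nodes -days 3650 \
      -keyout ca.key -out ca.crt -subj "/CN=Example Internal CA"

    # Issue a certificate for an internal host, signed by that CA.
    openssl req -newkey rsa:2048 -nodes \
      -keyout host.key -out host.csr -subj "/CN=intranet.corp.example"
    openssl x509 -req -in host.csr -CA ca.crt -CAkey ca.key \
      -CAcreateserial -out host.crt -days 825

Then ca.crt gets distributed to the employee devices' trust stores, and nothing ever touches a public CT log.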
On point 6 (find the ASN), my own service https://ipinfo.io can help you there. Curl ipinfo.io/ANYIP/org and you'll get the ASN info. Also see ipinfo.io/developers for more options, such as geolocation, hostname, hosted domains, and more.
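For example (using 8.8.8.8 as a stand-in for any IP of interest):

    # ASN and organisation for an address:
    curl ipinfo.io/8.8.8.8/org

    # Full JSON record, including geolocation and hostname:
    curl ipinfo.io/8.8.8.8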
WolframAlpha can display subdomains; I'm not sure how, and the format is not easy to use, but it works.
How would one go about finding A records pointing to a specific AS? The target web app in question allows either using their subdomain or pointing your own domain at it. Any ideas how to find those?
If you're looking for physical site IP addresses rather than DNS entries, SPF records may be worth looking at (if there are distributed mail servers and outbound mail doesn't all go through a single exit point).
Mail headers may be similarly informative if you can get one or more email messages out from within the office.
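A minimal sketch for the SPF part (example.com is a placeholder):

    # SPF lives in TXT records; the ip4:/ip6: and include: mechanisms
    # often reveal mail servers and network ranges directly.
    dig +short TXT example.com | grep -i spf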
I'm surprised at the number of domains that have misconfigured zone transfers even today, especially given that the defaults of most popular DNS servers are secure enough.
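Testing for it is a one-liner (example.com is a placeholder):

    # Ask each authoritative nameserver for a full zone transfer (AXFR).
    for ns in $(dig +short NS example.com); do
      dig AXFR example.com @"$ns"
    done

A properly configured server will refuse; a misconfigured one hands you the whole zone.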