
Security.txt (2017) - 1nvalid
https://securitytxt.org
======
DyslexicAtheist
I understood and welcomed the initiative when it was first discussed. Since
then I've implemented it on a couple of domains and had it running for all of
2018. Last month I removed it all again (the sites still have a responsible
disclosure link, just not at a standardized URI).

It was a massive waste of time for me to engage with a group of _unknowns_
(no prior relationship) who now had a channel to fast-track into my inbox and
get my attention.

The volume of mail was anywhere from 5-20 emails per week, which is a lot if
you need to go through the points in every mail without any idea whether the
entity behind the _poor-security-report_ is even skilled. So you start in good
faith with the report they gave you (you tell yourself your own time analysing
it is justified: "they're probably competent", "they know what they're doing,
otherwise they'd have a different job", "maybe somebody will get lucky this
time", or, depending on how many bad reports you've already read, you might be
down to "even a blind chicken finds a kernel of corn every now and then").

It seems I ended up having endless discussions with people who automated the
whole thing: they crawl the web for the /.well-known/security.txt URI and if
they find it, automatically start up Metasploit or Burp Suite and then send
you the canned report while asking you to fix these "serious problems". Yet if
you quiz any of these "researchers" deeper about individual items in their
canned reports you get nothing but blank stares, incompetence and attempts to
weasel out: _"but Burp Suite says that it is an error and you should correct
it"_, ...

Initially I went through every email patiently. I tried to engage these guys
on why they strongly felt these were vulns or 0days (LOL). I knew what they
reported was canned, and they never questioned the context or the errors
themselves. None of them were people I would hire or trust to give me good
advice. And the advice wasn't always just bad: often they have a whole
consulting company sitting behind them who will not only "fix your problems"
but also migrate your whole stack to Drupal or WordPress or some such
nonsense.

If I look at the other end of the spectrum (infosec memes and "thought
leaders" on Twitter) I see people complaining that some customers just don't
know how bad their security is and even dare to ignore their reports, or
worse, question their authority on whether something is a bug or not. The
whole thing is the deaf leading the blind. I get what security.txt was trying
to improve, because there is/was a real issue with finding a point of contact.
But I do not think security.txt is in any way useful, and it's a total waste
of time and money for small companies and bigger companies alike. (If you're
bigger you get more attention = more reports, but that doesn't improve
quality; it only adds more work on your end, because now you have N people
instead of 1 or 2 discussing these constant "non-problems".)

~~~
dev_256
What if it was required to encrypt the message? Do you think the amount of
spam would go down?

~~~
DyslexicAtheist
I haven't explicitly tried to enforce encryption, but the drive-by style
reports would probably require extra steps that their automation might not
handle, so it would probably be a good first filter. But then I'm still no
wiser, since the ability to use PGP isn't a qualifier for the engineer's
knowledge or the quality of their report.
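A crude sketch of what that first filter could look like (my assumption of how
you might triage; it only checks for ASCII armor, not whether anything is
actually encrypted to your key):

```python
def looks_pgp_encrypted(body: str) -> bool:
    """Triage check: does the report body contain an ASCII-armored PGP message?

    This does NOT verify the message decrypts with your key; it only filters
    out plaintext boilerplate that drive-by automation typically sends.
    """
    return ("-----BEGIN PGP MESSAGE-----" in body
            and "-----END PGP MESSAGE-----" in body)
```

Which, as noted above, filters the automation but says nothing about the
sender's competence.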

It seems the underlying problem is that those who do good work in this space
don't scan the web for new customers/leads to pitch their services in
shambolic ways. Meanwhile, the skiddies who want to make a quick buck will
outnumber the good people who might accidentally have ended up on your site
(because they like your product, etc.).

The noise-to-quality ratio in the whole approach is just too high for this to
work well in practice. I'm still waiting for the recruitment industry to catch
up and use security.txt as a sink for people who want to be added to a list of
experts to be contacted when "the company is ready to do a full security
assessment post-MVP". I realize this would be fraudulent and I'm not
advocating for it; I'm just saying that fake job offers aren't uncommon
either, so it's only a question of time.

~~~
Kalium
What you're describing is a lot like the bug bounty program I ran for a
previous employer. It was mostly low-effort scans and "reports" templated from
something a big company had made public once. No understanding of whether not
using HSTS was actually a vulnerability, just the expectation of Burp ->
report -> $$$.

There were a handful of genuinely good contributors, but they accounted for
probably under 10% of reports.

------
drej
Yes please!

Last year, I discovered a severe security flaw on a couple dozen websites, and
merely communicating it was super painful. I can't just e-mail someone at
hi@foobar.com; I'd usually have to send an e-mail along the lines of "Can you
give me a contact for your admin/security guy? I have something here and I
can't quite disclose it just now." The response rate was extremely low.

If you just give me a security.txt, at least I know I can disclose something
and I have some level of certainty that the e-mail will be read.

~~~
AndyMcConachie
The people who know about and will use security.txt are probably not the
people you currently have problems communicating with.

I'm willing to bet that 10 years after this is standardized the percentage of
HTTP/S domains hosting security.txt files will be around 1%. And that 1% will
be websites that are mostly already doing the right security things.

------
heyjudy
On one side: wouldn't it be simpler and more effective to just have a link to
a public key and an email address for security reporting?

And on the other: what's with this domain-specific metadata? If semantic
metadata were the issue, why not solve it more generally with RDF or something
that's part of a greater, more uniform solution that people actually deploy
and use? All these little extra files and custom formats create a hodgepodge
of muck.

If anything, downloadable artifacts, and contact information especially, need
to be annotated (through Metalink or RDF) with public keys and/or
cryptographic hashes where available. And what about a standard for contact
phone numbers for a website? The list of metadata ideas is endless, hence the
need to complicate the web as little as possible and to do it thoughtfully, in
a standard manner.

It's nice when a website is well-laid out and everything is where you expect,
but having semantic metadata in a mechanically-queryable form is even more
valuable... a data "API" for assembling public information uniformly reduces
work for almost everyone.

~~~
Leace
> On one-side: Wouldn't it be simpler and more effective to just have a link
> to a public key and email address for security reporting?

Well, that's basically what's in a security.txt file: a contact address and an
encryption key.

> And what about a standard for contact phone numbers given a website?

Use a tel URI in the Contact field.
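For illustration, a minimal security.txt covering both points might look like
this (addresses and key URL are hypothetical; field names per the draft):

    Contact: mailto:security@example.com
    Contact: tel:+1-201-555-0123
    Encryption: https://example.com/pgp-key.txt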

------
dooglius
Why not just put this information in a "Contact" page, which most businesses
already have on their websites?

~~~
fredley
Because that is a user-facing page, and this is a developer(/hacker?)-facing
page. Anything listed on the contact page, particularly for larger
organisations, will get contacts from any slightly annoyed user who hasn't
been able to make contact through other methods.

By putting this in a non-trivially discoverable (yet well-known) location, it
should cut down on exactly the kind of noise that makes it hard to keep
security vulnerability reporting channels open and clear.

------
sdan
Resubmission from a couple years ago:
[https://news.ycombinator.com/item?id=15416198](https://news.ycombinator.com/item?id=15416198)

Any new or relevant updates?

~~~
Leace
A new draft was released on January 12, 2019.

The diff:
[https://tools.ietf.org/rfcdiff?difftype=--hwdiff&url2=draft-...](https://tools.ietf.org/rfcdiff?difftype=--hwdiff&url2=draft-foudil-securitytxt-05.txt)

------
amaccuish
Great idea, but my servers are constantly being "scanned" for the existence of
security.txt, which fills up the logs quite a bit.
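If the main cost is log noise, one mitigation (assuming nginx here; other
servers have equivalents) is to stop access-logging hits on that one path:

    location = /.well-known/security.txt {
        access_log off;
    }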

------
wycy
Why the /.well-known/ subdirectory? Is this a commonly used directory for web
dev things? From what I recall, items like robots.txt and .htaccess normally
just go in the current directory.

~~~
stedaniels
It's been an RFC for almost a decade: RFC 5785 [0].

Mattias Geniar has a good write-up on it [1].

[0] [https://tools.ietf.org/html/rfc5785](https://tools.ietf.org/html/rfc5785)

[1] [https://ma.ttias.be/well-known-directory-webservers-aka-rfc-...](https://ma.ttias.be/well-known-directory-webservers-aka-rfc-5785/)

~~~
jolmg
Neither seems to explain why these files aren't just placed in root, like
robots.txt. What's the point of having a subpath?

The RFC says:

> 2\. Why /.well-known? It's short, descriptive, and according to search
> indices, not widely used.

This gives reasoning for the name of the subpath, but not its existence.

~~~
ezrast
To minimize collisions, and so we're not retreading this argument in another
ten years:
[https://news.ycombinator.com/item?id=19063727](https://news.ycombinator.com/item?id=19063727)

~~~
jolmg
Ok, yeah. It occurs to me that websites might want the whole path to be
variable text, maybe user-defined.

This would also answer the question I've had for so long as to why Wikipedia
chose to have /wiki/ before article titles in their URLs. I guess it was so
that the article
[https://en.wikipedia.org/wiki/robots.txt](https://en.wikipedia.org/wiki/robots.txt)
did not collide with their robots.txt file.

Would've been nice if that explanation were included in the RFC.

------
numbsafari
I wonder if anyone has written a successful exploit of the crawlers folks use
to scan this?

------
1nvalid
And the discussion from Troy Hunt...
[https://twitter.com/troyhunt/status/1082890150223302657](https://twitter.com/troyhunt/status/1082890150223302657)

~~~
ChrisGranger
> 8,582 websites in the Alexa Top 1M implementing a security.txt file

I count 6460 sites on the linked list, and _loads_ of these are subdomains on
tumblr.org, so I wonder what the number of actual TLDs using security.txt
is...

------
krmbzds
I have been thinking of a similar idea, but for warrant canaries.

------
m-ueberall
I wonder how long it will take to include additional "imprint" and "data
protection declaration" link fields… #GDPR #EU

------
Santosh83
Do the major browsers support it?

~~~
kijin
This is for humans, not browsers.

------
pmlnr
webmaster@domain.com

No need to overcomplicate this.

.well-known is a terrible idea, by the way; a <link rel> or <meta> would make
much more sense.
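For what that alternative might look like (the rel value here is hypothetical;
no such link relation is actually registered):

    <link rel="security-contact" href="mailto:security@example.com">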

~~~
the8472
Not everything served over HTTP is HTML.

~~~
bmn__
The `Link` header exists, which is the moral equivalent of the `link` element
in HTML. That way you can express link relations on resources whose
representations do not have hyperlinks. Example:

    HTTP/1.1 200 OK
    Content-Length: 153054
    Content-Type: image/webp
    Link: <http://example.com/mediterranian-2019/olives.webp>;
      rel="next"; title="next image in album"

See [https://tools.ietf.org/html/rfc8288](https://tools.ietf.org/html/rfc8288)

------
hartator
Good idea, but a terrible URL. Web servers by default don't serve hidden
folders. Why not just example.com/security.txt instead of
example.com/.well-known/security.txt?

~~~
theon144
It's a .well-known way to serve additional metadata for a website.

[https://en.wikipedia.org/wiki/List_of_/.well-known/_services...](https://en.wikipedia.org/wiki/List_of_/.well-known/_services_offered_by_webservers)

Better link from stedaniels further down in the comments: [https://ma.ttias.be/well-known-directory-webservers-aka-rfc-...](https://ma.ttias.be/well-known-directory-webservers-aka-rfc-5785/)

