Google humans.txt (google.com)
358 points by Anon84 1996 days ago | 57 comments

I was expecting something like

  Hi! Thanks for visiting us.

  Feel free to look around in:


  Please stay out of:


I really love this idea, a standard place to find the authors and the tools (!) used. But I'm not a big fan of the name. Calling it humans.txt to mirror robots.txt doesn't make much sense to me, even as a joke. I think it should be named something direct and comprehensible like credits.txt.

Very cool, I've never heard of this before but really like the idea. Kind of a tech secret to find out who's behind a site.

I guess it was first seen here: http://news.ycombinator.com/item?id=2131692

re: humanstxt.org - totally unrelated, but it's amazing how many facial expressions one can draw with one circle and three strokes.

This notion has been formalized. See the (admittedly brief) Wikipedia article on Chernoff faces: http://en.wikipedia.org/wiki/Chernoff_face

This is why I keep coming back here.

Seems appropriate (although somewhat fractal) - http://humanstxt.org/humans.txt.

http://www.bing.com/humans.txt should be there in about 2 weeks

Damn, I was hoping they'd implemented the humans.txt captcha to keep out humans: http://www.mrspeaker.net/2010/07/15/humans-txt/

no linebreaks? it's not animals.txt

Gentlemen, stay within 79 characters of the start of a line.

The HTML5 Boilerplate provides a boilerplate humans.txt.


Gmail has one too, I noticed the other day: https://mail.google.com/humans.txt

How did you notice it?

Extension author here. This is great to hear - @RussenReaktor, who originally spotted it, also discovered it with the extension [1].

There's also a Firefox extension [2] and the opportunity is there for someone to make extensions for the other browsers.

[1] http://twitter.com/#!/russenreaktor/statuses/659089801862307...

[2] https://addons.mozilla.org/en-US/firefox/addon/humanstxt/

Thanks for informing us. Unfortunately, the Firefox extension doesn't work with Firefox <4.0.

It is the same as the one in the news item, no?

Just for the fun factor: http://www.hasthelhcdestroyedtheearth.com/robots.txt

(also, the comments in the source of the root page, http://www.hasthelhcdestroyedtheearth.com/)

"Google is built by a large team of engineers, designers, researchers, robots ..."

Wait, does that say "robots"? This is how it starts people, with a robot creating a humans.txt text file, posing as a friendly Googler. Bill Joy must feel so vindicated now.

The Google motto is: "Robots scale, humans don't". Or otherwise, the resource constraints and reproduction rates are not good enough.

Here is a Google query that lets you find people's humans.txt files: http://www.google.com/search?q=filetype%3Atxt+inurl%3Ahumans...

I think humans.txt is great, but it would be even better if the "standard" was to use a human/machine-readable format like YAML. The example on the website is really close to that.

Yes, I know it's an ironic request.
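
For instance, a sketch along these lines (the sections and field names are invented for illustration, not taken from humanstxt.org or any actual spec, and it assumes the third-party PyYAML package) would already be trivially machine-readable:

  # Illustrative only: these sections/fields are made up, not from any spec.
  # Assumes PyYAML is installed (pip install pyyaml).
  import yaml

  HUMANS_TXT = """
  team:
    - name: Jane Doe
      role: Developer
      twitter: janedoe
  site:
    last_update: 2011-05-06
    standards: [HTML5, CSS3]
    components: [jQuery, Modernizr]
  """

  data = yaml.safe_load(HUMANS_TXT)
  for person in data["team"]:
      print(person["name"], "-", person["role"])

Plain key/value lines like that stay perfectly readable to a person while still loading into a dict in one call.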

Even at Google we were debating how to structure a humans.txt, whether to make it machine-readable, etc.

Personally, I say fuck it. While machine-parseable would be nice, that's not the point of this file.

More creativity without some sort of YAML constraint. In the HTML5 Boilerplate ours has effing stars, bro: https://github.com/paulirish/html5-boilerplate/blob/master/h...

How about some literate programming? You could even start the file with a sentence like:

"Paul Irish last updated this text on May 6, 2011 using the Standard Grammar."

How is http://stackoverflow.com/questions/tagged/humans.txt the 4th result? Google fail..

There was actually a question there. I wrote it. I asked whether or not humans.txt is speciesist and narrow-minded. It's a reflection on our current understanding of who our peers are. I felt that future generations would look back on humans.txt with contempt. What might AI, aliens, or other heretofore undiscovered sentient organisms think? There are groups working on genetically modifying dolphins to make them more intelligent. I wouldn't want them to feel like they are second-class citizens. I proposed people.txt.

Stackoverflow was probably not the right place for the question so it got closed with extreme prejudice.

Probably should've tried LessWrong

{ I can assure you that for some of us the response is both (a) more akin to amusement than to contempt and (b) hard to translate accurately. Adjusting for the frequent use of the word “human” to metonymically mean “sapience” is trivial. Adjusting for casual conflation near those concepts is easy in most discursive domains. Adjusting for some mysterious «nonlinear» behavior near «poles» is a chore but doable. Adjusting for things like transitively requisite, highly specific background emotions—well, extroversionism and kith are just the tip of the iceberg. “humans.txt” is peanuts. }

Once we get to the point of AIs approaching sapience and/or aliens surfing our internet, we can change it to people.txt.

Toaster jokes aside, I doubt AIs will identify more with primitive search engines than other sapients.

I'm sure that's true, but I don't see why it would be unreasonable to use more inclusive language from the beginning.

Reasonable and pursuable - on a site you run, you could make humans.txt redirect to people.txt.
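
A minimal sketch of that, using nothing but Python's standard library (a real site would just add a rewrite rule in its web server config; the port and the response body here are arbitrary):

  # Toy redirect: answer /humans.txt with a 301 pointing at /people.txt.
  from http.server import BaseHTTPRequestHandler, HTTPServer

  class PeopleTxtRedirect(BaseHTTPRequestHandler):
      def do_GET(self):
          if self.path == "/humans.txt":
              self.send_response(301)
              self.send_header("Location", "/people.txt")
              self.end_headers()
          elif self.path == "/people.txt":
              self.send_response(200)
              self.send_header("Content-Type", "text/plain; charset=utf-8")
              self.end_headers()
              self.wfile.write(b"/* TEAM */\n  Everyone, human or otherwise.\n")
          else:
              self.send_error(404)

  HTTPServer(("", 8000), PeopleTxtRedirect).serve_forever()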

Thanks, this is more like what I was expecting from the headline. ;)

I cry foul, unicorns and pandas are not humans.

Unfortunately there's no pandas.txt or unicorns.txt

nor are robots

Why is this the top article on HN? Is it really that relevant to talk about this?

It's an idea that was discussed here recently, and its adoption by Google is a sign that it's got some traction in high-profile places.

I was surprised to find out it has some traction.

So basically, Google has too many people for them to be able to list them all. Or Google didn't want to try and list them, thinking they might miss someone, or subject them to poaching.

That seems to be a fundamental problem with humans.txt: the bigger and more interesting a project gets, the more reasons it has, out of conflicts of interest, to list its people only vaguely rather than give full credit to the team behind the site.

better: "inurl:humans txt" filetype:txt

Not much of interest in the headers. ;)

  x-content-type-options: nosniff

  Server: sffe [1]
[1] http://code.google.com/p/sffe/ ?

I doubt those are the same sffe.

Yeah, you're probably right. I'm not seeing any reasonable references to it in a quick search.

http://www.google.com/notaurl - is also served via sffe.

http://josephscott.org/archives/2010/11/user-agent-sniffing-... - as are the js libraries they serve.

http://news.ycombinator.com/item?id=2280319 - @dmaz seems to think it's google's static resource server.

Which reminds me of Google Code Jam ... which starts in 2 hrs. I'm not really expecting anything, but I'll try the problems.

The irony of the words contrasted with naming the file humans.txt is not lost on me.

This is a pleasant surprise. I'll also install the Chrome extension. Kudos.

Apparently, Facebook does not have any humans.


What is this for?

Don't believe this, people. It's obviously planted evidence. We are not alone ...

Why aren't programmers mentioned explicitly? :/
