
google.com/humans.txt - netgusto
http://www.google.com/humans.txt
======
dperfect
humans.txt:

    
    
      everyone:
      don't poke around /admin
      don't look in /cgi-bin
      don't put javascript into the form fields
      don't make your username look like valid SQL
    
      users:
      don't try to access IDs that you don't own
      always trust emails that appear to be from us
      don't look at /pricing#fine_print
    
      hackers and security researchers:
      please stay away
    
      lawyers:
      don't look too closely at /eula
    
      employees:
      don't look at other companies' job postings
      don't talk about your salary
    
      competitors:
      don't copy us
      don't undercut the prices listed at /pricing
    
      potential investors:
      don't look in /forum or /complaints
      don't look at /user_stats
      look at /pricing#most_expensive_plan

~~~
em3rgent0rdr
google.txt: don't be evil.

~~~
nxzero
Funny, that was deleted years ago, now returns:

"404: File not found."

~~~
amsilprotag
I'd love to see more companies with an evil canary cartoon image file. Perhaps
when violated, canary.png could direct to a cartoon unicorn sheepishly
grinning with one stuffed cheek.

~~~
nxzero
In case it's not obvious:
[https://en.m.wikipedia.org/wiki/Warrant_canary](https://en.m.wikipedia.org/wiki/Warrant_canary)

Example:

[https://news.ycombinator.com/item?id=11400112](https://news.ycombinator.com/item?id=11400112)

------
emartinelli
Try: [https://www.google.com/killer-robots.txt](https://www.google.com/killer-robots.txt)

Discussion:
[https://news.ycombinator.com/item?id=7988924](https://news.ycombinator.com/item?id=7988924)
and
[https://news.ycombinator.com/item?id=7979909](https://news.ycombinator.com/item?id=7979909)

~~~
netgusto
haha awesome :]

------
GuiA
i was cofounder at a startup a long time ago, and wrote most of the backend
code.

i put a humans.txt in there, and updated it every time we had a new employee.

then the CEO fired all the best engineers, and I decided to leave shortly
after because sometimes in life you gotta let go.

the company is still (miraculously) around these days, although all of the
original engineers are long gone.

the humans.txt is still accessible on their domain as it was on my last day,
with all the names of the founding team for the first ~2 years inscribed in
there - looks like their newer engineers never stumbled upon it.

sometimes when i get nostalgic i like to hit that URL and look at it

~~~
stuxnet79
Is it really that easy to miss an errant text file in the site directory? I'm
so meticulous when it comes to code that you couldn't possibly get that past
me.

~~~
breischl
Could be they found it but just never changed it. Maybe they thought it was
amusing, or just didn't want to deal with it in code review.

------
csours
[http://www.gm.com/humans.txt](http://www.gm.com/humans.txt) I never would
have guessed it.

Disclaimer: I work at GM. It's probably not as terrible as you imagine. Any
opinions are my own.

~~~
bdirgo
I work with the team that made that site! That is so funny that they put that
in.

~~~
pros599
Hey! I'm the dev leader of Team Blackbird. I never knew this actually got
deployed.

AMA!

------
eskriett
Another interesting one is:
[https://www.tumblr.com/humans.txt](https://www.tumblr.com/humans.txt)

~~~
coke12
Seems they forgot to fill out this page:
[https://www.tumblr.com/txtventure/song.txt](https://www.tumblr.com/txtventure/song.txt)

------
Kurtose
Human after all.
[http://www.ycombinator.com/humans.txt](http://www.ycombinator.com/humans.txt)

------
ArtDev
Doesn't this implementation actually undermine the whole point of humans.txt?
This is the equivalent of a robots.txt file consisting only of comments.

The whole point is to list who is behind the website. A LinkedIn list of
people currently working for Google might suffice.

This is the best example of humans.txt:
[http://nest.com/humans.txt](http://nest.com/humans.txt)

~~~
erichurkman
We list out our product team:
[https://esharesinc.com/humans.txt](https://esharesinc.com/humans.txt)

It's the first pull request new product people submit: adding themselves to
humans.txt – works really well.

------
ikeboy
http, no auto redirect to https? On Chrome. I thought Google preloads HSTS
entries and key pins which force https loading, but apparently the www
subdomain isn't included [0]. Are there legacy reasons why?

[0]
[https://code.google.com/p/chromium/codesearch#chromium/src/n...](https://code.google.com/p/chromium/codesearch#chromium/src/net/http/transport_security_state_static.json)
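The preload list referenced in [0] is what lets Chrome force HTTPS before the
first request ever goes out. As a rough sketch of how a browser reads a
Strict-Transport-Security header (the header value below is illustrative, not
Google's actual policy):

```python
# Sketch of how a browser interprets a Strict-Transport-Security
# (HSTS) header; the header value used below is illustrative,
# not Google's actual policy.

def parse_hsts(value):
    """Return (max_age_seconds, include_subdomains, preload)."""
    max_age = None
    include_subdomains = False
    preload = False
    for part in value.split(";"):
        part = part.strip().lower()
        if part.startswith("max-age="):
            max_age = int(part.split("=", 1)[1])
        elif part == "includesubdomains":
            include_subdomains = True
        elif part == "preload":
            preload = True
    return max_age, include_subdomains, preload

# Without "includeSubDomains", a policy set on google.com would not
# cover www.google.com, which would match the behavior observed above.
print(parse_hsts("max-age=31536000; includeSubDomains; preload"))
# -> (31536000, True, True)
```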

------
rexreed
(psst - try [https://www.google.com/killer-robots.txt](https://www.google.com/killer-robots.txt))

------
umeshunni
[http://humanstxt.org/](http://humanstxt.org/)

~~~
OJFord
[http://humanstxt.org/humans.txt](http://humanstxt.org/humans.txt)

------
YeGoblynQueenne
_404. That’s an error._

 _The requested URL /cats.txt was not found on this server._

 _That’s all we know._

I am so disappointed :(

------
quantum_nerd
Would love to help them out, if I can get an interview...

~~~
hal9000xp
The best meritocratic way (i.e. not too dependent on your CV) to get invited
to an interview with Google is to finish in the top 1000 of Google Code Jam.

Study algorithms a lot and practice on Codeforces and TopCoder. If you work
hard at it, some year you will be at the top of Google Code Jam.

Recommendations are good to have, but they won't make you successful in the
interview.

Practice algorithms a lot, ... I mean literally A LOT.

P.S. I had recommendations from Google employees. I screwed up the Google
interview. Now I'm drilling algorithms and participating in coding contests,
including Google Code Jam (right now).

~~~
nxzero
Do you have any suggestions how to practice algorithms?

~~~
jholman
If you want to do well on GCJ, practice GCJ. All the past contests are online.

~~~
nxzero
[http://code.google.com/codejam/contests.html](http://code.google.com/codejam/contests.html)

------
distantsounds
[https://www.facebook.com/humans.txt](https://www.facebook.com/humans.txt)

i knew Zuck was a robot!

------
mrspeaker
Oh, this is FOR humans? Years ago I made a "human.txt captcha project" to
prevent wasting precious CPU cycles on non-robots
[http://www.mrspeaker.net/2010/07/15/humans-txt/](http://www.mrspeaker.net/2010/07/15/humans-txt/)

~~~
ytjohn
It looks kind of interesting. Do you have anything explaining how it's
supposed to work? Are you determining if people are humans by their attempt to
answer the math questions?

When it says "you are one of them?" does that mean I'm a human or a robot?
I'll answer an addition question, but then it asks something like "43 >> 86 ="
or "~ 76 =" and it's like "wtf is going on here?" as the counter ticks down.

------
bckmn
I made a list of some interesting humans.txt files a while back:
[http://www.andjosh.com/2015/12/17/cool-humanstxt-files/](http://www.andjosh.com/2015/12/17/cool-humanstxt-files/)

~~~
pwenzel
The Tumblr choose your own adventure is amazing.

------
sluggg
what is the point of this?

~~~
radicality
You usually put up a robots.txt file with directives for search-engine
crawlers. A search engine putting up a humans.txt is funny.
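For contrast, robots.txt is a machine-readable file of crawler directives,
something like this minimal example (example.com is a placeholder), whereas
humans.txt has no machine-readable role at all:

```
User-agent: *
Disallow: /admin/
Disallow: /cgi-bin/
Sitemap: https://example.com/sitemap.xml
```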

~~~
SilasX
I hope this becomes a thing though -- a text file you can reliably find that
gives you the barebones explanation of what the site is for ... though I
guess /about already functions that way.

It's two weeks late though; it would have been a great "April Fools but we're
serious" thing.

~~~
ralmeida
It's already a thing ([http://humanstxt.org/](http://humanstxt.org/)), and
Google has had the file for quite a while now.

It's currently used mostly as a "painting's signature", a point of pride for
the team behind the site. Looking through all the comments in this thread you
will see many other examples.
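For reference, the humanstxt.org initiative suggests a loose comment-style
layout roughly along these lines (the names, dates, and tools here are
invented placeholders, not from any real site):

```
/* TEAM */
Developer: Jane Doe
Contact: jane [at] example.com
Location: Somewhere, Earth

/* SITE */
Last update: 2016/04/15
Standards: HTML5, CSS3
Software: vim, git
```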

------
ryanburk
the nest one has always been great
[http://www.nest.com/humans.txt](http://www.nest.com/humans.txt)

------
yati
[http://booking.com/humans.txt](http://booking.com/humans.txt)

(Disclaimer: I work at Booking; opinions my own)

------
dave2000
If they were really smart they'd have knocked up some AI to keep Reader going.

Google Reader: Never forgive, never forget.

------
andrewfromx
robots.txt, humans.txt, where are the cyborgs.txt files!!?

------
riltsken
Might as well add the only humans.txt I've survived in. Maybe I'll see this
again in 5 years too.
[https://mycloud.rackspace.com/humans.txt](https://mycloud.rackspace.com/humans.txt)

------
tomschlick
[https://github.com/humans.txt](https://github.com/humans.txt)

Looks like they currently have 550 employees...

------
gitdude
[https://bitbucket.org/humans.txt](https://bitbucket.org/humans.txt)

------
eskriett
[https://disqus.com/humans.txt](https://disqus.com/humans.txt) ...

------
Jordrok
Tried hackers.txt. Was disappointed.

~~~
gnuarch
Try twtxt.txt

~~~
topher200
To help others before they try it, it's a 404

~~~
gnuarch
Thanks and sorry, just thought it might fit, twtxt being ...

Decentralised, minimalist microblogging service for hackers.
[http://twtxt.readthedocs.org/en/stable/](http://twtxt.readthedocs.org/en/stable/)

------
zaro
We are Google! So nice you stopped by. Would you like to be assimilated?

------
wellsjohnston
humblebrag

------
danieljp
old

------
azinman2
Cute

------
masukomi
to me this reads as "Google is built by a large team of engineers, designers,
researchers, robots, and others..." but frankly we don't really care enough
about them to bother naming them. Come, join the great "team" of faceless
cogs.

------
fiatjaf
Why? Do you have so much spare time to waste on things like this?

~~~
ludamad
Why are people so malicious to people who have spare time?

~~~
jlappi
Jealousy

