Facebook's robots.txt: https://www.facebook.com/robots.txt
Facebook has several different rules for different search engines, but at the end of the file you will find this rule:
User-agent: *
Disallow: /
The "User-agent: *" means this section applies to all robots. The "Disallow: /" tells the robot that it should not visit any pages on the site.
Yet site:facebook.com returns about 3,520,000,000 results on Google.
So my question is: why is facebook.com indexed by Google and other search engines?
[1] http://www.robotstxt.org/orig.html