

Show HN: Robots.txt Checker (CoffeeScript, Node) w. Bookmarklet - franze
http://www.franz-enzenhofer.com/robotstxt?robotstxturl=https%3A%2F%2Fwww.google.com%2Frobots.txt&useragent=Mozilla%2F5.0+%28compatible%3B+Googlebot%2F2.1%3B++http%3A%2F%2Fwww.google.com%2Fbot.html+-+fake+-+a+harmless+robots.txt+bookmarklet%29&testurls=%2F

======
alphadown
How about a robots.txt generator? It could show the user all the directories
of the website (read from the sitemap) and then let the user disallow
specific documents and directories in a point-and-click fashion. The generator
then writes the rules in the proper syntax.

Yea? Nay?
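The generator idea above could be sketched roughly like this (a hypothetical Python sketch, not part of the linked tool, which is CoffeeScript/Node): pull the URL paths out of a sitemap, let the caller mark which ones to block, and emit valid robots.txt syntax. The example sitemap and the `/admin` selection are made up for illustration.

```python
from urllib.parse import urlparse
from xml.etree import ElementTree

# Sitemaps live in this XML namespace per sitemaps.org.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def paths_from_sitemap(xml_text):
    """Extract URL paths from a sitemap document."""
    root = ElementTree.fromstring(xml_text)
    return [urlparse(loc.text.strip()).path
            for loc in root.iter(SITEMAP_NS + "loc")]

def generate_robots_txt(disallowed_paths, user_agent="*"):
    """Write the selected paths in proper robots.txt syntax."""
    lines = ["User-agent: " + user_agent]
    lines += ["Disallow: " + p for p in sorted(set(disallowed_paths))]
    return "\n".join(lines) + "\n"

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/admin/login</loc></url>
  <url><loc>https://example.com/blog/post-1</loc></url>
</urlset>"""

paths = paths_from_sitemap(sitemap)
# Pretend the user clicked "disallow" on everything under /admin.
print(generate_robots_txt([p for p in paths if p.startswith("/admin")]))
```

The point-and-click part would just replace the `startswith("/admin")` filter with whatever the user selected in a UI; the syntax-writing step stays the same.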

