
Ask HN: What should we do if someone is crawling our website? - jajool
A couple of days ago I found out someone was crawling our website. She was sending low traffic (150 rpm), so it wasn't a real problem (we average around 15k rpm).

I created an automated service to find crawler IPs and ban them. I did this for fun; parsing the stream of requests, finding malicious behavior, and blocking via the firewall API was a challenging task.

Not only did this service not stop her, she is trying harder: her request rate has tripled today (she is using more IPs; 1.5k of her IPs were banned today).

What do you think I should do: let her crawl, or chase this rabbit hole?

Thanks
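(For readers curious what such a service might look like: below is a minimal sketch of a sliding-window rate detector of the kind the post describes. The window size, request threshold, and the idea of returning a flag rather than calling a real firewall API are all assumptions for illustration, not the poster's actual implementation.)

```python
from collections import defaultdict, deque

WINDOW_SECONDS = 60
MAX_REQUESTS_PER_WINDOW = 100  # assumed threshold; tune to your own traffic


class RateTracker:
    """Track per-IP request timestamps in a sliding window and flag abusers."""

    def __init__(self, window=WINDOW_SECONDS, limit=MAX_REQUESTS_PER_WINDOW):
        self.window = window
        self.limit = limit
        self.hits = defaultdict(deque)  # ip -> timestamps inside the window

    def record(self, ip, ts):
        """Record one request; return True if this IP now exceeds the limit.

        In a real service, a True result would trigger a firewall-API call
        (e.g. an iptables/nftables rule) instead of just returning a flag.
        """
        q = self.hits[ip]
        q.append(ts)
        # Drop timestamps that have fallen out of the sliding window.
        while q and ts - q[0] > self.window:
            q.popleft()
        return len(q) > self.limit
```

At 150 rpm from a single IP this flags the crawler within the first minute, while an IP making a handful of requests is never touched.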
======
elmerfud
Is the crawler not respecting your robots file? You don't state a reason for
wanting to block the behavior other than that it was fun to build the tooling
to discover it.
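(If the goal is simply to signal which paths are off-limits or to slow the crawler down, robots.txt is the conventional, voluntary mechanism; a well-behaved crawler honors it, and ignoring it is a clearer sign of bad faith. The paths and delay below are hypothetical examples.)

```
User-agent: *
Crawl-delay: 10
Disallow: /private/
```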

Personally I don't see a reason to block non-abusive crawling of a site.
Sites are there to be found and read. Indexers, archivers, etc. are normal
things that may provide a non-obvious benefit.

This seems to be a judgement call on whether you believe the actions are
nefarious or not.

