
Doing a reverse IP lookup, you can see there are more than 1,000 domains hosted on a single IP address (the domain resolves to four IPv4 addresses). Search engines will sometimes lower rankings based on your "neighbours", so perhaps the OP could start using a service such as Cloudflare. Cloudflare still puts multiple domains behind each IP, but the site would automatically get a set of new IPs, so it could "start fresh" on that aspect.
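To check the resolution side, dig works; the reverse-IP part (listing co-hosted domains) needs a third-party lookup service such as viewdns.info, since plain reverse DNS won't enumerate neighbours. A quick sketch, with example.com and 203.0.113.10 as placeholders for the OP's domain and one of its IPs:

    # list the IPv4 addresses the domain resolves to
    dig +short A example.com

    # reverse DNS (PTR) for one of those IPs - confirms the host,
    # but won't list the other domains sharing it
    dig +short -x 203.0.113.10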

Even on the free plan you can create "Firewall Rules" (up to five). Create one that just logs the traffic from known bots. Claim the domain in Google Search Console (no script needed on the page), look at the stats for that domain, and match them against the Cloudflare traffic. This will help identify whether something is going on: are Bingbot and Googlebot visiting at all? What pages are they looking at?
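For the bot rule, Cloudflare's rules language has a cf.client.bot field that matches its list of verified known bots (Googlebot, Bingbot, etc.), so a minimal filter expression could be as simple as the following. Note the Log action may not be available on every plan; Allow plus the Firewall Events view is a fallback:

    (cf.client.bot)

Or, to watch the two crawlers specifically by user agent (user agents can be spoofed, so treat this as indicative only):

    (http.user_agent contains "Googlebot") or (http.user_agent contains "bingbot")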

The robots.txt file is complex, with repeated paths that shouldn't really be there. Perhaps create a Cloudflare rule to block access to those paths for everyone but the owner on a static IP address?
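A sketch of such a rule with the Block action; the path and IP below are placeholders for the OP's actual paths and static address:

    (http.request.uri.path contains "/wp-admin") and (ip.src ne 203.0.113.7)

Several paths can go into one rule with the in operator, e.g. http.request.uri.path in {"/a" "/b"}, which helps stay within the free plan's five-rule limit.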

After this is done, the OP would know a bit more about what's going on.



Good suggestions, thank you.



