
> The current players can whitelist/attest their own clients while categorizing every other scraping client as a bot.

Can't they already do this by having their scrapers send plain old client certificates? Or even just a request header that contains an HMAC of the URL, computed with a shared secret?
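
A minimal sketch of the HMAC-header variant, assuming Python on both ends; the secret value, the "X-Scraper-Signature" header name, and the sign_url/verify_url helpers are all made-up names for illustration, not anything a real site uses:

    import hmac
    import hashlib

    # Hypothetical secret, distributed out-of-band to first-party scrapers only.
    SHARED_SECRET = b"rotate-me-out-of-band"

    def sign_url(url: str) -> str:
        """Return a hex HMAC-SHA256 tag over the URL."""
        return hmac.new(SHARED_SECRET, url.encode("utf-8"), hashlib.sha256).hexdigest()

    def verify_url(url: str, tag: str) -> bool:
        """Server side: constant-time comparison against a freshly computed tag."""
        return hmac.compare_digest(sign_url(url), tag)

    # Scraper side: attach the tag as a header, e.g.
    #   requests.get(url, headers={"X-Scraper-Signature": sign_url(url)})
    # Server side: any request whose header fails verify_url() gets the bot treatment.

A real deployment would presumably also fold a timestamp or nonce into the signed string so captured tags can't be replayed, but the basic gatekeeping works as above.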

Actually, taking a step further back: why does anyone need to scrape their own properties? They can make up an arbitrary backchannel to access that data, just like the one Google uses to populate YouTube results into SERPs. No need to provide a usefully scrapeable website at all.
