
I would imagine you'd be able to simply submit a domain to DDG; they would ask for a TXT record or a file to be present on the site, similar to existing domain-verification tools. Once verified, the domain would be queued up in a crawler for removal. Is there something I'm missing?
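
Roughly this, as a sketch of the verification step. It assumes dnspython, and the "ddg-site-verification" token name is made up; DDG doesn't actually document such an API.

    # Check whether a domain carries an expected verification TXT record.
    # Requires dnspython (pip install dnspython); the token format below
    # is hypothetical.
    import dns.resolver

    def has_verification_token(domain, expected_token):
        # Fetch all TXT records on the domain; missing domain/records
        # simply means "not verified".
        try:
            answers = dns.resolver.resolve(domain, "TXT")
        except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
            return False
        for rdata in answers:
            # TXT rdata arrives as one or more byte strings; join them.
            txt = b"".join(rdata.strings).decode("utf-8", "replace")
            if txt == expected_token:
                return True
        return False

    # e.g. has_verification_token("example.com", "ddg-site-verification=abc123")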

If it's individual pages, then probably just a meta-tag?
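
For example, the standard robots meta tag, which compliant crawlers already honor on a per-page basis:

    <meta name="robots" content="noindex">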

I think robots.txt could also be leveraged for this.
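
Something like this, using DuckDuckBot (the crawler user agent DDG documents), would opt the whole site out of crawling; whether that affects organic listings depends on who actually does the crawling:

    User-agent: DuckDuckBot
    Disallow: /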

> Is there something I'm missing?

Yes, they don't have their own crawler for regular websites. They get their organic search results from Bing and Oath.


I assume Bing will be responsible for this, then (or, if they won't be, DDG will need to find a new engine).
