I generate a link X to do Y on application Z. I post link X on a public website. Crawlers index the public site. When I search for Y on application Z, the page containing link X comes back in the results.
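To be concrete about what "link X" usually is: a capability URL, where the only protection on the action is the secrecy of a random token embedded in the URL itself. A minimal sketch (the base URL and parameter name are hypothetical, not from any real application):

```python
import secrets

# Hypothetical endpoint; stands in for "application Z".
BASE_URL = "https://example.com/do-y"

def make_capability_link() -> str:
    # 32 random bytes -> ~43 URL-safe characters. Statistically
    # unguessable, but the moment the full URL is posted anywhere
    # public and crawled, the secrecy (and the protection) is gone.
    token = secrets.token_urlsafe(32)
    return f"{BASE_URL}?token={token}"

link = make_capability_link()
print(link)
```

The token is strong against brute force; the weakness is entirely in treating the URL itself as the credential.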
What action is the author expecting here? For every search engine to recognize these URLs and decline to index them? And what about malicious actors: ask them nicely to ignore the links too?
In a world where Facebook and Google have serious vulnerabilities exposed at least yearly, anyone who skips the reading and just "trusts the computer" shares the blame.
I'm not a chef, but I make sure I know where my food's coming from to the best of my ability.