
I can see URLs eventually being thousands of characters long, with referral links daisy-chained together.



I use Neat URL[0] with Firefox to strip things like that from URLs.

0: https://addons.mozilla.org/en-US/firefox/addon/neat-url/
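
Roughly what a parameter-stripping extension like that does, sketched in Python (the parameter names below are just common examples for illustration, not Neat URL's actual ruleset):

  from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

  # A handful of common tracking parameters; a real extension ships a much
  # longer, user-configurable blocklist, so treat this as a placeholder.
  TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign",
                     "utm_term", "utm_content", "fbclid", "gclid", "ref"}

  def strip_tracking(url: str) -> str:
      parts = urlsplit(url)
      # Keep only the query parameters that are not on the blocklist.
      kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
              if k not in TRACKING_PARAMS]
      return urlunsplit(parts._replace(query=urlencode(kept)))

  print(strip_tracking("https://example.com/a?id=42&utm_source=hn&fbclid=xyz"))
  # -> https://example.com/a?id=42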


That's cool, but it only really protects the individual surfer. Instead of stripping the parameters, you could also fill them with garbage values or, with more effort, swap them with the parameters from another link. This would degrade the analytics results but might be harder to detect. And if enough people used it, you'd get some kind of herd tracking immunity.
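
A rough sketch of the garbage-value idea in Python (the parameter list here is just a guess at what's worth garbling; a real tool would also want values that look plausible rather than obviously random, so they're harder to filter out):

  import secrets
  from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

  # Placeholder list of tracking parameters, for illustration only.
  TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "fbclid", "gclid"}

  def garble_tracking(url: str) -> str:
      parts = urlsplit(url)
      # Keep the tracking keys but replace their values with random junk,
      # so the link still looks "tracked" while the recorded data is noise.
      query = [(k, secrets.token_urlsafe(8) if k in TRACKING_PARAMS else v)
               for k, v in parse_qsl(parts.query, keep_blank_values=True)]
      return urlunsplit(parts._replace(query=urlencode(query)))

  print(garble_tracking("https://example.com/a?id=42&utm_source=newsletter"))
  # e.g. https://example.com/a?id=42&utm_source=x3kQ9vLrE5s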


Do you know if the rewrite happens before a user clicks a link? If it happens after, the data ends up actually being sent out to their server...


Some already are.

A lot of malicious links are just a base64-encoded address pointing to another redirect service, which in turn points to yet another base64-encoded address (and so on, for as long as your head can keep up).
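
For illustration, unwrapping such a chain might look like this in Python, assuming each hop carries its target base64-encoded in a query parameter (the parameter name "url" and the depth limit are made up for the example; real redirectors differ):

  import base64
  from urllib.parse import urlsplit, parse_qs

  def unwrap_redirects(url: str, param: str = "url", max_depth: int = 10) -> str:
      # Follow a chain of redirectors whose target is base64-encoded in a
      # query parameter, stopping when there's nothing left to unwrap.
      for _ in range(max_depth):
          values = parse_qs(urlsplit(url).query).get(param)
          if not values:
              return url
          padded = values[0] + "=" * (-len(values[0]) % 4)  # restore padding
          try:
              decoded = base64.urlsafe_b64decode(padded).decode()
          except ValueError:  # not base64 after all
              return url
          if not decoded.startswith(("http://", "https://")):
              return url
          url = decoded
      return url

  inner = base64.urlsafe_b64encode(b"https://final.example/").decode()
  outer = base64.urlsafe_b64encode(
      f"https://tracker.example/?url={inner}".encode()).decode()
  print(unwrap_redirects(f"https://redirect.example/?url={outer}"))
  # -> https://final.example/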


I'd expect something like uBlock to strip this off.


If I remember correctly, there is a 1,024-character limit.


There isn't. HTTP doesn't define a limit, but browsers, servers, and other software can impose their own. A quick Google search turns up multiple sources putting Internet Explorer's limit at 2,083 characters and other browsers' limits in the tens of thousands of characters.

https://stackoverflow.com/questions/15090220/maximum-length-...



