
URL query parameters and how laxness creates de facto requirements on the web - todsacerdoti
https://utcc.utoronto.ca/~cks/space/blog/web/DeFactoQueryParameters
======
drewcsillag
Accepting random stuff like this is in the spirit of the protocols of the
internet.

Postel’s Law:

> Be conservative in what you do, be liberal in what you accept from others.

(Often reworded as "Be conservative in what you send, be liberal in what you
accept.")

Also known as the robustness principle
[https://en.m.wikipedia.org/wiki/Robustness_principle](https://en.m.wikipedia.org/wiki/Robustness_principle)

------
Tagbert
The article doesn’t seem to identify a problem caused by these unexpected
parameters. I would think any system that accepts input like this would need
to validate it and reject or ignore anything invalid anyway. Where is the
problem?
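
A minimal sketch of that approach in Python, using only the standard library
(the whitelisted names here are hypothetical, not from the article):

    from urllib.parse import urlsplit, parse_qs

    KNOWN_PARAMS = {"page", "sort"}  # hypothetical set of defined parameters

    def handle_query(url: str) -> dict:
        """Validate the known query parameters and silently drop the rest."""
        params = parse_qs(urlsplit(url).query)
        # Unknown parameters, including cache-busters like x=1, are ignored
        # rather than treated as errors.
        return {k: v for k, v in params.items() if k in KNOWN_PARAMS}

    print(handle_query("https://example.com/list?page=2&x=1"))
    # {'page': ['2']}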

I have been known to add a parameter like &x=1 to a page that fails to load
properly the first time. It can invalidate an incorrect cache and let the page
reload.
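
The trick works because caches are usually keyed on the full URL, so adding
any unused parameter makes the request look like a new resource. A rough
Python sketch of the same idea (a timestamp is used as the value here, but
x=1 works just as well, as long as the resulting URL differs from the cached
one):

    from time import time
    from urllib.parse import urlsplit, urlunsplit

    def cache_bust(url: str) -> str:
        """Append a throwaway query parameter so caches miss on the URL."""
        parts = urlsplit(url)
        extra = f"x={int(time())}"
        query = f"{parts.query}&{extra}" if parts.query else extra
        return urlunsplit(parts._replace(query=query))

    print(cache_bust("https://example.com/page"))
    # e.g. https://example.com/page?x=1718000000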

------
jbverschoor
It’s not laxness. It’s the stupid flexibility of certain protocols, APIs,
and languages.

Make things as strict as possible.
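
In code, strictness means rejecting unknown parameters outright instead of
tolerating them, so stray parameters can never turn into de facto
requirements. A minimal Python sketch of that stance (the allowed-parameter
set is hypothetical):

    from urllib.parse import urlsplit, parse_qs

    ALLOWED = {"id"}  # hypothetical: the only parameter this endpoint defines

    def parse_strict(url: str) -> dict:
        """Parse a query string, failing loudly on unexpected parameters."""
        params = parse_qs(urlsplit(url).query)
        unknown = set(params) - ALLOWED
        if unknown:
            raise ValueError(f"unexpected query parameters: {sorted(unknown)}")
        return params

    print(parse_strict("https://example.com/item?id=7"))  # {'id': ['7']}
    try:
        parse_strict("https://example.com/item?id=7&utm=x")
    except ValueError as e:
        print(e)  # unexpected query parameters: ['utm']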

