The zero-ness of NULL is a good example of the phenomenon I'm talking about. How much time have people wasted over the years tweaking and worsening code after somebody pointed out that some platform at some point might not have used the all-zero bit pattern for NULL? In how many cases did the program that was changed ever get anywhere close to such a strange, special-purpose, and likely obsolete computer? I'm guessing the answer is "vanishingly close to zero." Being strictly ANSI C correct (as opposed to relying on POSIX's stronger guarantees) comes with an engineering cost, and there's no sense paying that cost unnecessarily.
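To make this concrete, here's a minimal sketch of the kind of code that prompts these "fixes": it zeroes a struct with memset and then reads a pointer member back as NULL. Strict ANSI C doesn't promise this works (only the constant 0 in a pointer context is guaranteed to yield a null pointer), but it does work on essentially every machine the code will ever run on. The struct and field names here are purely illustrative.

```c
#include <stdio.h>
#include <string.h>

struct node {
    int          value;
    struct node *next;   /* pointer member we expect to come back as NULL */
};

int main(void)
{
    struct node n;

    /* Zero the whole struct.  Strictly speaking, ANSI C does not guarantee
     * that an all-zero bit pattern in n.next reads back as a null pointer;
     * in practice, on any platform you're realistically targeting, it does.
     */
    memset(&n, 0, sizeof n);

    if (n.next == NULL)
        printf("all-zero bit pattern read back as NULL\n");
    else
        printf("this is one of those strange platforms\n");

    return 0;
}
```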
It's one thing to do something because you found that it works. It's another to know what the rules are and why breaking them will work in this case, so that if you (or someone else) ever needs to port the code, you know the potential problem spots that will need attention.