Hacker News
thristian 877 days ago

The downside is maintainability. If your website follows the rules, you can be pretty confident that any weird behaviour you see is a problem with the browser (which is additional context you can use when googling for a solution). If your website requires browsers to quietly patch it into a working state, you have no guarantees that they'll all do it the same way and you'll probably spend a bunch of time working around the differing behaviour.

Obviously, that's not a problem if you already know exactly how different browsers will treat your code, or if you're relying on parsing errors so elemental that browsers must patch them up identically for the page to work at all. For example, the Google homepage doesn't escape ampersands that appear in URLs (like href="http://example.com/?foo=bar&baz=qux" — the & should be &amp;). That's a syntax error, but one that maybe 80% of the web commits, so any browser that couldn't handle it wouldn't be very useful.
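You can see this recovery in action with Python's standard-library HTML parser, which follows the same lenient entity handling: since "&baz" is not a recognised entity reference, the "&" is treated literally and the URL survives intact. (A minimal sketch — the URL and class name here are illustrative, not from any real page.)

```python
# Sketch of HTML5-style error recovery for an unescaped ampersand
# in an attribute value, using Python's stdlib parser.
from html.parser import HTMLParser

class HrefCollector(HTMLParser):
    """Collects href attribute values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.hrefs.append(value)

# The "&" before "baz" should strictly be written "&amp;", but
# because "&baz" is not a valid entity reference, the parser
# recovers by keeping the ampersand as a literal character.
parser = HrefCollector()
parser.feed('<a href="http://example.com/?foo=bar&baz=qux">link</a>')
print(parser.hrefs[0])  # the "&" comes through unchanged
```

Had the URL contained a real entity reference (say "&amp;"), the parser would have decoded it instead — which is exactly why every browser ends up handling this particular error the same way.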



yuhong 877 days ago

Particularly before HTML5.




