I spotted this earlier this week when ordering a t-shirt through TeeSpring using PayPal. I authorized a payment of 22.95 USD. Here’s a screenshot from the payment confirmation email I received: http://i.imgur.com/BGjKcsW.png The math doesn’t quite add up.
1. What are the differences? I've no idea what the significance of the source is.
2. The one you linked is hella unreadable. OP's table is, thanks to the color differences, extremely easy to skim, while the one you linked uses only very similar black-and-white x and checkmark symbols, which require closer inspection to tell apart.
I love http://caniuse.com but I wish they had more detail. For instance, on the XHR page, I'd like to know which browsers support the addEventListener syntax and which ones require the use of onreadystatechange.
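For concreteness, here are the two callback styles being contrasted. XMLHttpRequest is a browser-only API, so this sketch uses a minimal stub in its place purely to illustrate the syntax difference (the stub and its behavior are my own invention, not part of any real browser):

```javascript
// Minimal stand-in for XMLHttpRequest, just enough to show both styles.
class StubXHR {
  constructor() {
    this.onreadystatechange = null; // legacy style: a single handler property
    this.loadListeners = [];        // modern style: addEventListener('load', ...)
  }
  addEventListener(type, fn) {
    if (type === 'load') this.loadListeners.push(fn);
  }
  send() {
    // Pretend the request completed successfully.
    this.readyState = 4;
    this.status = 200;
    if (this.onreadystatechange) this.onreadystatechange();
    this.loadListeners.forEach(fn => fn());
  }
}

const results = [];

// Legacy style, the only option in old browsers:
const xhr1 = new StubXHR();
xhr1.onreadystatechange = function () {
  if (xhr1.readyState === 4) results.push('onreadystatechange fired');
};
xhr1.send();

// Modern style, using the standard DOM events model:
const xhr2 = new StubXHR();
xhr2.addEventListener('load', () => results.push('load listener fired'));
xhr2.send();
```

That's exactly the kind of per-feature nuance a caniuse table could surface.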
The point of typing a message in a webmail UI and sending it is to deliver the exact message you entered to the recipient. Adding hard line breaks, effectively altering the original message, is not useful.
Making text fit on an 80-character fixed-width screen is not something the email sender should do; the recipient’s email client should do it based on their preferences.
My exact use case was the following: the user clicks a bookmarklet that passes the current URL in the browser as a query string parameter to a URL shortener script. The validation is then performed before the URL is shortened.
In that scenario, and with the given requirements, I can’t think of a case where the validation fails. There’s no need to worry about protocol-relative URLs, etc.
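A minimal sketch of the kind of check I mean, in Node-flavored JavaScript (the function name and the http(s)-only rule are my own assumptions, not part of any particular shortener):

```javascript
// Validate a URL handed over by the bookmarklet before shortening it.
// The WHATWG URL parser accepts anything a browser address bar would,
// so all we add on top is an http(s)-only scheme check.
function isShortenable(raw) {
  let url;
  try {
    url = new URL(raw); // throws on strings no browser would treat as a URL
  } catch {
    return false;
  }
  return url.protocol === 'http:' || url.protocol === 'https:';
}
```

Since the input comes straight from the browser's own location bar, the try/catch branch should essentially never trigger in normal use; the scheme check just keeps things like javascript: links out.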
(Keep in mind that this page is 4 years old — I very well may have missed something.)
> If you really want your URL shortener to reject bad URLs, then you need to actually test fetching each URL (and even then...)
I disagree. http://example.com/ might experience downtime at some point, but that doesn’t mean it’s suddenly an invalid URL.
> As an aside, I'd instantly fail any library that validates against a list of known TLDs. That was a bad idea when people were doing it a decade ago. It's completely impractical now.
I still don't quite follow the purpose of the validation. Is it to guard against malicious use? In normal use, I would think that pretty much any URL that's good enough for the browser sending it would be good enough for the link shortener.
> Deviating from the formal spec because everyone practically agrees on how to do things, albeit differently than the spec says, is quite different from making shit up. It actually tends to be even harder than building things to spec: there's no easy reference to look things up in, so you might have to dig into the guts of existing implementations and talk to the people who built them to figure out what to do. And you would normally start with a spec-compliant implementation anyhow, and only add special cases for non-normative conventions later on.