Would be nice if this were open sourced so more items could be added by the community (also framework-specific checklists), but I like the concept.
One thing I would add, which is driving me crazy on mobile/tablet sign-up pages:
- make sure your email fields are annotated with type="email"
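For what it's worth, it's a one-attribute change; the `autocomplete` and `required` attributes below are extras I'd suggest, not part of the original point:

```html
<!-- type="email" brings up the @-symbol keyboard on most mobile/tablet
     devices and enables basic built-in format validation -->
<input type="email" name="email" autocomplete="email" required>
```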
Another common issue is the SSL mixed-content warning, so I would also add
- make sure to use protocol relative / https only URLs
(with a reminder NOT to use protocol-relative URLs in email templates; your Outlook users will appreciate it)
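To make the options concrete (example.org is just a placeholder):

```html
<!-- Hard-coded http: triggers mixed-content warnings on https pages -->
<script src="http://example.org/app.js"></script>

<!-- Protocol-relative: inherits the page's scheme (fine on web pages) -->
<script src="//example.org/app.js"></script>

<!-- https-only: safe everywhere, including email templates -->
<script src="https://example.org/app.js"></script>
```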
It is open source, but I can't seem to find a license on that page, so it may not be free software. I would be reluctant to modify and redistribute copies of it.
Protocol relative URLs for stylesheets unfortunately cause a double download (one for both HTTP and HTTPS) in IE 8 and below, which is a damn shame. Paul Irish has a lot of info on this here: http://paulirish.com/2010/the-protocol-relative-url/
> Would be nice if this was open sourced so more items could have been added by the community (also framework specific checklists) but I like the concept
Just to share, since you mention framework-specific: a similar concept has existed for a while for the PHP symfony (version 1) framework (not official, but I quite liked it back then).
My site just got new styling the other day, as I work on updating my seventeen-year-old personal website.
There are still a LOT of websites that make several of those top ten mistakes. They are higher priority than many of the other issues mentioned on the checklist kindly submitted here. As other comments here have pointed out, it's desirable in a checklist to establish priorities.
No. 5 on that list is no longer valid; fixed font sizes were only an issue with IE6, where fonts specified in pixels wouldn't respond to the user's font-size setting.
A no-www domain might not be the best solution if you ever want a 'cookie-free domain' (static.) for images etc., which speeds up your site. If you start with a no-www domain you have to set up a different domain (not a subdomain) for it, like sstatic.net for SO, ytimg.com for YT, and yimg.com for Yahoo.
When the browser makes a request for a static image and sends cookies together with the request, the server doesn't have any use for those cookies. So they only create network traffic for no good reason. You should make sure static components are requested with cookie-free requests. Create a subdomain and host all your static components there.
If your domain is www.example.org, you can host your static components on static.example.org. However, if you've already set cookies on the top-level domain example.org as opposed to www.example.org, then all the requests to static.example.org will include those cookies. In this case, you can buy a whole new domain, host your static components there, and keep this domain cookie-free.
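A sketch of the cookie-scoping behaviour behind this, assuming the www.example.org / static.example.org setup above:

```http
# Host-only cookie (no Domain attribute, set from www.example.org):
# sent back only to www.example.org, so static.example.org stays cookie-free
Set-Cookie: session=abc123; Path=/

# Domain cookie scoped to the registrable domain:
# sent to every subdomain, including static.example.org
Set-Cookie: session=abc123; Domain=example.org; Path=/
```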
Bear in mind that making sure requests for static content don't send cookies is pretty far down the front-end optimisation ladder - there are normally a lot of things you can do first that are quicker, easier, and have a bigger impact.
Well, another thing is that it's easier to set up GeoIP stuff on a CNAME (so www) than on the root A records, for now at least in PowerDNS (used by Wikipedia etc.). You're completely right, but if your site ever scales to something big you're not in the best position with no-www; in my view having a www record (and a no-www redirect) has more benefits than a no-www.
no-www is just harmful. force 'www.' instead. why? shitty URL parsers, marketing people and DDoS attacks, that's why.
let's imagine you write a
- blog post
- blog comment
- press release (distributed via free and paid press release services)
- mail
- word
- forum post
- ...
- ...
if you have a non-www URL it's a game of chance whether your in-text "whatever.tld" domain will get transformed into a clickable link. yes, a lot of modern URL parsers will transform whatever.com into a clickable link, some will even transform whatever.in into a usable link, but a lot of old, shitty, idiotic, strange URL parsers won't. and well, a big part of the web, i would say most of it, is not up to date. so using non-www will lead to a loss of inlinks and to a poor user experience for users who want to reach your site but can't click on the in-text domain (they need to copy/paste instead)
and the situation will get worse with the new commercial TLDs to come.
yes, you can - in most cases - force a domain-to-link conversion in most CMSes if you write http:// in front of it. but well, in a promo text most marketing/PR people will not write "and http://whatever.tld has a new feature to give people endless bliss", they will write "whatever.tld has a new ....".
oh, and by the way: whenever a journalist writes a piece about you, in print or online, they will always (or at least in a lot of cases) write www in front of your domain anyway. yeah, that's not an issue if you have redirects in place, just annoying if you have a non-www web property.
plus
having a subdomain is another layer of defense against DDoS attacks. see this discussion on hacker news from may 18 2011 (my birthday, by the way): http://news.ycombinator.com/item?id=2575266
And yet, I find no-www so much cleaner. With 301s it's generally not a problem, and link parsers will look for the protocol anyway. I think the only valid point is mitigating DDOS attacks, but I don't know enough about that subject to comment.
but in marketing, as in mails and comments, you or your loyal users do not always write http:// in front of your domain.
i consulted a sh-tload of companies on this question (and yes, i also think i have better things to do); any company that chooses non-www URLs regrets it down the road.
I'm sure just about anyone who has used the web for any length of time has hit the standard apache "Not found" page hundreds of times now and pretty much knows what it means.
Custom 404 pages are often quite confusing, as they will try to be clever and redirect you to other content that may be interesting. Sometimes these aren't clear and give the impression that the link was not broken and that this is where the site designer intended you to go, which leaves you looking around the page for the content you thought you were going to get.
"Custom 404 pages are often quite confusing, as they will try to be clever and redirect you to other content that may be interesting."
I agree, however I also believe that is the intent of filing it under "usability". It isn't usability as you would commonly define it, a good UX, but rather keeping the UX of the site consistent across all states, even failure, and giving the user an entry point back in to the rest of the site. A default Apache 404 does not do this, it's just a flat white page, with your only option being to go back from whence you came. If that wasn't your site, then the perception is you've lost a potential visitor, and that potentially could've been avoided with a custom 404 page.
I prefer the 404 pages that say something to the effect of "Sorry, that is broken" and then include the results of a site search of the keywords or friendly URL that was provided.
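A minimal sketch of the keyword-extraction half of that pattern; the `/search?q=` endpoint and the helper name are made up for illustration:

```python
# Hypothetical helper: map a 404'd friendly URL to a site-search link.
from urllib.parse import quote_plus

def suggest_search_url(broken_path: str) -> str:
    """Turn a broken path like /blue-widgets into a search suggestion."""
    # Friendly URLs usually separate keywords with - or _
    keywords = broken_path.strip("/").replace("-", " ").replace("_", " ")
    return "/search?q=" + quote_plus(keywords)
```

Your 404 handler can then render "Sorry, that is broken" along with a link to (or the inlined results of) that search.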
A document that bills itself as "The ultimate checklist for all serious web developers" should not hide most of its content (via CSS) and require trusting some unknown author's JavaScript to display it.
Security is much more dependent on the site itself though; it's not as "general". Do you have forms? Then watch out for SQL injection. Do you have user input of any type? Watch for XSS. Admin login page? Consider HTTPS. Something like a favicon can apply to every site, not so much with security practices. The idea of just having a "security checklist" is a bit worrisome in itself. The developer in charge should be familiar with the potential dangers as they program a feature; it shouldn't be an afterthought from a checklist.
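To make the SQL-injection point concrete, a minimal sketch using Python's built-in sqlite3 module; the table and helper are invented for the example. The point is to bind parameters rather than interpolate user input into the SQL string:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.org')")

def find_user(conn, name):
    # Unsafe version for contrast:
    #   conn.execute(f"SELECT * FROM users WHERE name = '{name}'")
    # where name = "' OR '1'='1" would match every row.
    # Safe version: the driver treats the value as data, never as SQL.
    return conn.execute("SELECT * FROM users WHERE name = ?", (name,)).fetchall()
```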
I was thinking about this a couple of days ago. The way I would do it is to submit it individually to each of the checks (e.g. W3C validator) and scrape the results. There may be APIs available for some, I've not looked into that.
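For the W3C markup check specifically, the Nu HTML checker does expose a machine-readable interface (POST the document, get JSON back). A rough Python sketch; the endpoint and header match the public checker but are worth double-checking against the current service docs:

```python
import urllib.request

CHECKER_URL = "https://validator.w3.org/nu/?out=json"

def build_validation_request(html: bytes) -> urllib.request.Request:
    """Prepare a POST of raw HTML; the caller decides when to actually send it."""
    return urllib.request.Request(
        CHECKER_URL,
        data=html,
        headers={"Content-Type": "text/html; charset=utf-8"},
        method="POST",
    )

# Sending it (network access required):
#   with urllib.request.urlopen(build_validation_request(page_bytes)) as resp:
#       results = json.load(resp)["messages"]  # list of error/warning dicts
```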
This drives me nuts. On the surface there is very little difference between "just functional" and "production ready". And it's a hard sell if the client is not aware of the benefits.
I have a sublist of that, which I built over the last ~year of hacking on web projects. One of my biggest to-dos in each project is automating stuff like validation. I still haven't really found a good way, so I either go to w3c and check everything once in a while or I just don't. Usually I just don't.
This to me is like a checklist of things to automate. Is there any "build" system for the web?
I built a public trello board from this list: not quite sure if that's the best presentation (should it be one card for each heading?), but ideally people would clone it to work on their own sites, and make contributions of new cards/info for existing cards on the main board. https://trello.com/b/hkC4B6HA
Is it time for feature requests? It'd be great to get integration with common management tools (e.g. Github Issues, Trello), so that the list can automatically be imported for a given milestone.