Hacker News
Web Developer Checklist (webdevchecklist.com)
426 points by grantpalin on Jan 7, 2013 | 75 comments

Liked the favicon part, very often forgotten...

Would be nice if this were open sourced so more items could be added by the community (also framework-specific checklists), but I like the concept

One thing I would add, which is driving me crazy on mobile/tablet sign-up pages:

  - make sure your email fields are annotated with type="email"
Another common issue is SSL mixed-content warnings, so I would also add:

  - make sure to use protocol-relative / https-only URLs
(with a reminder NOT to use protocol-relative URLs in email templates; your Outlook users will appreciate it)
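A minimal Python sketch of the https-only suggestion above; the helper names (`force_https`, `absolutize`) are my own, not from any library:

```python
from urllib.parse import urlsplit, urlunsplit

def force_https(url):
    """Rewrite an http:// URL to https://; leave other schemes alone."""
    parts = urlsplit(url)
    if parts.scheme == "http":
        parts = parts._replace(scheme="https")
    return urlunsplit(parts)

def absolutize(url, scheme="https"):
    """Expand a protocol-relative URL (//host/path) to a full URL, for
    contexts like email templates where protocol-relative links break."""
    if url.startswith("//"):
        return "%s:%s" % (scheme, url)
    return url
```

Running asset URLs through something like this at template-render time catches mixed-content warnings before they hit production.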

> Would be nice if this were open sourced so more items could be added by the community


It is open source, but I can't seem to find a license on that page, so it may not be free software. I would be reluctant to modify and redistribute copies of it.

An Apache 2.0 license has just been added.

The favicon is only forgotten by those who never check their logs.

(Which should be part of the checklist: check your friggin' logs instead of assuming you never miss anything.)

Protocol-relative URLs for stylesheets unfortunately cause a double download (once over HTTP and once over HTTPS) in IE 8 and below, which is a damn shame. Paul Irish has a lot of info on this here: http://paulirish.com/2010/the-protocol-relative-url/

> Would be nice if this were open sourced so more items could be added by the community (also framework-specific checklists), but I like the concept

Just to share, since you mention framework-specific: a similar concept has existed for a while for the PHP Symfony (version 1) framework (not official, but I quite liked it back then)


I see that Jakob Nielsen's venerable "Top 10 Mistakes in Web Design" checklist just got new styling the other day, as I work on updating my seventeen-year-old personal website.

There are still a LOT of websites that make several of those top ten mistakes. They are higher priority than many of the other issues mentioned on the checklist kindly submitted here. As other comments here have pointed out, it's desirable in a checklist to establish priorities.

No. 5 on that list is no longer valid; fixed font sizes were only an issue in IE6, where fonts specified in pixels wouldn't respond to the user's font size setting.

Nice list, but I think it can be condensed into one:

Use your own site and make sure you don't hate it yourself.

Simple but effective advice.

A no-www domain might not be the best solution if you ever want a 'cookie-free domain' (static.) for images etc., which speeds up your site. If you start with a no-www domain you have to set up an entirely different domain (not a subdomain) for it: like sstatic.net for SO, ytimg.com for YouTube, and yimg.com for Yahoo.

When the browser makes a request for a static image and sends cookies together with the request, the server doesn't have any use for those cookies. So they only create network traffic for no good reason. You should make sure static components are requested with cookie-free requests. Create a subdomain and host all your static components there.

If your domain is www.example.org, you can host your static components on static.example.org. However, if you've already set cookies on the top-level domain example.org as opposed to www.example.org, then all the requests to static.example.org will include those cookies. In this case, you can buy a whole new domain, host your static components there, and keep this domain cookie-free.
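The reason cookies set on the bare domain leak to every subdomain comes down to cookie domain matching (RFC 6265). A rough Python sketch of that rule, simplified and ignoring public-suffix handling:

```python
def domain_match(host, cookie_domain):
    """Simplified RFC 6265-style check: would a cookie scoped to
    cookie_domain be sent with requests to host?"""
    cookie_domain = cookie_domain.lstrip(".")  # a leading dot is ignored
    return host == cookie_domain or host.endswith("." + cookie_domain)
```

So a cookie scoped to example.org matches static.example.org, while one scoped to www.example.org does not; and a separate domain like sstatic.net never matches at all, which is exactly why those sites bought one.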


I'm not sure that works well with SPDY; it would prefer the same domain?

I never considered this before. A great tip, thanks

Bear in mind that making sure requests for static content don't send cookies is pretty far down the front-end optimisation ladder - there are normally a lot of things you can do first that are quicker, easier, and have a bigger impact.

Well, another thing is that it's easier to set up GeoIP stuff on a CNAME (so www) than on the root A records, for now at least in PowerDNS (used by Wikipedia etc.). You're completely right, but if your site ever scales to something big you're not in the best position with no-www; in my view, having a www record (and a no-www redirect) has more benefits than no-www.

Interestingly enough this website doesn't have:

1) Custom 404 page

2) robots.txt

3) PICS label

4) viewport meta-tag

5) Google Rich Snippets

6) A clean pass on the recommended CSS validator

It has the top items now.

Does it? I typed in http://webdevchecklist.com/robots.txt and it gave me a bog-standard IIS 404 page, not a custom one.

At least it was nice enough to reveal a bunch of server information, right?

Now it redirects. :X

Sorry, but that

   Remove 'www' subdomain
is just harmful. Force 'www.' instead. Why? Shitty URL parsers, marketing people, and DDoS attacks, that's why.

Let's imagine you write a

  - blog post
  - blog comment
  - press release (distributed via free and paid press release services)
  - mail
  - word
  - forum post
  - ...
  - ...
If you have a non-www URL, it's a game of chance whether your in-text "whatever.tld" domain will get transformed into a clickable link. Yes, a lot of modern URL parsers will transform whatever.com into a clickable link, and some will even transform whatever.in into a usable link, but a lot of old, shitty, idiotic, strange URL parsers won't. And a big part of the web, I would say most of it, is not up to date. So using non-www will lead to a loss of inlinks and to a poor experience for users who want to reach your site but can't click on the in-text domain (they need to copy/paste instead).

And the situation will get worse with the new commercial TLDs to come.

Yes, you can, in most cases, force a domain-to-link conversion in most CMSes if you write http:// in front of it. But in a promo text most marketing/PR people will not write "and http://whatever.tld has a new feature to give people endless bliss"; they will write "whatever.tld has a new ....".

Oh, and by the way: whenever a journalist writes a piece about you, in print or online, they will always (or at least in a lot of cases) write www in front of your domain anyway. That's not an issue if you have redirects in place, just annoying if you have a non-www web property.


Having a subdomain is another layer of defense against DDoS attacks. See this discussion on Hacker News from May 18, 2011 (my birthday, by the way): http://news.ycombinator.com/item?id=2575266

Go for www.

And yet, I find no-www so much cleaner. With 301s it's generally not a problem, and link parsers will look for the protocol anyway. I think the only valid point is mitigating DDOS attacks, but I don't know enough about that subject to comment.

But in marketing, as in mails and comments, you or your loyal users do not always write http:// in front of your domain.

I consulted a sh-tload of companies on this question (and yes, I also think I have better things to do); any company that chooses non-www URLs regrets it down the road.

301 redirects.

That only solves the annoyance part (wrong www URLs from journalists), not the shitty-URL-parser, marketing-people, and DDoS issues.
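The 301-redirect approach can be sketched as a tiny piece of middleware. This is a hypothetical WSGI example assuming a www-preferred policy (query strings and edge cases omitted for brevity):

```python
def force_www(app):
    """WSGI middleware: 301-redirect bare-domain requests to www."""
    def middleware(environ, start_response):
        host = environ.get("HTTP_HOST", "")
        if host and not host.startswith("www."):
            scheme = environ.get("wsgi.url_scheme", "http")
            location = "%s://www.%s%s" % (scheme, host,
                                          environ.get("PATH_INFO", "/"))
            # 301 (permanent) so browsers and search engines update
            start_response("301 Moved Permanently",
                           [("Location", location)])
            return [b""]
        return app(environ, start_response)
    return middleware
```

The same shape works in reverse for a no-www policy; the important part is that exactly one canonical host answers with content and the other always 301s.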

Custom 404 page under usability? Hmm.

I'm sure just about anyone who has used the web for any length of time has hit the standard apache "Not found" page hundreds of times now and pretty much knows what it means.

Custom 404 pages are often quite confusing, as they will try to be clever and redirect you to other content that may be interesting. Sometimes these aren't clear and give the impression that the link was not broken and that this is where the site designer intended you to go, which leaves you looking around the page for the content you thought you were going to get.

"Custom 404 pages of often quite confusing as they will try to be clever and redirect you to other content that may be interesting."

I agree; however, I also believe that is the intent of filing it under "usability". It isn't usability as you would commonly define it, a good UX, but rather keeping the UX of the site consistent across all states, even failure, and giving the user an entry point back into the rest of the site. A default Apache 404 does not do this; it's just a flat white page, with your only option being to go back from whence you came. If that wasn't your site, then the perception is you've lost a potential visitor, and that could potentially have been avoided with a custom 404 page.

I prefer the 404 pages that say something to the effect of "Sorry, that is broken" and then include the results of a site search on the keywords or friendly URL that was provided.

It's less confusing and keeps people on site.

If you do this (which you should, in my opinion), please return the 404 status code. For example, Facebook used to return 200 on error. Very confusing.

Absolutely, I assumed that was a given.
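A minimal sketch of that point: serve a friendly custom error page while still returning the real 404 status. This is illustrative bare WSGI, not any framework's actual API, and the page body is just a placeholder:

```python
def not_found_app(environ, start_response):
    """Serve a custom 404 page, but keep the 404 status code so
    browsers and crawlers know the page really is missing."""
    body = b"<h1>Sorry, that page is broken or missing.</h1>"
    start_response("404 Not Found",
                   [("Content-Type", "text/html"),
                    ("Content-Length", str(len(body)))])
    return [body]
```

Returning 200 with an error body (a "soft 404") is what gets broken pages stuck in search indexes.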

Would be nice to have this automatically generated for a given URL.

A document that bills itself as "The ultimate checklist for all serious web developers" should not hide most of its content (via CSS) and require trusting some unknown author's javascript to display it.

It's sad how "Security" there's only one very generalizing item. "Implement best practices". Right.

Is the author just ignorant, or am I a fool for thinking that, if anything, "Security" should have the most elaborate items?

Security checklist: https://www.owasp.org/index.php/Category:OWASP_Application_S...

Lowest level 1a has 22 things to verify, highest level 4 has 121 things to verify. That's a lot of checkboxes.

Security is much more dependent on the site itself, though; it's not as "general". Do you have forms? Then watch out for SQL injection. Do you have user input of any type? Watch for XSS. Admin login page? Consider HTTPS. Something like a favicon can apply to every site, not so much with security practices. The idea of just having a "security checklist" is a bit worrisome in itself. The developer in charge should be familiar with the potential dangers as they program a feature; it shouldn't be an afterthought from a checklist.
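The two dangers named above (SQL injection and XSS) do have standard mitigations: bound parameters and output escaping. A small illustrative sketch using only Python's standard library:

```python
import sqlite3
from html import escape

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES (?)", ("alice",))

# SQL injection: never interpolate user input into the query string;
# pass it as a bound parameter instead.
user_input = "alice' OR '1'='1"
rows = conn.execute("SELECT name FROM users WHERE name = ?",
                    (user_input,)).fetchall()
# rows is [] -- the malicious string is treated as a literal value

# XSS: escape user-supplied text before echoing it back into HTML.
safe = escape("<script>alert(1)</script>")
```

Which checks apply is, as the comment says, feature-dependent, but these two patterns cover a large share of the OWASP list.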

Yes, having robots and favicon is nice, but few items on this list can embarrass / kill a company like bad security can.

I would add one: Make sure your log-in form is uncomplicated so that browsers can remember passwords correctly.

So "SEO" has four different checkboxes but "Security" has just one: "Implement best practices"

Uh... I think that can be broken down into at least two different things...

The second being 'cross your fingers'...?

What would be the best approach to automate this so I could put in a URL and it detects as much as it can about the website?

I was thinking about this a couple of days ago. The way I would do it is to submit it individually to each of the checks (e.g. W3C validator) and scrape the results. There may be APIs available for some, I've not looked into that.
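One way to start automating that, as a sketch: map checklist items to the URLs that would verify them, then fetch and scrape each. The item names and paths below are assumptions of mine, and the actual fetching/scraping step is left out:

```python
from urllib.parse import urljoin

# Hypothetical mapping of checklist items to the path whose HTTP
# status would verify them (favicon/robots should 200; a nonsense
# path should 404 with a custom page, per the checklist).
CHECKS = {
    "favicon": "/favicon.ico",
    "robots.txt": "/robots.txt",
    "custom 404": "/this-page-should-not-exist",
}

def check_urls(base):
    """Return the concrete URL to fetch for each checklist item."""
    return {name: urljoin(base, path) for name, path in CHECKS.items()}
```

From there, an HTTP client plus per-item expectations (status code, markup present) gets you the "submit a URL, get a report" tool people are asking for; external services like the W3C validator would need their own scraping or API calls.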

Good stuff, I'll probably use this for clients who say, "What have you been doing? It looks done to me!"

This drives me nuts. On the surface there is very little difference between `just functional' and `production ready'. And it's a hard sell if the client is not aware of the benefits.

I have a sublist of that, which I built over the last ~year of hacking on web projects. One of my biggest to-dos in each project is to automate stuff like validation. I still haven't really found a good way, so I either go to the W3C validator and check everything once in a while, or I just don't. Usually I just don't.

This to me is like a checklist of things to automate. Is there any "build" system for the web?

Great start. Wish it were set up to be collaborative so we could suggest some of the missing elements.

Perhaps you could fork it (https://github.com/ligershark/webdevchecklist.com) and send a pull request.

Should probably be a shared card on Trello.

Solid functionality that I'll personally use. Good work.

I did something similar - a pre-launch checklist in which you might find some useful things to add to your list: https://bitbucket.org/steerpike/checklist

Nice work. I use this at the moment, http://lite.launchlist.net/, as it has more checks and, well, a prettier interface.

I'd add: check your SSL certificate installation using a tool like this: http://certlogik.com/ssl-checker/

Cool, I haven't seen that one before. I also like this one: https://www.ssllabs.com/ssltest/

I built a public Trello board from this list; not quite sure if that's the best presentation (should it be one card for each heading?), but ideally people would clone it to work on their own sites, and contribute new cards / info for existing cards on the main board. https://trello.com/b/hkC4B6HA

What do people here think of no-www? I personally hate www and see no reason to unnecessarily increase my url length by 4 characters.

sorry, saw this too late, please see my comment above https://news.ycombinator.com/item?id=5025293

Wow, I am building an automated tool right now for exactly that. It is in private beta. Anyone interested in test-driving it, drop me a line!

http://site-analytics.org/ The intro is already outdated, I'll make a new one very soon.

Thx for sharing. For a newbie in web dev (like me), it is a great resource. Bookmarked.

I have a template project in Basecamp that contains a similar kind of list of tasks that I use to launch all projects.

Just have to create a new project with the template for each launch and then work your way down.

Is it time for feature requests? It'd be great to get integration with common management tools (e.g. Github Issues, Trello), so that the list can automatically be imported for a given milestone.

And people say web development isn't difficult.

Really good for showing customers/bosses/coworkers why a site needs those extra hours of love after the proof-of-concept stage.

Security > Cross-site scripting > XSS cheat sheet link is broken. Funny, considering the first item in the checklist.

This is cool, I expect a lot of people could get use out of this. The security section is kind of amusing, though.

The spellcheck item didn't have a link. Any good spellchecker bots out there?

You might want to make your links open in a new tab/window.

No. That's what the back button, middle-click, or (cmd/ctrl/whatever it's mapped as) + click are for.

I prefer links away from 'apps' like this site to open in a new window or tab. I'd like the app to preserve its state, not replace itself.

No clean URLs?

How about setting up automated backups?

Clean URLs aren't really necessary and can be difficult with some frameworks.

Automated backups also aren't necessary for all sites, particularly if the entire site is in a source repository somewhere and doesn't have users.

Clean URLs are just as useful and visible as the favicon, or custom error pages, or many other "unnecessary" features.

You need to back up production sites. A repo could do that, but then it's just another backup system that needs to be implemented and verified.

Nice list. Missing: HiDPI images.

Big time bookmark. Thanks!

+1 thanks for this!
