
Some web development tips from a former Digg developer - cancan
http://duruk.net/some-web-development-tips/
======
kilburn
> Know why it’s important that your GET requests should always be idempotent.
> For example, your sign-out should only work as a POST request so that
> someone cannot make your users sign out by just including an <img> tag in
> their forum signature.

You got that mixed up. Idempotent requests mean that the result is exactly the
same even if the request is issued multiple times. In the case of a logout,
idempotency is pretty much granted even when using GET requests. The idea here
is that a GET request should not change the state of the application, because
browsers are happy to open the same URL multiple times without user
confirmation. For instance, a "/posts/delete/last" URL that deletes your last
post would be a terrible idea, because of the following scenario:

    
    
      1. The user goes to the "post list", */posts*
      2. The user hits "delete last post", and the web sends him to */posts/delete/last*
      3. The user goes somewhere else, */somewhere*
      4. The user decides he would like to go back, and clicks "back". His browser opens */posts/delete/last* without any warning. _Oops!_ He has just deleted another post without even noticing!
    

The <img> URL issue is a separate concern: that of Cross-Site Request Forgery
(CSRF). The easiest way to protect from this security issue is to require a
single-use token for each request that changes the application state. You can
read more about it at the Open Web Application Security Project website:
[https://www.owasp.org/index.php/Cross-Site_Request_Forgery_%...](https://www.owasp.org/index.php/Cross-Site_Request_Forgery_%28CSRF%29_Prevention_Cheat_Sheet)

~~~
Androsynth
logout is not idempotent. If you send a log-out request, the system will log
you out. All subsequent attempts will do nothing.

Therefore the results are different for different states and it is not
idempotent.

~~~
mitchellh
I don't think you quite understand the definition of idempotent. A process is
idempotent if it can be applied multiple times without changing the result
achieved from the first application.

In this case, logging out multiple times does not change anything from the
first application.

~~~
njharman
you're making way too many assumptions about what logout does behind the
scenes. "without changing the result" != "without changing state"

sending notifications, updating counters, etc. all could be result of logging
out.

~~~
kilburn
I think it is easy for us to agree that, from the client's point of view,
logging out of a website is idempotent.

Now, you've got a point about idempotence from the server's point of view.
However, it would take a _badly_ programmed website for the logout operation
to _not_ be idempotent. Sending notifications, updating counters, etc. _without
first checking whether the user is really logged in_ is simply moronic. This
simple check is what would make the logout operation idempotent on the server
too.
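That guard is one line; a sketch with made-up names for the session object and the side-effecting counter:

```javascript
// Hypothetical analytics counter standing in for notifications,
// counter updates, and other logout side effects.
let analytics = { logoutsRecorded: 0 };

function logout(session) {
  // The check that makes the operation idempotent on the server:
  // side effects only run if the user is actually logged in.
  if (!session.loggedIn) return;

  session.loggedIn = false;
  analytics.logoutsRecorded += 1;
}
```

Calling `logout` a second time on the same session is a no-op, so the server-side result of N requests equals the result of one.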

------
ashray
CSS/JS - Make sure you load these externally so that the browser can actually
cache them.

CDNs - Make sure you add a Cache Control (max-age) header to your CDN sync.
This doesn't happen automatically through most syncing mechanisms. Helps you
save on those pesky HTTP requests that cost $$$.

Gzip - Do not gzip images. It's not worth it. For HTML/JS - YES!

Javascript - If you have ads, definitely load them asynchronously (they go
through multiple servers and take ages..). This is really important as you
want your document.ready to fire asap so that your page is usable.

POST - Always redirect after a post request to prevent reloads causing re-
submits.

Forms - always have a submit button for accessibility.

Usability - Try using your site with a screen reader, don't neglect vision
impaired people. (there are apparently a lot of them!)

data-x attributes will destroy your W3C validator checks. Use them only if
validation isn't important to you. (sometimes it just is...)

For external scripts that use document.write go take a look at Writecapture.
It's a document.write override which will make your external scripts
asynchronous. (<https://github.com/iamnoah/writeCapture>)

I don't see why counts and pagination are such a big deal. I've done them
correctly multiple times. Faceting might be hard though ;) Showing counts is a
useful usability feature. (or at least show a count when there is nothing -
i.e. a zero count)

Those are the ones that I could think of right now. :) Great article, some
good points in there!
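The redirect-after-POST tip above is the Post/Redirect/Get pattern. A minimal sketch of a request handler (routes, the in-memory store, and the `handle` signature are all illustrative):

```javascript
// In-memory store standing in for the database.
const comments = [];

// Dispatch one request; returns { status, headers, body }.
function handle(method, url, body) {
  if (method === 'POST' && url === '/comments') {
    comments.push(body);
    // 303 See Other: the browser follows up with a GET, so hitting
    // reload on the resulting page cannot re-submit the form.
    return { status: 303, headers: { Location: '/comments' } };
  }
  if (method === 'GET' && url === '/comments') {
    return { status: 200, headers: {}, body: comments.join('\n') };
  }
  return { status: 404, headers: {} };
}
```

Because the page the user ends up on was fetched with GET, refreshing it is safe; only the transient POST response carried the side effect.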

~~~
rauar
Redirect after POST is not sufficient. Think about users clicking twice
quickly (by accident or intention) before the browser receives the redirect.
CSRF tokens could help if in place, however disabling the trigger until the
redirect arrives is better. Of course this does not solve double-submits using
Ajax.

~~~
FuzzyDunlop
Ajax or no, disabling the submit button/event target is pretty trivial
compared to the complexity of doing the rest of your app. Just disable the
button when it's clicked, or add a disabled class to the link and a separate
click event for it that stops propagation. Or even simpler, just hide it. And
do it before it gets around to sending the request.
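The guard described here boils down to a once-only wrapper around the handler; a sketch (the commented-out DOM wiring at the end is illustrative and browser-only):

```javascript
// Wrap a handler so it fires at most once; later calls are no-ops.
// This is the essence of "disable the button when it's clicked".
function once(handler) {
  let fired = false;
  return function (...args) {
    if (fired) return;
    fired = true;
    return handler.apply(this, args);
  };
}

// Illustrative browser usage (not runnable outside a browser):
// const btn = document.querySelector('#submit');
// btn.addEventListener('click', once(() => {
//   btn.disabled = true; // visible feedback as well
//   btn.form.submit();
// }));
```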

~~~
rauar
That's what I tried to say.

Regarding Ajax, I just wanted to make clear that it's pointless to wait for
the redirect... even if it's sent back as the response, it would not cause
anything like showing a different page afterwards.

------
mikegirouard
> Trying to load JavaScript dynamically is a good idea but a lot of the time,
> it’s not worth the effort if you can keep your JavaScript to a sensible size
> and load it all at once. This also helps with consequent page visits being
> fast.

As long as we're talking client-side, I couldn't agree more. It seems no
matter how much I try to make things "easier" with YUI Loader or some clever
AMD + loader solution, it always turns out to be a headache.

~~~
mattacular
Agreed. My new system is that all critical JS (eg. anything not related to
ads, tracking, social buttons, etc.) should be loaded all at once with the
rest of the DOM. Then there is a separate async/lazy-load track for that other
crap.

------
heyitsnick
> == is bad. Don’t ever use it.

Could someone expound for the ignorant?

~~~
bcherry
Lazy response, excerpted from a blog post[1]:

One particular weirdness and unpleasantry in JavaScript is the set of equality
operators. Like virtually every language, JavaScript has the standard ==, !=,
<, >, <=, and >= operators. However, == and != are NOT the operators most
would think they are. These operators do type coercion, which is why [0] == 0
and "\n0\t " == 0 both evaluate to true. This is considered, by sane people,
to be a bad thing. Luckily, JavaScript does provide a normal set of equality
operators, which do what you expect: === and !==. It sucks that we need these
at all, and === is a pain to type, but at least [0] !== 0.
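The coercion cases from the excerpt, runnable as-is:

```javascript
// == coerces both sides before comparing: [0] becomes "0", then 0,
// and the whitespace-padded string parses to the number 0.
console.log([0] == 0);        // true
console.log("\n0\t " == 0);   // true

// === compares without coercion: different types are never equal.
console.log([0] === 0);       // false
console.log([0] !== 0);       // true
```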

[1]: Post from my blog, but I'm not linking to it because the rest is not
useful to answer this question and I don't want to come across as a self-
promoting-link-whore :)

~~~
wingerlang
It is not self promotion if someone asks for it. So how about a link.

~~~
logn
<http://www.adequatelygood.com/2010/3/Performance-of-vs->

------
kenster07
Using minimal javascript - some webapps nowadays have the exact opposite
philosophy, and there are client-side js frameworks which facilitate that

~~~
cancan
yeah, it definitely doesn't apply to everything. part of the reason is that
Digg was mostly a read-only site: it had to be fast and it had to work. I
wouldn't advise the same thing for a real "web app".

~~~
egfx
I couldn't agree more. That's what I was thinking as I was reading it. News
site, stuck, speedy but boring to some degree. The data-x attribute example
rang home that I should take only bits of the advice in my situation.

------
dyscrete
It's surprising a former digg developer describes pagination as hard. Really?

~~~
ralfn
If you browse 1-100, then 100-200, chances are you are going to see the same
links twice, and miss other links, just because the result set changed between
the two requests. And caching a snapshot per user seems a bit expensive.

Pagination is a mess on Reddit and HN, so maybe he considers pagination "hard
to get right" because no social news aggregator gets it right.
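One common fix for the shifting-result-set problem is keyset (cursor) pagination: instead of an offset, ask for "items older than the last one I saw". A sketch over an in-memory array (names are illustrative; in SQL this would be a `WHERE id < ?` clause):

```javascript
// posts are ordered newest-first by a monotonically increasing id.
function pageAfter(posts, cursorId, pageSize) {
  // Offset pagination breaks when items are inserted or removed
  // between requests; filtering on the last-seen id does not.
  return posts
    .filter(p => cursorId === null || p.id < cursorId)
    .slice(0, pageSize);
}
```

New submissions arriving between page loads shift every offset, but they never change which ids are smaller than the cursor, so nothing is duplicated or skipped.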

~~~
da_n
Totally agree, pagination is painfully broken on HN.

------
nixarn
I just tried ImageOptim on some PNGs. I think it performs poorly :S

~~~
anatoli
Make sure to configure it first, you pretty much want to max out all the
settings (and enable all the different tools) otherwise you will get
underwhelming results.

------
mattacular
"For example, scroll events fire only after the scroll has finished on mobile
browsers"

This is not accurate at all for iOS Safari and Chrome... I just wrote some
scroll-based events earlier this week and they work just fine.

There is some good stuff mixed in here, but a lot of it is misleading, poorly
defined, or just flat-out wrong. The most accurate stuff is extremely common
sense, like "staging environment should mirror prod", "don't use == (JS)",
"don't use doc write (JS)", etc.

~~~
anatoli
No, he's definitely correct. See this Apple article for more information:
[http://developer.apple.com/library/iOS/#documentation/AppleA...](http://developer.apple.com/library/iOS/#documentation/AppleApplications/Reference/SafariWebContent/HandlingEvents/HandlingEvents.html)

------
mattacular
This article should be called: "Some web development tips from a former digg
developer for developing a site EXACTLY LIKE DIGG"

Because most of this stuff is not applicable to webdev in general...

~~~
headShrinker
I found many of the article's points valid for my applications. Not all tips
apply to any one particular situation, but I, like many, am at a point where
issues start popping up, and it's nice to see a general toolbox of tips to
pick from. Like any tips from anywhere: "your results may vary. Consult a
professional adviser before acting on any advice."

------
rhizome
Call me snide, but I'm reading this title like, "E-commerce tips from a former
Pets.com marketer."

~~~
jat850
Seems needless, as though it is easy and obvious to discount all technical
knowledge because of an association with a once-popular, now-declined site
that failed for reasons precious few to do with their technology or ability as
developers.

~~~
rhizome
I'm not discounting his knowledge, since Pets.com marketers presumably still
have valid e-com skills.

