
Introducing awfulness.js - michaelfairley
http://blog.tommorris.org/post/21073443312/introducing-awfulness-js
======
orthecreedence
I think pushing the front-end to the browser is a wonderful thing. Now that
browsers have the power to run an entire application inside of them, it makes
sense to distribute that part and worry about scaling your API. Your front-end
will never achieve better concurrency than running 100% on the client :).

As for the features themselves, I agree that slapping things together is a
usability nightmare. This is also new/unexplored territory for the most part
though, and it will take time for usability standards to catch up. Right now,
most people on the bandwagon are just copying facebook/twitter and not
thinking about how this affects their users.

Let's face it, the web is moving in this direction, and while it's great to
sarcastically poke fun at people who are blindly following the masses (I'm all
for this), constructive criticism would be a bit more helpful.

Yes, infinite scroll sucks, but non web nerds seem to really like it. I'd say
implement it for feeds and other time-based data, but not for listing
images/products/etc...or better yet, find a way to implement it via pushState.
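The pushState variant suggested here might look something like this sketch.
The feed URL scheme, loadFeedPage, and the scroll threshold are all
assumptions, not any particular site's API:

```javascript
// Sketch: infinite scroll for a time-based feed that records progress
// with pushState, so back/forward and reloads keep working.

// Pure helper: the URL that represents "scrolled to page n".
function feedUrlForPage(page) {
  return page <= 1 ? "/feed" : "/feed?page=" + page;
}

// Browser wiring (guarded so the sketch also loads outside a browser):
if (typeof window !== "undefined") {
  let page = 1;
  window.addEventListener("scroll", function () {
    const nearBottom =
      window.innerHeight + window.scrollY >=
      document.body.offsetHeight - 200;
    if (nearBottom) {
      page += 1;
      // A real URL, not a hashbang, so the server can render the same
      // view directly on a fresh request to /feed?page=N.
      history.pushState({ page: page }, "", feedUrlForPage(page));
      // loadFeedPage(page) would append the next chunk here (assumed).
    }
  });
  // Going back should restore the earlier chunk of the feed.
  window.addEventListener("popstate", function (e) {
    page = (e.state && e.state.page) || 1;
    // loadFeedPage(page) would re-render that chunk (assumed).
  });
}
```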

Overall a funny post with great points. I don't think anyone is going to throw
away their JS app because of it. I'm currently writing an app that NEEDS to
use pushState because we have a music player that exists across page "loads."
Ok, we could pop open a new window like Myspace does, but talk about usability
annoyance...

~~~
prodigal_erik
The web isn't moving; instead, developers are moving back to siloed
client/server apps with private APIs while dismantling the web's semantic
hypertext resources at stable URLs. The result will only be called "the web"
for historical reasons, and it doesn't really matter how smoothly this
disaster unfolds.

------
gooderlooking
Web 1.0 was not as beautiful as you remember. Most websites had blinking text,
bordered tables, under-construction icons and pictures of cats. Around 1998
people started to realize there was more power than just serverside includes
and hypertext links to other URLs (they weren't URIs yet).

The problem was, people had to use creative thinking and a bit of duct tape to
make it all work. There were battles over valid code vs "whatever, it works",
and eventually everyone just sort of accepted that standards won't keep up
with innovation. That's about where we are today (browsers are getting better
at supporting draft-level innovations, which is a hell of an indication of
what they think of this stuff).

Page loads are wasteful, often reloading and redrawing 90% of the same content
just so you can see page 2, and the whole window flashes and spins until it's
done.

Anchors are used for compatibility, and when used correctly, they exploit the
very handy fact that the server doesn't need to care about what comes after
the "hashbang".
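That handy fact is easy to demonstrate: the browser strips everything from
the "#" onward before making the HTTP request, so only client-side script
ever sees it. A sketch (the "#!" hashbang convention is the old one these
sites use; the helper names are mine):

```javascript
// What only the client sees: the fragment, minus the hashbang "!".
// e.g. "http://example.com/app#!/photos/42" -> "/photos/42"
function clientRouteFromUrl(url) {
  const hashIndex = url.indexOf("#");
  if (hashIndex === -1) return null;
  let fragment = url.slice(hashIndex + 1);
  if (fragment.startsWith("!")) fragment = fragment.slice(1);
  return fragment || null;
}

// What the server actually receives: only the path before "#".
function serverPathFromUrl(url) {
  const hashIndex = url.indexOf("#");
  const withoutHash = hashIndex === -1 ? url : url.slice(0, hashIndex);
  return withoutHash.replace(/^https?:\/\/[^\/]+/, "") || "/";
}
```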

Yes, bad error handling is bad. But, do you remember spending 20 min filling
out a form only to have the next page after submit show an error, "sorry, the
server could not handle your request", and hitting the back button showed a
nice blank form? That's why we do it differently today.

Pagination is a terrible UI. "<< < 1 2 3 ... 7,600 > >>" is not only useless
_unless_ you want the first or last page, it also takes multiple reloads and
redraws to get anywhere in between (try getting to page 3,475, or even trying
to comprehend what is on that page by the time you get there).

Client-heavy apps are actually embracing the beauty of hypertext; they _don't_
require you to download and install new mobile apps, because they work in the
browser you already have.

I think you're romanticizing the simple web and damning web applications based
on the assumption that they are all poorly developed. The relevant argument I
see is that web application developers should pay more attention to the non-
developer, end user experience.

I think we're making good progress with "awfulness.js" sites.

------
wonnage
While it might feel great to shit on the state of client-side JS error
handling, doing so is pretty silly. Most well-written sites give some sort of
indication when your connection flakes out. Gmail is a good example. This
isn't something you can fix just by slapping together a global spinner.

~~~
JoshTriplett
True, but every site has to actually think of that case and reimplement the
logic themselves, and they'll almost certainly do a worse job of it than the
browser.

~~~
ebiester
Oh, the "leave it to the browser" crowd made their share of mistakes. Does
anyone remember LiveJournal? Go to the next 20 posts, then back just to make
sure you didn't have one or two lost? If I had a nickel for every poorly-
implemented paging technique I've seen, I'd have enough to pay someone to
write an open source infinite scroll library.

Oh, and how many people remember writing responses in a text editor and
copying it over to the browser window, so that the back button didn't eat your
entire post on a forum when something went wonky with the connection? I don't
remember Twitter or Facebook ever losing a submission. Maybe I don't use them
enough, but asynchronous submit is a great step forward, and I prefer infinite
scroll to many of the paging implementations I have lived with over the years.
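The asynchronous-submit pattern described here boils down to: keep the draft
in the page, and only clear it once the server confirms. A minimal sketch
(submitComment, the postFn parameter, and the retry handling are hypothetical):

```javascript
// Sketch: async submit that never loses the user's text. On failure
// the draft comes back to the caller, which keeps it in the textarea
// and offers a retry instead of navigating to an error page.
async function submitComment(text, postFn) {
  try {
    // postFn would be something like:
    //   (t) => fetch("/comments", { method: "POST", body: t })
    await postFn(text);
    return { ok: true, draft: null };   // confirmed: safe to clear the form
  } catch (err) {
    return { ok: false, draft: text };  // flaky network: draft survives
  }
}
```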

~~~
tommorris
It's not so much about which is better: it's about the fact that we need
consistency. At least with "leave it to the browser", there is some
consistency and users can learn to live with that consistency (e.g. by writing
longer posts in a text editor and copy-pasting).

When everyone does it ever so slightly differently, you are lulled into a
false sense of security.

I don't mind software sucking, so long as it sucks in a generally consistent
way.

------
andybak
Can I just add that, on the whole, the techniques that currently get labelled
'responsive' are fairly non-awful and only obliquely related to the awfulness
described herein.

------
drivebyacct2
Infinite scroll needs to die in a fire. The devs that write infinite scrolling
that also lacks jumping ability, page linking or "skip to end", deserve to be
set ablaze in the same fire. What a nightmare. I feel like even rudimentary
testing with insufficient dummy data would reveal massive UX regressions.

And why oh why are people still writing Javascript that uses hashbangs?
Educate yourself, use pushState.

(And the new Google Groups is arguably the worst of Google's redesigns. It
takes a measurable 2-3 seconds when hitting the [<-] arrow to go back from a
thread to the group, not to mention it's among the worst examples of both of
these issues.)

~~~
wavephorm
No, infinite scrolling can work. The problem is Facebook and other sites
aren't doing it right. As you scroll the URL should update with corresponding
hash tags for how far down the page you went (e.g. /#20 /#40 /#50). Then if you
go forward and back the correct block can be loaded. Facebook is the best
example of a poorly implemented infinite scroll.
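A sketch of that scheme: mirror the current scroll offset into the fragment,
then restore the right block on back/forward. The block size of 20 and the
estimateFirstVisibleIndex/loadBlock helpers are assumptions:

```javascript
const BLOCK_SIZE = 20;

// Pure helpers, so the offset <-> fragment mapping is easy to test:
function fragmentForIndex(firstVisibleIndex) {
  return "#" + (Math.floor(firstVisibleIndex / BLOCK_SIZE) * BLOCK_SIZE);
}
function offsetFromFragment(fragment) {
  const n = parseInt(fragment.replace(/^#/, ""), 10);
  return Number.isFinite(n) && n >= 0 ? n : 0;
}

// Browser wiring (guarded so the sketch also loads outside a browser):
if (typeof window !== "undefined") {
  window.addEventListener("scroll", function () {
    const index = estimateFirstVisibleIndex(); // assumed helper
    // replaceState, not pushState: scrolling shouldn't flood history.
    history.replaceState(null, "", fragmentForIndex(index));
  });
  window.addEventListener("popstate", function () {
    const offset = offsetFromFragment(location.hash);
    // loadBlock(offset) would fetch and render that block (assumed).
  });
}
```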

~~~
callahad
I still don't think that's quite right, since you're presumably numbering the
newest item as "0" and then counting up from there. Which means after 20 new
posts come in, my link to /#20 no longer takes me where I want it to.

Similarly, with history.pushState and history.replaceState, why use hashtags
instead of actual URLs?
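One way around the shifting-offset problem is to anchor on a stable item id
(a cursor) rather than a count from the newest post, and to use a real URL
with history.pushState. A sketch under those assumptions (the ?before=
parameter is hypothetical):

```javascript
// A real URL naming a specific item, not a position, so twenty new
// posts arriving doesn't move the link target.
function urlForAnchorItem(itemId) {
  return "/feed?before=" + encodeURIComponent(itemId);
}

// Cursor-based paging: start from the named item, not an offset.
function itemsAfterCursor(items, cursorId, pageSize) {
  const start = items.findIndex(it => it.id === cursorId);
  if (start === -1) return items.slice(0, pageSize); // cursor gone: fall back
  return items.slice(start, start + pageSize);
}
```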

~~~
mudetroit
I have been thinking about what the URL space for infinite scrolling should
look like, and I think that if you have meaningful titles you should probably
use those as your indicators instead of numbers. Still, having paging links
makes sense for jumping between groups of posts.

As for the hashtag, it is a worthwhile thought, and you could probably go
either way on that one. If you think of the original purpose of the hashtag,
it was to jump to a specific anchor on the page.
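A title-based indicator boils down to a slug function. A minimal sketch
(the exact slug rules and the /posts/ URL scheme are assumptions):

```javascript
// Turn a post title into a URL-safe slug.
function slugForTitle(title) {
  return title
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-")  // collapse punctuation/spaces to "-"
    .replace(/^-+|-+$/g, "");     // trim leading/trailing dashes
}

// A post's stable address, usable as a jump target between groups of
// posts instead of an opaque page number.
function urlForPost(title) {
  return "/posts/" + slugForTitle(title);
}
```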

------
Tangaroa
On the subject of cryptic loading signals and not being clear whether a page
is communicating with the server or not, most spinners are wrong.
<http://tangaroa.dreamwidth.org/11114.html>

