Hackernews seems to generate a token for the next page that has a limited cache/lifetime.
Is this a common design pattern? Where can I find more information?
It may be overkill to use this approach if you just want to generate the frontpage, but the advantage is that it's very general. You're not limited to displaying a range of items stored in a list somewhere; you could be displaying things you're computing on the fly.
Is the stored closure working on a snapshot of potential articles, or does it use a stateless algorithm that can generate the next n articles?
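For anyone curious what "storing a closure per page link" might look like, here's a minimal sketch in Python. It's my own guess at the mechanism, not HN's actual code (HN is written in Arc): the token names, the TTL value, and the in-memory store are all assumptions. This particular version captures a snapshot of the item list, answering one half of the snapshot-vs-stateless question above.

```python
import secrets
import time

# Hypothetical token store: token -> (expiry timestamp, closure).
# A real server would also evict expired entries periodically.
_store = {}
TOKEN_TTL = 600  # seconds; HN's real link lifetime isn't documented here

def _register(closure):
    token = secrets.token_urlsafe(16)
    _store[token] = (time.time() + TOKEN_TTL, closure)
    return token

def page(items, start=0, size=3):
    """Return one page of items plus a token for the next page (or None)."""
    chunk = items[start:start + size]
    if start + size < len(items):
        # The closure captures `items` (a snapshot) and the next offset,
        # but it could just as well capture a generator computing items
        # on the fly -- that's the generality mentioned upthread.
        nxt = lambda: page(items, start + size, size)
        return chunk, _register(nxt)
    return chunk, None

def next_page(token):
    entry = _store.pop(token, None)
    if entry is None or entry[0] < time.time():
        raise KeyError("link expired")  # the dreaded dead "More" link
    return entry[1]()
```

Because the closure holds a reference to `items`, every page comes from the same snapshot, at the cost of server memory per outstanding link. A stateless alternative would encode the query and offset in the token itself, trading consistency for cheap tokens that never "expire".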
People often complain about traditional offset/limit pagination, but this is the most interesting alternative I've heard.
I haven't heard any. What kinds of complaints have you heard?
I appreciate the generality and elegance of the closure approach, but the links expire far too quickly for my taste. About once a week I read a page slowly enough that the "next" link is dead by the time I click it.
You cache Page 1 with some set of objects, but by the time you fetch Page 2, the underlying set has changed, so you get Page 2 of a different overall list. It's inconsistent and wonky.
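The drift is easy to demonstrate. A toy sketch (all names hypothetical): a new item lands between page requests, so offset-based Page 2 repeats an item from Page 1, while keyset (cursor) pagination, which resumes strictly after the last seen id, does not.

```python
def offset_page(items, offset, size=2):
    # Classic OFFSET/LIMIT: the window is positional, so insertions
    # above it shift everything down.
    return items[offset:offset + size]

def keyset_page(items, after_id=None, size=2):
    # Keyset pagination: resume after the last id the client saw,
    # so insertions above the cursor can't cause repeats.
    ids = [it["id"] for it in items]
    start = ids.index(after_id) + 1 if after_id is not None else 0
    return items[start:start + size]

feed = [{"id": 4}, {"id": 3}, {"id": 2}, {"id": 1}]  # newest first

off1 = offset_page(feed, 0)   # ids 4, 3
key1 = keyset_page(feed)      # ids 4, 3

feed.insert(0, {"id": 5})     # a new story arrives between requests

off2 = offset_page(feed, 2)                         # ids 3, 2 -- 3 repeats
key2 = keyset_page(feed, after_id=key1[-1]["id"])   # ids 2, 1 -- no repeat
```

Keyset pagination only needs a stable, unique sort key, which is why it's the usual answer to this complaint; the closure approach generalizes further, to result sets with no such key at all.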
I was hoping there was a more clever way to tease out an elegant solution here.
For pagination, this is a very minor inconvenience. However, for topic threads such as this one, expiring links are irritating.
When you removed or upped the limit on how far back one can look (a year or a bit more ago?), that change was made to news/ and newest/ but not to classic/, which still cuts off at 210 items. Any chance of also lifting or upping the limit on classic/? Once in a while, I like to dig back a bit through it.
(If not, hope you don't mind my asking.)