
Why does HN generate unique URLs for the 'more' pages? - andrewfelix
I'm not sure if this question has been asked but it doesn't make a whole lot of sense to me. At least a dozen times I've hit the 'more' button to find the URL expired. Wouldn't it make more sense to have a page 2, page 3 etc?
======
DrJokepu
It's called a continuation. Continuations are essentially closures being
passed around to represent state (hence "fnid"), check the Wikipedia page on
them for a proper, computer-sciencey definition. The nature of Lisp really
lends itself to continuation-based programming. Sometimes (like on HN) it's
used in web applications. In the context of web applications, it's a different
approach than, for example, REST/MVC. It has a few nice properties, such as
easy state management. This can be very useful when a web application is an
interface to a process (e.g. checkout in a web store) as opposed to CRUD. On
HN, it allows paging that "remembers" the order of things at the time you
visited the first page. Reddit uses a similar mechanism. The obvious downside
is that you have to keep track of all those states and if you lose them (for
example because they expire), the user workflow is interrupted (hence the
error message).

It's not the trendiest programming technique around these days but it's a
useful tool in the toolkit of a good developer.
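The mechanism described above can be sketched in a few lines. This is a toy model in Python, not Arc or HN's actual code; the names (`register`, `render_page`, `follow_more`) are invented for illustration:

```python
import secrets

# Server-side table of continuations: fnid -> closure.
# A real server would evict old entries, which is why links "expire".
continuations = {}

def register(closure):
    """Store a closure, returning a unique id ('fnid') to embed in the link."""
    fnid = secrets.token_hex(8)
    continuations[fnid] = closure
    return fnid

def render_page(stories, offset=0, per_page=3):
    """Return one page of stories plus the fnid of a 'more' continuation
    that captures the ranking as it stands right now."""
    page = stories[offset:offset + per_page]
    more = lambda: render_page(stories, offset + per_page, per_page)
    return page, register(more)

def follow_more(fnid):
    """Invoke the stored continuation, or report the familiar HN error."""
    if fnid not in continuations:
        return None, "Unknown or expired link."
    return continuations[fnid]()

ranked = [f"story-{i}" for i in range(1, 8)]
page1, fnid1 = render_page(ranked)     # page 1, plus a 'more' link id
page2, fnid2 = follow_more(fnid1)      # page 2, in the original ordering

continuations.clear()                  # simulate expiry under load
page3, err = follow_more(fnid2)        # the error users keep hitting
```

Because the closure captures `stories` at render time, every page of one browsing session sees the same ordering, which is the "nice property" being traded against expiry.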

~~~
a_a_r_o_n
"The nature of Lisp really lends itself to continuation-based programming."

This comes up occasionally on HN. I've _never_ seen anyone say anything
positive about it.

It seems like a software detail that's unnecessarily exposed to users. It
doesn't make the site better, it makes it worse.

~~~
funkah
I completely agree, but I wouldn't mind it nearly so much if the "expired
link" page at the very least had a link to the home page.

~~~
pjob
I added a suggestion to that effect on the Feature Request thread
(<http://news.ycombinator.com/item?id=363>) a while ago, but it slipped off
into the long tail of comments pretty quickly:

<http://news.ycombinator.com/item?id=1480352>

------
mukyu
dchest answered more of the 'what' than the 'why'.

The answer to the 'why' is that news.arc (what this site runs) is the pet
example of Arc, pg's Lisp dialect, which is focused on making Lisp web apps
with continuation-passing style.

As such, the technical purity of the idea is more important than UX; anything
else would mean admitting that trying to implement a pretty simple web app by
passing around continuations like this is a fool's errand.

(Actually, Arc and these ideas work great for rapid prototyping, but they just
do not work well at anything beyond the smallest of scales.)

------
SeanLuke
Aigh, you beat me to it.

pg, _please_ fix this. It drives me _batty_ that after a page is open for a
little while _the "more" link breaks_. That should never, ever, ever happen.

------
comex
You can kill me, but I find it somewhat ironic that the functionality
championed in the Arc Challenge is directly responsible for the most annoying
bug on Hacker News. :)

------
satori99
This bugs me endlessly when browsing HN.

I like to open and read interesting links in new tabs, and invariably the
continuations have expired when I return to HN after reading another tab. I
see the expiry error many times every day. It is a terrible user experience,
even if it's an interesting technical 'solution' to paging.

------
dchest
See this patent: <http://www.freepatentsonline.com/6205469.html>

Also:
[http://www.hnsearch.com/search#request/all&q=expired+lin...](http://www.hnsearch.com/search#request/all&q=expired+link)

~~~
aaronpk
or, for a slightly less advertising-ridden interface,
<http://www.google.com/patents/US6205469>

~~~
daleharvey
irony defined

------
joshuahong100
I'm not sure how the patent relates to the question. The UI issue is that a
user navigates to the 2nd or 3rd page, goes and reads some articles or
whatnot, and then comes back to find that he or she cannot move on to the 4th
page.

It's frustrating.

~~~
dchest
The Arc server stores closures server-side (see the patent for a similar
technique). Links include the unique ids of these closures, and the closures
expire. This has been explained numerous times here on HN; see the search link
in my other comment.
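A rough model of why those links die, assuming a bounded server-side table: registering a new closure eventually evicts the oldest one. The `ClosureTable` class and all of its names are hypothetical, not Arc's actual implementation:

```python
from collections import OrderedDict

class ClosureTable:
    """A fixed-capacity table of server-side closures. When it fills up,
    the oldest closure is evicted and its link stops working."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.table = OrderedDict()
        self.counter = 0

    def register(self, closure):
        self.counter += 1
        fnid = f"fnid-{self.counter}"
        self.table[fnid] = closure
        if len(self.table) > self.capacity:
            self.table.popitem(last=False)   # evict the oldest closure
        return fnid

    def call(self, fnid):
        if fnid not in self.table:
            return "Unknown or expired link."
        return self.table[fnid]()

table = ClosureTable(capacity=2)
a = table.register(lambda: "page 2 for user A")
b = table.register(lambda: "page 2 for user B")
c = table.register(lambda: "page 2 for user C")  # evicts A's closure

expired = table.call(a)   # A's link is now dead
alive = table.call(c)
```

The busier the site, the faster the table churns, which matches the observation elsewhere in this thread that expiry gets worse as traffic grows.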

~~~
unconed
The part that confuses me is that 'this has been discussed before' is
considered a valid answer, as opposed to, say, 'yes, it's horribly broken and
no one is willing to fix it'. The site routinely fails to return results for
links it generated just minutes before. How that is not considered a bug, on a
site that represents forward thinking web development, mystifies me.

~~~
jadc
Here's pg's explanation (from <http://news.ycombinator.com/item?id=3098756>):
It's not so much that it's ahead of its time relative to hardware as it is
something you do in the early versions of a program. Using closures to store
state on the server is a rapid prototyping technique, like using lists as data
structures. It's elegant but inefficient. In the initial version of HN I used
closures for practically all links. As traffic has increased over the years,
I've gradually replaced them with hard-coded urls. Lately traffic has grown
rapidly (it usually does in the fall) and I've been working on other things
(mostly banning crawlers that don't respect robots.txt), so the rate of
expired links has become more conspicuous. I'll add a few more hard-coded urls
and that will get it down again.

------
scriby
User experience > cute technical solution

~~~
functionform
Continuations were all the rage a few years ago, after yet another doomed
resurgence in web lisp. I agree, it is completely awful, and I'd rather have
semi-disorganized link order.

------
nwatson
I've read the bits about implementation based on continuations in various
other comments.

Given that:

\- the ranking of HN stories likely is in constant flux ...

\- and that some users might want a consistent view of stories across multiple
pages ...

\- and that having .../page1-, .../page2-, .../page3-style URLs might result
in some user seeing the same story presented twice in their browsing, once on
page1 and then again on page2 as its rank changes ... or worse, missing a
story that got promoted to page1 from page2 during a session

... do the continuations as implemented on HN present a static view of the
ranked stories with a non-changing ordering (as long as links don't expire)?
To do this with straight URLs one would need to present different URL links
at different times to different users to get different page-2 contents
reflecting what was on page2 at the time they started their HN visit. If so,
that would be a feature.

~~~
kappaknight
I don't mind seeing the same stories pop up in the feed on the next page as
long as the link works. Let's be real here, these are links to blog posts, and
new projects. It's not like someone's going to get killed if we see a repeated
headline.

------
waterhouse
Btw, for all those who are talking about continuations:

[http://www.hnsearch.com/search#request/all&q=pg+continua...](http://www.hnsearch.com/search#request/all&q=pg+continuation+closure)

~~~
andrewfelix
I didn't even know there was an hnsearch. Thank you!

------
Sander_Marechal
Sounds to me like this could be fixed quite easily. Why not pass both a
continuation ID and a page number on the pagination links? If the continuation
has expired, simply use the page number to generate a new page?

I agree with other commenters that the current way feels very broken. I get to
see the "expired page" dozens of times a day because I always leave my HN tab
open and follow stories in a new tab.
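A minimal sketch of that fallback, with illustrative names rather than anything from news.arc: the link carries both an fnid and a page number, and the page number is used only when the closure is gone:

```python
# Server-side continuation table: fnid -> closure.
continuations = {}

def fetch_page(stories, page_num, per_page=3):
    """Plain positional paging over whatever the ranking is right now."""
    start = (page_num - 1) * per_page
    return stories[start:start + per_page]

def follow_link(fnid, page_num, current_ranking):
    """Prefer the stored continuation (stable ordering); if it has
    expired, fall back to paging the current ranking instead."""
    closure = continuations.get(fnid)
    if closure is not None:
        return closure()
    return fetch_page(current_ranking, page_num)

# The continuation for page 2 captured the ranking as it once was...
old_ranking = ["a", "b", "c", "d", "e", "f"]
continuations["fnid-1"] = lambda: fetch_page(old_ranking, 2)

# ...but the ranking has since changed.
new_ranking = ["f", "e", "d", "c", "b", "a"]

stable = follow_link("fnid-1", 2, new_ranking)    # old ordering preserved
del continuations["fnid-1"]                       # link expires
fallback = follow_link("fnid-1", 2, new_ranking)  # current ordering instead
```

The trade-off is that the fallback page reflects the current ranking, so a story may repeat or be skipped; several commenters here say they'd happily accept that over a dead link.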

------
SudarshanP
I have a suggestion for a hack that would make life a little more pleasant for
those who do not mind running Javascript while not affecting those who want to
use the site as is. The script can be implemented by the community and hosted
on Git. PG can just include the script in the arc code with just a "one line
change".

The JS will just fetch the initial n pages and allow easy navigation using the
more button. It will be visually very similar to what we have now. If the
community cares, it can become much richer in functionality. If PG refuses, at
least there are enough HN users annoyed by this bug that we can create a
Chrome plugin for HN that fixes this. I can probably write something like this
myself, but the community should probably first formulate exactly what the
plugin should and should not do. Does such a plugin already exist? Is it a bad
idea to solve it this way? Would the additional burden on the servers be too
high to be worth it?

~~~
eneveu
As I understand it, pagination is there for performance reasons: the server
does not compute the rest of the page until needed. Adding a script that pre-
fetches all pages would defeat this.

------
quizotic
Patent, shmatent. What currently exists SUCKS. How many users have to complain
before someone gets a glimmer of a clue that MAYBE they should try ANYTHING
ELSE.

------
drivebyacct2
Not only that, but it seems to be paginating comments at an absurdly low
count; I went back to reference an old thread and was 5 pages in before I got
tired of looking for the right comment. On top of that, I've been bitten by
"expired" links three times today alone, once while I was trying to submit a
comment.

------
ranit8
Off-topic: is HN under heavy load recently? Sometimes I see recent pages cut
off at around 50 comments, with the _more_ links. Normally this would only
happen at older pages where you can no longer comment.

