

Why does HN timeout so fast? - kaushalc

If I don't use HN for a few minutes, it expires, and clicking the next link shows an error message saying the link expired. Why is this? It's a pain that I need to go reload the main page and then click "next" to get back to the page I was on.
======
pokoleo
I believe that there was a post about this before.

IIRC, the way that HN is set up, it hands out fresh "fnid=NSgN9i46gR"
parameters at the end of the URL for the first/second/third/etc. pages.

Reddit (a similar example) does this too, but in such a way that the
parameters "?count=50&after=t3_ihoin" map directly onto a SQL (or alternative
DB) select statement.

This would probably be a better way for HN to do it, but they probably have a
reason why it's currently done this way.
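Nobody outside HN knows the actual implementation, but a minimal sketch of how an fnid-style continuation store might work (the class name, TTL, and state shape are all made up) shows why the links die:

```python
import secrets
import time

class FnidStore:
    """Hypothetical server-side store: each "next" link carries a random
    token mapping to continuation state, and tokens expire after a TTL."""

    def __init__(self, ttl_seconds=120):
        self.ttl = ttl_seconds
        self.store = {}  # token -> (created_at, continuation_state)

    def issue(self, state):
        token = secrets.token_urlsafe(8)  # short random id, e.g. "NSgN9i46gR"
        self.store[token] = (time.time(), state)
        return token

    def redeem(self, token):
        entry = self.store.get(token)
        if entry is None or time.time() - entry[0] > self.ttl:
            self.store.pop(token, None)  # forget stale tokens
            raise KeyError("Unknown or expired link.")
        return entry[1]

store = FnidStore(ttl_seconds=120)
fnid = store.issue({"page": 2})
assert store.redeem(fnid) == {"page": 2}
```

The point is that the link is only as durable as the server-side entry: once the token is dropped, the URL is meaningless, no matter how valid it looks.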

------
wkearney99
Could you at least change the error page to include a link back to the main
'new' page?

------
jones1618
It is brain-damaged that both Reddit and HackerNews do this.

Their next-page URLs should either be page-based ("show me page 4, even if
page 4 has changed") or anchored on the next article ("show me the page that
contains article 1234"). You might see a few duplicates or miss a few
articles, but you wouldn't get "unknown or expired link" errors.
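A minimal sketch of the article-anchored scheme (item ids and page size are made up): the "next" link encodes only the last id seen, so it never expires, even when the front page shifts underneath it.

```python
# Newest-first list of hypothetical item ids; new items are inserted at the front.
items = list(range(100, 0, -1))

def page_after(last_id, per_page=5):
    """Stateless pagination: return the next per_page items ranked below last_id."""
    return [i for i in items if i < last_id][:per_page]

first = items[:5]                # [100, 99, 98, 97, 96]
second = page_after(first[-1])   # [95, 94, 93, 92, 91]

items.insert(0, 101)             # front page changes...
assert page_after(96) == [95, 94, 93, 92, 91]  # ...but the old link still works
```

As the comment notes, if items move between pages you might see a duplicate or skip one, but the worst case is a slightly stale page, never an error.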

------
Rust
It does seem to vary a lot, between about 2 minutes and 5 minutes of
inactivity. As if I can read, or even quickly skim, 30 articles in 2 minutes...

~~~
Coincoin
My guess is that it depends on the amount of traffic.

Edit: I remember a few years ago it would take 30 minutes to an hour before it
expired. Maybe it deletes the oldest tokens first.
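That guess would look something like a fixed-capacity token table that evicts oldest-first, so under heavy traffic every link's lifetime shrinks (this is purely speculative; the class and capacity are invented for illustration):

```python
from collections import OrderedDict
import secrets

class BoundedTokens:
    """Hypothetical fixed-size token table: when full, the oldest token is
    dropped, so more traffic means shorter-lived links."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.tokens = OrderedDict()  # insertion order doubles as age

    def issue(self, state):
        if len(self.tokens) >= self.capacity:
            self.tokens.popitem(last=False)  # evict the oldest token first
        token = secrets.token_urlsafe(8)
        self.tokens[token] = state
        return token

table = BoundedTokens(capacity=2)
a = table.issue("page 2")
b = table.issue("page 3")
c = table.issue("page 4")  # table is full, so token `a` gets evicted
assert a not in table.tokens and b in table.tokens and c in table.tokens
```

With a fixed capacity, expiry time isn't a constant at all; it's however long it takes the site's traffic to issue `capacity` newer tokens, which would explain the 2-to-30-minute variation people report.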

------
pasbesoin
A while back, when the site was rather bogging down, pg made some changes to
memory management (as I vaguely recall from one of his comments posted at the
time). I believe part of this is more aggressive discarding of memory, and
that part of the result was to succeed, or fail, more quickly without
progressively tying up the server.

So, overall the server works better. When there's load, part of this includes
bailing sooner on your troublesome request. ;-)

In your case, this may include no longer having memory/context for stale
pagination links. I deal with this by using a browser extension to load
several pages of links at once. (Without a browser extension, one can achieve
the same results by opening those links manually in new tabs and then working
one's way through those tabs at one's leisure. I'd advise not overdoing this,
though; no need to burden the server with requests you will seldom/never use.)

(I may be wrong as to the memory management. Again, this is just my vague
understanding/recollection.)

~~~
zoudini
What browser extension do you use?

~~~
pasbesoin
Sorry to reply twice, but my previous reply was made quickly and I recalled
the wrong context. (I had also recently mentioned using an extension to view
PDFs in Google's Document Viewer.)

The extension I was referring to in my grandparent comment is:

<https://addons.mozilla.org/en-US/firefox/addon/repagination/>

(Note that there is another, older "Re-Pagination" extension that was
abandoned. IIRC, this one picked it up and updated it.)

PLEASE BE CAREFUL if you use this extension. If you do not choose a limited
page count, it is easy to unduly load or overload a server. This includes the
HN server, and such request behavior will get you cut off or banned (rightly
so; please work to preserve HN resources).

I use it infrequently, a time or two a day, to load the first few top pages of
results, which I then work through at my leisure.

I also use a bookmarklet to transform all the link href values to open in new
pages/tabs. That way, I don't lose the browser-generated multi-page page (as
it were) by forgetting to right-click or shift-click to open one of its links
in a new tab.

------
MattJ100
Glad you mentioned it, I was beginning to think it was just me.

------
shii
<http://news.ycombinator.com/item?id=17675>

<http://news.ycombinator.com/item?id=2677600>

------
cpayne624
Yeah. It's pretty whack.

------
donnaware
It seems ironic that the same people who sit around obsessing over which
groovy new language you're supposed to be using to build your web site cannot
seem to make a very good web site for themselves. But then, I am addicted to
Hacker News, so what does that say about me?

~~~
theoa
And me too...

