

Unknown or expired link. - Andi

Could you solve the "Unknown or expired link." problem on hackernews?
======
redidas
I absolutely cannot stand running into that message. By the time I'm finished
browsing the posts and comments on the first page and click the next page
link, I run into that error.

Then I click the back button, thinking that will do the trick, and click the
next page link. Same error, because I never actually got a refreshed page when
I went back.

So then I have to click back again, click refresh, then click the next page,
all to go to the next page.

I understand that it may have been intentional, but it really hurts usability.

I sort of find it contradictory that Hacker News itself feels a bit clunky,
but maybe that's the point. Launch when you have something and eventually
you'll know what is needed and what is just an annoyance.

------
primo44
Since news entries are constantly being juggled/reordered, I think it was a
choice that they made. Whenever I get that error I just start on page 1 again.

~~~
Hisoka
I don't get that. Neither Reddit nor Digg has this problem.

~~~
veb
It really is annoying, though.

------
dsl
I wonder if you could apply to ycombinator to build a better hackernews...

------
davesmylie
Yup. Super painful.

I put it down to a (perhaps unintentional) * anti-procrastination device on
HN's part.

E.g. I'll load up a page worth of links and go read them. If, by the time I've
finished reading and gone back to HN to click 'Next Page', I get 'Unknown or
expired link', that means I've probably spent far too long reading already and
should go do some actual productive work.

* This doesn't always work as intended =/

------
pygorex
This wouldn't be a problem if I could choose the number of articles per page -
30 articles per page isn't enough. I rarely go more than 5 pages deep on HN.
If HN were just a single page with 300 entries that I could refresh (or that
would auto-refresh via Comet/AJAX), I would be a happy camper.

------
instakill
It sucks when you have 20 pages to catch up on and you're forced to open all
the interesting-looking links in new tabs, get to the end before your time
runs out, and only then start reading the new tabs.

------
dchuk
Go back. Refresh the page. Click More.

I think the more important thing to worry about here is getting back to work,
not the pages on a social news aggregator expiring for being open too long.

------
silverbax88
Looks like a choice to keep scrapers from being able to deep dive into the
list of submissions, etc.

~~~
dsl
It's just a broken implementation of pagination.

A scraper would move on to the next page almost immediately, real humans take
a while to get to the next page.
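
For contrast, a minimal sketch of what stateless pagination looks like (this
is hypothetical illustration, not HN's actual code): the "More" link encodes
the page number in the URL itself rather than pointing at a server-side
continuation, so following it an hour later can never expire.

```python
# Hypothetical sketch of stateless pagination. Nothing is stored on the
# server between requests, so a "More" link never goes stale.

def more_link(page):
    """Build the next-page URL from a plain query parameter."""
    return "/news?p=%d" % (page + 1)

def page_items(items, page, per_page=30):
    """Return the slice of items shown on a given 1-indexed page."""
    start = (page - 1) * per_page
    return items[start:start + per_page]
```

The trade-off is the one primo44 mentions above: since the front page is
constantly reordered, a fixed offset can show duplicates or skip entries
between clicks, which is presumably why a snapshot-style continuation link
was used instead.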

~~~
silverbax88
Yeah, I considered that (and you are right), but if this is an attempt at
pagination, it's a 'WTF' head-scratcher as to why you'd make it so
complicated. I'm not saying I know, just giving the coders the benefit of the
doubt.

~~~
millzlane
In any event, an explanation would put me at ease. When I see that error, I
usually move on to the next aggregator on my list.

------
funkah
Or, at the very, very, _very_ least, could that terrible error page contain a
link to the front page?

