Ask PG: Is HN Expired Link Eventual Fix Planned?
96 points by baldajan on Jan 13, 2014 | 118 comments
I believe the expired link issue came from using a new Lisp feature, but I'm curious if there are any plans to fix it?

It's a little annoying since I constantly pick up tabs that are a day old and have to start from the beginning. Hopefully I'm not the only one on HN with that issue.

It's not a single thing. There are multiple places that use dynamically-generated links. They make the source very simple. I've gradually replaced them in the most common situations. Replacing more is not in my top 10 priorities for things to fix. The reason is that it's not really what users want. What people come here for is good stories and comments. I doubt we have ever lost a single user over expired links. Whereas stupid or uncivil comment threads have probably cost us lots of users, and indeed, some of the ones I'd least like to lose. So that's the sort of issue I spend time thinking about.

Incidentally, for those of you working on startups, this is a good example of why making something people want is not a simple matter of making what they say they want.

>>The reason is that it's not really what users want. What people come here for is good stories and comments. I doubt we have ever lost a single user over expired links.

Yes, you probably haven't lost any users over expired links.

However, I think you have inconvenienced a lot of users through the expired links. Sometimes, I think it is good to spend a little bit of effort on helping a large number of users.

I'm assuming among readers (who attempt to go past page-1), an overwhelming majority has hit this problem multiple times (and lost time because of this problem)

Amen. What a nuisance. You can keep clicking the next page button to see what other stories there are until you actually read one, after which the next page button will fail, and you'll have to start over again. What's the point of even having more than about three pages? The only way to reach those pages is to keep clicking next while carefully avoiding the temptation to read anything along the way, lest you get sent back to Level 1.

I'm not buying the argument that if HN has good stories and comments people will keep coming back and if they come back, it means they want a next page button that fails. A doesn't imply B.

If frequent expiration is to remain a feature, would it be possible to have a preference setting allowing us to put the first four pages or so on our front page? Then we could read whatever looks most interesting among the first 120 items instead of the first 30 before we get that expired next button we supposedly don't mind.

To avoid that I use this website: http://hckrnews.com/ Maybe you should take a look. Just simple and useful.

And to share everything: I also use a theme for HN. It makes the design a little flatter and uses bigger letters, which is exactly what I wanted. https://github.com/gabrielecirulli/hn-special

There is also http://hackerbra.in

but I always use vanilla HN because I'm fond of it and the UI is pretty much the best for me.

That said, it would be very cool if users could implement their own custom CSS natively like you can on http://yayhooray.net

I personally detest browser extensions and I don't like how they slap custom CSS over the top of sites like make-up. Even those who aren't up for fiddling with style sheets could share in what others make.

Thanks for sharing. As for extensions, I do like the good old design of HN, but I really prefer to have bigger letters. That's mainly why I use the other one.

Thanks for your suggestion, too, nazka.

I've been a user for 5+ years, for more hours a day than I would care to admit, and have never run into an expired-link problem that couldn't be resolved with the back button and copy+paste.

Honestly, it gives HN a bit of charm, and it's never ever really bothered me.

However, I think you have inconvenienced a lot of users...

'Inconvenienced' is a broad word. Plenty of people eventually get the hang of driving a motorcycle with the front brake in the "wrong" spot, and are no longer inconvenienced by that. But it is not truly dangerous or troublesome in the way it would be to put the control pedals of a car in the wrong place (e.g. swapping the throttle and brake pedal locations). So, in this case it's more the inconvenience of a learning curve; it's not the more dangerous or nefarious inconvenience of something that is truly troublesome, random, or wasteful.

> I think you have inconvenienced a lot of users through the expired links.

I suspect that fixing expired links would improve HN's front-page story quality. I seem to recall patio11 or tptacek once saying if you really want to help out HN, you should sit on the New page and upvote stories. That's too much of a pain for me, but if I could get to pages 2-4 without friction, I'd see more non-front-page stories I could upvote. This would help good stories make it to the front page or stay there longer. So perhaps making things easier for users would also help HN. It may even make it easier for content-rich stories to compete against link-baity ones. What if fixing expired links kept HN from becoming like Reddit?

Fixing the New page should be much easier than fixing the front page. On the New page you only have to remember the last index and the last submission shown, so https://news.ycombinator.com/newest?count=60&id=777777 should work, because the submissions don't change order. The front page is much more difficult, because you must remember all the submissions shown, since their order changes over time.
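The stateless scheme described above could look something like this. This is a hypothetical Python sketch, not HN's actual Arc code; the function and field names are made up. Because the newest-first list never reorders, the URL can carry the count offset and the id of the last item shown, so no server-side closure is needed:

```python
# Hypothetical sketch of a stateless New-page link: the URL carries
# count and last-id, and the next page is computed from them directly.

def next_new_page(submissions, count, last_id, page_size=30):
    """Return the next page after `last_id` in a fixed-order list."""
    try:
        start = next(i for i, s in enumerate(submissions)
                     if s["id"] == last_id) + 1
    except StopIteration:
        start = count  # fall back to the count offset if the id is gone
    return submissions[start:start + page_size]

# newest-first ids 100..1; the link said count=30 and id=70
subs = [{"id": n} for n in range(100, 0, -1)]
page = next_new_page(subs, count=30, last_id=70)
```

Such a link never expires: any request can be answered from the URL alone, no matter how old it is.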

Yes, but I'm not saying they should fix the New page. The New page has too many garbage submissions for me to want to spend time there. But if they fixed pages 2-4 behind the front page, I would visit those pages. They have a better signal-to-noise ratio than the New page, so I want to read them. And if those pages were getting upvotes, maybe their stories would make it to the front page.

I click more. I find the stories that I am interested in. I open the submission in a new tab and the comments in a new tab. I repeat this until I get to page 4.

I read an article. I upvote it if warranted. I read the comments page. I then refresh the page before making any comments. Once I've finished reading and voting and commenting, I close the pages and move to the next article.

I use a method similar to the one described by DanBC, but I open the comment page in a new tab and immediately open the submission from that tab, because it's easier to hit the back button than to remember where each story came from.

Fixing the New page is relatively easy, and it could free up more closures for the other pages, increasing the time the links live before expiring.

I also sometimes use the "filtered 2+ new page": http://hnapp.com/filter/d3a308f2ac9a071c0bf174e0c1a8fd22 , i.e. stories someone else already thought were interesting. It's not so painful, and it's also good for days when a single story saturates the front page.

The solution is ctrl+left-click (in Chrome) so articles open in a new tab in the background: queue up everything you want to read before reading the articles. Simple.

I'm impressed by closure-based pagination as a novel way to solve a problem.

Unfortunately, the user experience is quite unpleasant. I hit expired pagination links several times a day. You guys can do better.

I use http://hckrnews.com/ only to avoid this problem.

Same for me! As I said in another comment I use that [1]. Which makes the whole experience perfect for me. (Bigger letters, flat design, and stuff..)

[1] https://github.com/gabrielecirulli/hn-special

same here!

Is there value to maintaining this site yourself? I agree that you shouldn't spend the time to fix this issue yourself, but couldn't a contractor fix it for pocket change, at a low cognitive burden to you? Hasn't this issue been surfaced repeatedly by users, wasting the time of a good many people (including you)? Doesn't it hurt your reputation as a thoughtful and detail oriented organization?

I'm not the only person working on it anymore. For the past year or so we've also had Nick Sivo (kogir), who handles all the systems stuff. And we also have a moderator who is a hacker and has written a bunch of code for identifying voting rings, sockpuppets, and so forth.

I just want to put my hand up against the selection bias in this thread and say that I totally don't care about the broken links, and that rock-solid moderation is a far more important issue for any online community.

I would say that the site should actually be opened up to the whole community to add to it. Hackers may be able to help in ways never imagined before. Some examples:

1. Auto assign upvotes to new stories using machine learning algorithms that use older, already upvoted, stories for learning. This can fix one of the issues often encountered where good stories go ignored because too few people have time to screen new stories.

2. Instead of using ad hoc rules for ordering stories and comments, use machine learning engines again.

Some stories get flagged. These would also provide good data for supervised learning.

I understand some control would need to be exercised to prevent gaming of the system.

The reason is that it's not really what users want

Yeah, actually, it is. Seriously.

Nothing makes "hackers" look worse than a hack that doesn't work.

The true hacker solves the right problem. Expired links are just the visible problem. The real problems are more subtle: users voting up fluff posts, nastiness in comment threads...

If one clicks "next" repeatedly to go beyond the first page, there is effectively a timeout on how much time can be spent before the link expires and one has to start from the beginning.

I think this makes it more tempting to upvote something just after looking at the headline instead of after reading it, because one has to remember so much or use so many tabs or bear with going back to the front page and clicking next again.

I have no idea about everyone's behaviour, but maybe expired links in the end also contribute to something like users voting up fluff posts?

Do you really think you can't solve both problems, or that fixing links hurts your ability to solve bigger problems? It's $500 to a contractor. You can even have someone else find the contractor. This smells fishy.

I suspect the fix will result in a large change to the code base and you want to always own/understand the code for some reason. So if you can't fix it yourself, nobody is allowed to, either. That's fine--just admit it's your baby and you don't want to let it go.

I guess it's also because of the technical elegance of using closures for page actions. It's hard to admit that a technique that feels elegant and expressive fundamentally doesn't scale.

That's part of being a mature developer. We've all stood stubbornly by a piece of inappropriate code that we want to use because it's so clean and elegant, but most of us have gotten past that stage of ego development by the second or third year of school or professional life. It goes without saying that Paul Graham has long since transcended that level.

So, the only answer that fits all of the parameters is that he's trolling us.

Or that it's not his priority - that's the stated reason.

Not buying it. Priority or not, who leaves a forehead-slapper like that in their code for several years?

You're not far off to assume it hijacks your thinking process and steers it to the unimportant.

The change to the code base would be easy. You can increase the expiration timeout or keep around more closures. It would come at the expense of using more computer hardware, which is an elegant technique of scaling: you free your human hardware to work on more important things.

When a fluff post gets posted some users flag it and some users upvote it.

Is there any way to penalise the people upvoting a fluff link? Mods could have a 'fluff link' flag, and using it would apply some karma loss to everyone who upvoted the post and put a notice on their user page. (This is an ugly kludge. It's meant as a thought experiment rather than a serious implementation idea.)

I have made some comments on HN which should have been heavily downvoted but which got no downvotes or even upvotes. I have no idea how this can be fixed.

Some people on HN have an aversion to downvoting and flagging and will only use these tools for the most egregious examples of bad posts. It would be good if people accepted getting and giving downvotes as a useful tool in running the site.

There's another problem with people responding to posts that are effectively trolling rather than ignoring or downvoting them. I know there are few moderators, with very little time. More mods are not the answer, but perhaps 'big downvotes' for those mods would help. Applying a -5 or -10 downvote to people posting aggressive, dumb comments would help set the tone.

Perhaps having links to all the guidelines (including the links provided to new users) under the submission box would be useful?

My main problem with the "More" button is that it doesn't always give you more. I suggest a text change on the anchor tag to "More, if you're quick about it", and the error message to "You weren't quick enough."

This way I can only blame myself when pressing the button doesn't work as expected.

There is so much more educational value in this conversation than just people saying "serve your users but question what they say they want", as it is right in the action - with all its nuances.

Great lesson on prioritization!

Even if the site had blinking text with a glitter background, the regulars would still come. Those very regulars click on the pagination and get inconvenienced frequently (also typing a comment and losing it). Since this seems like a highly visible problem, and probably the only bug I encounter, I'm a little surprised that it is not in the top 10.

Count me in as this being the most annoying issue for me. (Maybe that means you're doing a lot of things right, but it's still plenty irritating.) So much so that I have been using sites which mirror or display HN submissions because they don't have this problem.

Here's a simple thing that has helped me avoid the issue (which otherwise is quite nagging):

Starting from the HN main page, I first open the discussion pages of all the stories I potentially want to read in new tabs (till I start encountering older stories that I have already gone through). This process is fast enough to not encounter expired links.

For each story now, I click the link leading to the story from the HN discussion page, and get back to the discussion page simply using the back button. When done, I simply close the tab.

While I agree with the overall sentiment, it is like optimizing a program. You profile the program and see that foo (let's relate this to the expired link problem) takes 5% of the CPU and bar (the upvoting problem) takes 50%. Naturally, you'd want to optimize bar. However, if it takes 50 hours to optimize bar and 1 hour to optimize foo, you're better off going with foo. This too is said with the caveat that 5% better is worthwhile: nobody cares if a 1 second operation is shortened to 950 milliseconds, whereas shortening it to .5 seconds is more noteworthy.

Ironically, the only time I really run into this is when reading through bestcomments.

I'm often more interested in interesting discussions rather than the original links and bestcomments is a good way to jump right into these discussions.

It'd be great if bestcomments could be migrated away from the fnid mechanism, but I appreciate this is a side project and is low priority.

What considerations have been taken into account to dissuade making HN open source? (I know there are some copies of older versions of the source on GitHub) (Arc code could be rewritten in Clojure (don't kill me!)) (I'm assuming there is some concern that transparency would increase ability for spam or 'gaming' the system)

The main reason I haven't published a new version lately is just that I've been too busy working on YC itself. It would take a day of work to strain out the bits of code we don't want to publish, and I don't have any spare days now.

Every comment expressing a preference between lisps should be made with nested parens. Well played.

Why not just integrate HNSearch? That would be very convenient for all users.

They are not going to make improvements to Hacker News. They have proven that.

They don't have to fix it or make any improvements. They can run a very poorly performing site with no features and a horrible appearance, because they are entrenched, and they literally own quite a large portion of their userbase.

Unfortunately no matter how many people invent software that does the same thing as Hacker News but works much better and offer it up for free, nothing will be done.

The owner of the site has too much invested in its technology stack (Lisp) and its particular implementation and appearance. The improvements necessary probably require going back on some of those fundamentals, which is never going to happen, because they are too proud.

I upvoted you because I like seeing people who go against the grain here, who hold combative viewpoints and are willing to be blunt about faults, regardless of whether or not I agree.

However, after seeing this:

> ... because they are entrenched, and they literally own quite a large portion of their userbase.

> Unfortunately no matter how many people invent software that does the same thing as Hacker News but works much better and offer it up for free, nothing will be done.

... I have to say, if you are disappointed, have you thought about testing this theory? This website has not been around forever. It isn't the first of its kind either. I'm probably not the only one here who came to this site after having frequented a bunch of others that were essentially the same thing. So I don't expect it to be the last one, either. If someone has the interest and the skill set, I say, please do make this. Worst case you waste a few weekends or something.

There are a number of Hacker News clones, i.e. people got fed up and even went ahead and figured out how to mirror the Hacker News content. As far as I know very few, if any, gained any popularity, but it's not because they weren't better.

I quite enjoy lobste.rs. Despite the lack of discussion, the majority of the topics posted are interesting and turn over pretty fast (in comparison to some of the other replacements).

Thank you for pointing out lobste.rs. I like it. Can you issue me an invite?

Of course, email me at creamapps@gmail.com

I even suspect Hacker News isn't that fussed if most of its community moves elsewhere. The core community (that it was built for) will stay, and the postings will become more focussed on what the core community finds, reads, and is interested in.

Going against the grain ( eg, the prevalent PG / Musk worship) is fine. Feelings of entitlement, less so.

>people who go against the grain here, who hold combative viewpoints and are willing to be blunt

i.e. assholes

If that's the name for what I described then please call me that. Not only in the fan club but also a member.

As opposed to yes-men, it's somewhat of an improvement.

There's also this whole middle-ground of tact and civility and stuff.

There's a place for those things. I also think it's important that you don't get overly consumed by some frivolous notion of those concepts, prioritizing it above honesty, authenticity, and everything else.

To that end I often feel it's refreshing when people say something that is not a majority opinion on this site, and don't hold back.

Seriously, this bug has been around for years and people have constantly complained about it. The only reason they would have for not having fixed it at this point is they simply do not care.

It's dangerous to claim anything is the only reason something happened because it increases the odds you could be wrong.

they simply do not care

If it's negligence that you're implying, consider a more valuable alternative. The expiration link bug is starting to look like the HN test of one's ability to tell what's important to care for. Any hacker would want to develop this ability.

The improvements necessary probably require going back on some of those fundamentals

Not at all. It's pretty trivial to replace code that uses dynamically generated links with code that doesn't.


Can they change the 'lasts' parameter so that things don't expire as quickly?

Another idea.

    ; To be more sophisticated, instead of killing fnids, could first
    ; replace them with fns that tell the server it's harvesting too
    ; aggressively if they start to get called. But the right thing to
    ; do is estimate what the max no of fnids can be and set the harvest
    ; limit there-- beyond that the only solution is to buy more memory.

    (def harvest-fnids ((o n 50000)) ; was 20000
Can they change that number? Maybe that would help.

Another idea. Could you change it so that when they are harvested the functions are serialized to disk (maybe temporarily), and make fns a function or something so if it had been removed from memory it could be deserialized? It wouldn't need to be a serialization solution that would work for arbitrary closures, just something that would work for those cases, like timed-aform or whatever.

I have barely ever tried to use Lisp and I don't know much about this system so this may not make any sense.

Another idea. Use one of the Hacker News clones that people have built over the years that doesn't have this problem.

If it's that trivial, a lot of us would be happy for such a fix; is there any particular limitation as to why it hasn't been done?

I think the problem is that dynamic links have a lot of other benefits so it doesn't make sense to get rid of them.

> they literally own quite a large portion of their userbase.

Not literally.

That's debatable, but we are talking about a matter of degrees. Wage-based employment is a moderate improvement upon slavery by direct force. The relationship between investors and startup founders is a moderate improvement upon that, but still has quite a few similarities with traditional slavery.

I pointed it out it because I believed it was exaggeration. To me, similarities don't cut it for 'literally'; it has to be identity.

I agree, actually; literally is the wrong word. It's embarrassing that I used that word incorrectly when I just meant to add emphasis.

Not anymore.

"Why editors are 'literally' changing the dictionary" - http://www.bbc.co.uk/news/uk-23729570

Thanks, I didn't know it was now in the OED.

> Wage-based employment is a moderate improvement upon slavery by direct force.

Someone who was actually a slave would strongly disagree.

Those are some very absolute claims. If you worded that as observations and not so absolutely (or provided some quotes to back up the claims), it would sound a lot more credible.

I don't understand the appearance complaints. Must every site look like a unicorn vomited it out?

I never minded the appearance too much, but the usability bothered me. A month or two ago I tried some of the Chrome extensions (specifically HackerNew [1]), and it's made a big difference. In-line replies, hover to see user stats, collapsible threads, etc. I'm not enamored with the infinite scroll, but it's not horrible.

It makes it look a little slicker too, but I truly couldn't care less about that.

[1]: https://chrome.google.com/webstore/detail/hackernew/lgoghlnd...

It's not a good mobile phone experience. It's fixed-width and <table>-based.

It's never really bothered me much on mobile, and I use it extensively while mobile. It does bother me that comment indentation is not properly represented in elinks, but that is sort of an esoteric complaint that I can't really blame PG for.

It's kinda like Craigslist, though not nearly as bad, since Hacker News gets the job done well enough.

(Also, they let people scrape them and create improved versions!)

> Unfortunately no matter how many people invent software that does the same thing as Hacker News but works much better and offer it up for free, nothing will be done.

We should name this - how about "The Craigslist Problem"?

Part of this is that HN is much more than the technology; it's the community. Sure, there may be some scaling issues, but building an HN is something many people here could do in an afternoon. Building the community is a much more complicated problem.

They are continuations stored on the server (not a Lisp-specific feature). There is a limit on the number of continuations stored, which is why older ones come up as unknown or expired.

This has been there since Day 1 and will probably never be changed.

See also https://news.ycombinator.com/item?id=163696 https://news.ycombinator.com/item?id=2677469

Closures, not continuations.

Could you give a quick explanation or a link that explains how that is set up? (closures stored on server)

Just curious.

A closure is a function augmented with implicit references to the context in which it was created. In the case of an upvote link, it would hold a reference to the post you could upvote. That reference exists not in the URL but in the closure on the server side. If you keep these objects in memory indefinitely, you'll run out of RAM.
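The mechanism described above can be sketched in a few lines. This is a hypothetical Python illustration, not the actual Arc implementation; `MAX_FNIDS` is kept tiny here just to make the eviction visible:

```python
# Hypothetical sketch of closure-based links: each "More" link maps a
# random token to a server-side closure capturing the page state. When
# the table is full, the oldest closures are harvested, which is
# exactly why links expire.
import secrets
from collections import OrderedDict

MAX_FNIDS = 3  # tiny limit for illustration; HN's was in the tens of thousands

fnids = OrderedDict()  # token -> closure, oldest first

def register_fnid(closure):
    if len(fnids) >= MAX_FNIDS:
        fnids.popitem(last=False)  # harvest the oldest closure
    token = secrets.token_hex(8)
    fnids[token] = closure
    return token

def make_more_link(stories, offset):
    # the closure captures `stories` and `offset` from this scope
    return register_fnid(lambda: stories[offset:offset + 30])

def follow(token):
    closure = fnids.get(token)
    if closure is None:
        return "Unknown or expired link"
    return closure()

# four "More" links overflow the three-slot table, expiring the first
tokens = [make_more_link(list(range(300)), off) for off in (30, 60, 90, 120)]
```

After the fourth link is created, following the first token returns the expired message, while the newest one still works.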

Oh, I was assuming some sort of custom implementation for doing that. If it's just hanging around in RAM, fair enough; can't see how that has scaled so well, but hey.

Well, a closure can't be more than a couple hundred bytes, I guess.

If he wants to continue with that technique, I'd hack it like this: use reflection to crawl the closure, isolate the variables bound to the closure context, and serialize those (the user ids, post ids, etc). Then the closure code, which will be one of only a few "templates", can be put into a global dictionary. The serialized state goes into the URL parameters. The closure is collected, and "reified" from the global dictionary and URL state once the user performs the action.

I don't know if Lisp's reflection can pull this off, but I'm pretty sure it's no biggie.
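One way to picture the "template plus URL state" idea is the following hedged Python sketch. All names here are illustrative, and a real version would sign the parameters so users can't tamper with them:

```python
# Hedged sketch of the "reify from URL" idea: keep only a named
# template server-side, and push the closure's bound variables into
# the URL instead of holding them in memory.

TEMPLATES = {}  # the few closure "templates", stored globally

def template(fn):
    TEMPLATES[fn.__name__] = fn
    return fn

@template
def more_page(stories, offset):
    return stories[offset:offset + 30]

def make_link(name, **state):
    # serialize the variables the closure would have captured
    qs = "&".join(f"{k}={v}" for k, v in sorted(state.items()))
    return f"/x?tpl={name}&{qs}"

def handle(url, stories):
    # "reify" the action from the template plus the URL state
    _, qs = url.split("?", 1)
    params = dict(p.split("=") for p in qs.split("&"))
    fn = TEMPLATES[params.pop("tpl")]
    return fn(stories, offset=int(params["offset"]))

link = make_link("more_page", offset=30)
result = handle(link, list(range(100)))
```

Since all the state lives in the URL, such a link can be followed at any time; nothing server-side is ever harvested.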

Visible signs of entropy considered harmful: http://en.m.wikipedia.org/wiki/Broken_windows_theory

Although this is a significant problem, it’s probably better to use the bottom-of-page links:

- “Bugs” (in this case, already seems to be known and talked about; see [1]).

- “Feature Requests” for anything that isn’t a bug.

[1] https://github.com/HackerNews/HN/issues/11

For a practical solution: install the "Hacker News Enhancement Suite" Chrome extension; it solves this and a godzillion other problems.

I loved this extension, and it's really the only thing I'm missing now that I've moved back to Firefox :(

Wow that is so much nicer. Thank you for suggesting this!

No, you are not. It is annoying.

What sort of data structure do you recommend they use to efficiently reconstruct older story orderings?

Well, I would trade off older story ordering for consistency. Or better yet, rather than return Expired Link, return the "current" story ordering.

No one's asking for older story orderings. I'd prefer newer story orderings.

Much like every other paginated site with constantly updating content in existence. Just link to the next page with whatever the current content for that page is. People can figure out that items they're looking for get bumped from one page to another semi-frequently.
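The stateless pagination suggested above is the standard approach on sites with changing content. A minimal Python sketch, with made-up names rather than HN's actual code:

```python
# Minimal stateless alternative: encode the page number in the URL and
# slice whatever the current ranking happens to be when it's requested.

def page_link(page):
    return f"/news?p={page}"

def render_page(ranked_stories, page, per_page=30):
    start = (page - 1) * per_page
    items = ranked_stories[start:start + per_page]
    return items, page_link(page + 1)

items, next_link = render_page(list(range(90)), page=2)
```

The trade-off is exactly the one described: items may shift between pages as the ranking changes, but the "next" link never expires.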

You know, like return entire page instead of an arbitrary cut off?

What does "entire page" mean? The front page of HN could be nearly infinite in theory if you include all stories ever posted. It has to be cut off at some point.

Most likely the parent is complaining about the fact that if you read a long discussion, it's capped at 100 comments, and at the end there is a "continue" link which expires rather quickly.

I think it falls under "Not important enough to fix even though everyone complains about it all the time" -- luckily there exist browser extensions that fix what authors of this site couldn't. (CTRL-F for said extensions)

This site isn't actively developed, and hasn't been, as far as I know, since it was established in 2007.

This is intentional. I forget the specific rationale, however.

So no, to answer your question (though I'm not pg), it likely won't be fixed.

Actually we change things almost daily.

Any recent examples? (Just curious, not trying to debate you)

Right now the top priority is moving to a new server.

I'd guess it'd be under-the-hood tweaks related to building and maintaining the community (comments and stories).

Can you explain where I got this impression, or am I completely out in left field?

You're probably judging from the site's appearance, which doesn't change much.

I'm assuming the minimalism and hackish nature was deliberately chosen to weed out the lowest common denominator and cater only to the niche HN targets.

That can't be right? Last time I heard they were all about inclusion by being a gateway.

The rationale is that the value provided by fixing the problems is less than the cost of doing the work.

This particular problem isn't going to cause tons of people to leave, and fixing it isn't going to cause tons of people to join.

Unless someone does the work for free, or manages to get a large portion of the current users to quit over the issue, there's no reason to expect it to be changed.

I was about to post about it.

I would definitely like to see this one sorted out.

It is a useful feature, not a bug.

People need to refresh the thread before they respond to it, especially if it is a day old. This ensures they have an up-to-date version of the thread and allows them to see if anyone has already made their point.

People clicking next to read past the first page is good. Open any comment pages you want to read in new tabs, and refresh those tabs just before you comment.

I strongly agree that there are other problems with HN that need fixing.

From a UI perspective, is it too much to ask that instead of showing an FU expired message, the site just redirects the user back to the homepage client-side? It should be trivial to implement.

On a desktop it doesn't save much, thanks to the bookmarks bar, but on mobile it saves a lot of effort on starting over.

expired link issue came from using a new lisp feature

I don't think this is the case.

This fix would have a low priority on my list. I would rather have work done on improving the quality of comments and on the unfortunate high ranking of middlebrow comments.

One of the worst things about expiring links is that they punish you for taking the time to read articles and long discussions. If I'm just skimming through, there's no problem with expiring links.

The issue at hand never annoyed me too much. It can probably be fixed with a browser extension (I'm using HN Enhancement Suite, maybe contact the author and start a discussion?)

I'd just like it if the "expire links page" said "Expired link, sorry. Here is a link to a fresh top page"

The word "fix" assumes it is a problem. You load a page and after a certain amount of time, the next page is expired.

I am not saying, "it's not a bug, it's a feature!" I'm just saying that's how it works.

It is a problem.
